google-cloud-datastore

Cloud Dataflow cannot use the "google.cloud.datastore" package?

依然范特西╮ submitted on 2019-12-23 12:22:04
Question: I want to put to Datastore with a transaction from Cloud Dataflow, so I wrote the following:

```python
def exe_dataflow():
    ....
    from google.cloud import datastore

    # called from the pipeline
    def ds_test(content):
        datastore_client = datastore.Client()
        kind = 'test_out'
        name = 'change'
        task_key = datastore_client.key(kind, name)
        for _ in range(3):
            with datastore_client.transaction():
                current_value = datastore_client.get(task_key)
                current_value['v'] += content['v']
                datastore_client.put(current_value)

    # pipeline
    .... | 'datastore test' >> beam
```

Exporting Google App Engine Datastore to MySQL?

情到浓时终转凉″ submitted on 2019-12-23 12:03:43
Question: We're thinking of building some of our infrastructure on Google App Engine. But we're worried that if it does not scale, we'll need to export the data and run it on our own servers in the future. Is there a way to export from the App Engine Datastore to MySQL? Answer 1: As far as data export goes, the Bulk Downloader exists for just this purpose. By default it exports to CSV files, but you can write a custom Exporter class that exports directly to a MySQL database, or any other format of your choosing.
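The two-step path the answer describes (CSV export, then a relational import) can be sketched outside the SDK. A minimal, hedged example: the entity kind, field names, and data below are invented for illustration, and `sqlite3` stands in for MySQL (with MySQL you would use a driver such as mysqlclient and `LOAD DATA` or `INSERT`):

```python
import csv
import io
import sqlite3

# Hypothetical entities already downloaded from the Datastore
# (e.g. the Bulk Downloader's default CSV output); fields are illustrative.
entities = [
    {"key": "greeting:1", "author": "alice", "content": "hello"},
    {"key": "greeting:2", "author": "bob", "content": "world"},
]

# Step 1: write the entities to CSV, as the Bulk Downloader does by default.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["key", "author", "content"])
writer.writeheader()
writer.writerows(entities)

# Step 2: load the CSV into a relational table (sqlite3 as a MySQL stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE greeting (key TEXT PRIMARY KEY, author TEXT, content TEXT)")
buf.seek(0)
rows = list(csv.DictReader(buf))
conn.executemany(
    "INSERT INTO greeting (key, author, content) VALUES (:key, :author, :content)",
    rows,
)
print(conn.execute("SELECT COUNT(*) FROM greeting").fetchone()[0])  # 2
```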

Where is the GAE local datastore on a Mac OS X 10.8.3 filesystem?

梦想与她 submitted on 2019-12-23 07:40:58
Question: I have tried all the suggestions from these posts: "Does anyone know where the Google App Engine local datastore file located for Mac OS X"; "Where is my local App Engine datastore?"; "Google App Engine local datastore path configuration" — and I still cannot find my local datastore. I don't have the SDK set to clear the datastore on startup, and I can't find it even when the appserver is running. Does anyone know the file path for the Google App Engine development server's datastore file? I am
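One practical way to hunt for the file is simply to search the temp directories the dev server writes to by default. This is a sketch: the exact filename has varied across SDK versions, so both common names are tried, and the `--datastore_path` flag shown in the comment lets you pin the location explicitly instead of searching:

```shell
# Search likely temp locations on a Mac; filenames differ by SDK version.
found=$(find "${TMPDIR:-/tmp}" -maxdepth 3 \
        \( -name 'datastore.db' -o -name 'dev_appserver.datastore' \) \
        2>/dev/null)
echo "found: ${found:-nothing}"

# Alternatively, sidestep the search by choosing the path yourself:
#   dev_appserver.py --datastore_path=/tmp/myapp.datastore myapp/
```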

Using ancestors or reference properties in Google App Engine?

有些话、适合烂在心里 submitted on 2019-12-23 07:12:31
Question: Currently, a lot of my code makes extensive use of ancestors to put and fetch objects. However, I'm looking to change some things around. I initially thought that ancestors helped make querying faster if you knew who the ancestor of the entity you were looking for was, but it turns out that ancestors are mostly useful for transaction support. I don't make use of transactions, so I'm wondering whether ancestors are more of a burden on the system here than a help. What I have is a User entity,
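To make the trade-off concrete: an ancestor prefixes the entity's key, so an "ancestor query" is essentially a key-prefix filter, and everything under one ancestor forms a single entity group for transactions. A toy illustration in plain Python (this is not the datastore API; the kinds and IDs are invented):

```python
# Keys modeled as tuples of (kind, id) pairs, as in the datastore:
# ('User', 'alice', 'Drawing', 1) means Drawing 1 with ancestor User 'alice'.
entities = {
    ("User", "alice"): {"name": "alice"},
    ("User", "alice", "Drawing", 1): {"title": "cat"},
    ("User", "alice", "Drawing", 2): {"title": "dog"},
    ("User", "bob", "Drawing", 3): {"title": "fish"},
}

def ancestor_query(ancestor):
    """Return entities whose key starts with the ancestor's key."""
    n = len(ancestor)
    return {k: v for k, v in entities.items()
            if len(k) > n and k[:n] == ancestor}

alices = ancestor_query(("User", "alice"))
print(sorted(k[-1] for k in alices))  # [1, 2]
```

Dropping the ancestor would mean each Drawing lives in its own entity group: individual gets by key still work, but there is no longer a single group to query or update transactionally.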

Unable to get saved entity using Objectify

一世执手 submitted on 2019-12-23 06:41:35
Question: I am unable to reliably get my saved entity using Objectify. It looks like the cache is getting corrupted. The strange thing is, I can see the saved entity correctly through the admin console datastore viewer. I also wrote a small program to view the entity using RemoteApi, and I can see the saved value correctly. When I query the entity repeatedly through a servlet or a Cloud Endpoints REST API, my successive queries give different results, and it looks like something in the datastore/cache
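The symptom described — successive reads of the same entity disagreeing while the authoritative store is correct — is the classic signature of a stale cache entry that was never invalidated after a write. A toy illustration with plain Python dicts (not Objectify's actual memcache integration; the key and values are invented):

```python
# Authoritative store vs. a cache holding a stale entry left over from
# before the last write.
store = {"task:1": {"status": "done"}}
cache = {"task:1": {"status": "pending"}}  # stale, never invalidated

def get_entity(key, use_cache):
    # Depending on whether the read hits the cache, callers see
    # different values for the same key.
    if use_cache and key in cache:
        return cache[key]
    return store[key]

reads = [get_entity("task:1", use_cache=(i % 2 == 0)) for i in range(4)]
print([r["status"] for r in reads])  # ['pending', 'done', 'pending', 'done']
```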

Index builder for fast retrieval, similar to multiple-table retrieval in a single query, in App Engine

泄露秘密 submitted on 2019-12-23 05:57:05
Question: In the Google App Engine Datastore HRD in Java, we can't do joins or query multiple tables directly using a Query object or GQL. I just want to know whether my idea is a correct approach or not: we build an index in hierarchical order, like Parent - Child - Grandchild, by node (Node - Key - IndexedProperty - Set). If we want to collect all the sub-children and grandchildren, we can collect all the keys matching the hierarchy filter condition, return that set of keys, and in Memcache
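The proposed approach — maintain a parent-to-children key index, then walk it to gather all descendant keys before fetching the entities by key — can be sketched in plain Python. This is a toy under the question's own assumptions (the index would live in something like Memcache; the key names here are invented):

```python
# A key index mapping each node to its direct children, as the question proposes.
child_index = {
    "parent:1": ["child:1", "child:2"],
    "child:1": ["grand:1"],
    "child:2": ["grand:2", "grand:3"],
}

def descendant_keys(key):
    """Collect every key below `key` by walking the child index."""
    out = []
    stack = [key]
    while stack:
        node = stack.pop()
        for child in child_index.get(node, []):
            out.append(child)
            stack.append(child)
    return out

keys = descendant_keys("parent:1")
print(sorted(keys))  # ['child:1', 'child:2', 'grand:1', 'grand:2', 'grand:3']
```

The resulting key list can then be fetched in a single batch get, which is the part that substitutes for a join.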

Does App Engine have a new limitation on “Exceeded maximum allocated IDs”?

廉价感情. submitted on 2019-12-23 05:15:35
Question: My app had been running fine for months. Today it started giving the error "Exceeded maximum allocated IDs" on a datastore put. Is this something new with App Engine quotas and limitations? http://gochild2009.appspot.com Answer 1: While we're unsure how you ended up with such large IDs, we are taking steps to accommodate them. You can track this on https://code.google.com/p/googleappengine/issues/detail?id=9118. Thanks. Answer 2: Our app had the same problem. There is no such limit (quota). It seems that at

How much quota does an appengine datastore Query cost?

坚强是说给别人听的谎言 submitted on 2019-12-23 03:19:15
Question: On the App Engine Billing and Budgeting Resources page, it says that the cost of a "Query" maps to "1 read + 1 small per entity retrieved", whereas a "Query (keys only)" maps to "1 read + 1 small per key retrieved". This seems like a typo to me. It would seem that a query would still need to perform a full "get" operation on each entity returned. Is this assumption incorrect? I would have expected the cost of a "Query" to be "1 read + 1 read per entity retrieved". Answer 1: This definitely looks
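The gap between the two readings is easy to quantify. The sketch below is back-of-the-envelope arithmetic only, using the operation names exactly as the question quotes them; it makes no claim about which model is the correct billing behavior:

```python
def query_cost(n, keys_only=False, as_listed=True):
    """Billed operations for a query returning n results.

    as_listed=True  follows the page as quoted: 1 read + 1 small per entity.
    as_listed=False follows the asker's expectation: 1 read + 1 read per entity.
    Keys-only queries cost 1 read + 1 small per key under both readings.
    """
    reads, smalls = 1, 0
    if keys_only or as_listed:
        smalls = n
    else:
        reads += n
    return {"reads": reads, "small_ops": smalls}

# 100 results: the quoted page's model vs. the asker's expected model.
print(query_cost(100))                   # {'reads': 1, 'small_ops': 100}
print(query_cost(100, as_listed=False))  # {'reads': 101, 'small_ops': 0}
```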

GAE python NDB projection query working in development but not in production

我怕爱的太早我们不能终老 submitted on 2019-12-23 02:29:18
Question: I've been hitting my head against the wall because my Google App Engine Python project has a very simple NDB projection query that works fine on my local machine but mysteriously fails when deployed to production. Adding to the mystery, as a test I added an identical projection on another property, and it works in both dev and production! Could anyone help, please? Here are more details. I have the following entity, which represents an expense: class Entry(ndb.Model): datetime = ndb
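For context: a projection query returns only the requested properties, read from the index rather than from the entity itself, which is why every projected property must be indexed (a common source of dev/production differences, since the local SDK auto-builds indexes). A toy illustration of what a projection returns, in plain Python rather than the ndb API (entity fields invented):

```python
# Entities as dicts; an unindexed property would simply be absent from the
# index rows that a real projection reads.
entries = [
    {"datetime": "2019-01-01", "amount": 12.5, "note": "lunch"},
    {"datetime": "2019-01-02", "amount": 7.0, "note": "coffee"},
]

def project(rows, props):
    """Return only the requested properties, as a projection query does."""
    return [{p: row[p] for p in props} for row in rows]

print(project(entries, ["datetime"]))
# [{'datetime': '2019-01-01'}, {'datetime': '2019-01-02'}]
```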

App Engine: Copy live Datastore to local dev Datastore (that still works)

牧云@^-^@ submitted on 2019-12-23 01:04:46
Question: This used to be possible by downloading with the bulkloader and uploading to the local dev server. However, the bulkloader download has been non-functional for several months now because it does not support OAuth2. A few places recommend downloading from a Cloud Storage backup and uploading to the local datastore, either through the bulkloader or by directly parsing the backup. However, neither of these appears functional anymore. The bulkloader method throws: OperationalError: unable to open database