google-cloud-datastore

Objectify error “You cannot create a Key for an object with a null @Id” in JUnit

Submitted by 坚强是说给别人听的谎言 on 2019-12-01 10:35:19
Question: I got the following error while testing a simple piece of code in JUnit that creates a User object (an Objectify entity) and then tries to attach it as a parent to another Objectify entity called DownloadTask:
java.lang.IllegalArgumentException: You cannot create a Key for an object with a null @Id. Object was com.netbase.followerdownloader.model.User@57fcbecc
at com.googlecode.objectify.impl.KeyMetadata.getRawKey(KeyMetadata.java:185)
at com.googlecode.objectify.impl.Keys.rawKeyOf(Keys.java

GAE datastore query with filter and sort using objectify

Submitted by 走远了吗. on 2019-12-01 09:29:11
Question: I am trying to query the datastore for the top 100 users in terms of points scored who have logged on in the past week (the date field).
List<User> users = ofy().load().type(User.class)
    .filter("date >", date).order("date")
    .order("-points").limit(100).list();
It seems to ignore the final ordering by points and returns the list sorted by date instead. If I remove the date filter and sort then I get a list nicely sorted by points, but including users who have logged on more than a week ago. I

How to filter rows with null references in Google App Engine DB

Submitted by 戏子无情 on 2019-12-01 09:20:40
I have a Model UnitPattern, which references another Model UnitPatternSet, e.g.
class UnitPattern(db.Model):
    unit_pattern_set = db.ReferenceProperty(UnitPatternSet)
In my view I want to display all UnitPatterns whose unit_pattern_set reference is None, but the query UnitPattern.all().filter("unit_pattern_set =", None) returns nothing, though I have 5 UnitPatterns in total, of which 2 have 'unit_pattern_set' set and 3 don't, e.g.
print 'Total', UnitPattern.all().count()
print 'ref set', UnitPattern.all().filter("unit_pattern_set !=", None).count()
print 'ref not set', UnitPattern.all().filter(
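A likely cause, offered here as an assumption rather than something stated in the excerpt: entities written before the unit_pattern_set property was added to the model have no index row for it, so an equality filter on None cannot match them. A minimal sketch of a backfill that re-saves such entities so the property is stored (and indexed) as None, reusing the model names from the question:

from google.appengine.ext import db

class UnitPatternSet(db.Model):
    pass

class UnitPattern(db.Model):
    unit_pattern_set = db.ReferenceProperty(UnitPatternSet)

def backfill_unit_patterns():
    # Re-putting entities that predate the property stores it as None,
    # which makes them visible to an equality filter on None.
    patterns = UnitPattern.all().fetch(1000)
    db.put(patterns)

def patterns_without_set():
    return UnitPattern.all().filter("unit_pattern_set =", None).fetch(1000)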

How do I access my AppEngine DataStore entities from my Compute Engine VM?

Submitted by 蹲街弑〆低调 on 2019-12-01 09:14:32
My app is running on App Engine, but I would like to access its NDB Datastore entities from my Compute Engine VM to do some processing and write the results back to the App Engine Datastore. How can I do that? Also, are Google Cloud Datastore and the App Engine Datastore the same thing? https://developers.google.com/datastore/ https://developers.google.com/appengine/docs/python/ndb/
Dmytro Sadovnychyi: David's solution requires you to use App Engine instance time to make the requests, but you can bypass it and make requests directly to Datastore from a Compute Engine instance. There is a pretty good
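A minimal sketch of that direct route, assuming the present-day google-cloud-datastore client library is installed on the VM and the VM's service account has Datastore access; the project id, kind and property names below are placeholders, not taken from the question. The App Engine Datastore and Cloud Datastore are the same underlying storage, so entities written either way are visible to both.

from google.cloud import datastore

client = datastore.Client(project="my-gae-project")  # same project as the App Engine app

# Read an entity that the App Engine app wrote (kind and id are assumptions).
key = client.key("UserRecord", 1234)
entity = client.get(key)

if entity is not None:
    entity["processed"] = True   # do the processing on the VM
    client.put(entity)           # write the result back; the GAE app sees the same data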

Set a kind name independently of the model name (App Engine datastore)

Submitted by 心已入冬 on 2019-12-01 09:00:36
As a Python programmer I like my code to be reusable, so I'm trying to avoid kind name conflicts in my code (where two different models share the same kind name). Currently I just prepend some meaningful text to the model's class name, but this is awfully unpythonic. Being able to explicitly set the model's kind would solve my problem, but I can't find out how to do this; does anyone know how?
Just override the kind() method of your class:
class MyModel(db.Model):
    @classmethod
    def kind(cls):
        return 'prefix_%s' % super(MyModel, cls).kind()
You can define a custom baseclass that does this for you:
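The excerpt cuts off before the base class itself; a minimal sketch of what such a base class could look like, assuming the old google.appengine.ext.db API and a placeholder 'prefix_' string:

from google.appengine.ext import db

class PrefixedModel(db.Model):
    @classmethod
    def kind(cls):
        # Prepend a namespace-like prefix to whatever kind the subclass would use.
        return 'prefix_%s' % super(PrefixedModel, cls).kind()

class MyModel(PrefixedModel):
    name = db.StringProperty()

# MyModel entities are now stored under the kind 'prefix_MyModel'.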

TransactionFailedError (too much contention…) when reading (cross-group) entities from datastore

Submitted by 浪尽此生 on 2019-12-01 08:53:29
Question: I'm investigating again the unexpected occurrence of TransactionFailedError (too much contention on these datastore entities...) in cases where the code only reads the entity groups that are blamed for the contention problems. Setup: GAE standard environment, Python 2.7 with NDB (SDK 1.9.51). I managed to observe the error in an isolated app (only me as a user) where the same request handler is executed in a task queue, and read/write access to the entity groups mentioned below is only done by this

Copying data from GAE to local data storage fails

Submitted by 随声附和 on 2019-12-01 08:23:36
I have followed all the instructions:
1) I downloaded it like this:
appcfg.py download_data -A s~myApp --url=https://myApp.appspot.com/_ah/remote_api/ --filename=data.csv
Note that according to this solution I have to append s~ to the app name, or I get the error message: google.appengine.api.datastore_errors.BadRequestError: app s~myApp cannot access app myApp's data
2) I have to add remote_api access to my app.yaml:
- url: /remote_api
  script: google.appengine.ext.remote_api.handler.application
  login: admin
3) I have to run the local server and go to http://localhost:8080/remote_api . In there

How to upload multiple files to BlobStore?

Submitted by £可爱£侵袭症+ on 2019-12-01 08:15:23
I'm trying to upload multiple files in a form to the BlobStore. Form:
<form action="{{upload_url}}" method="POST" enctype="multipart/form-data">
  <label>Key Name</label><input type="text" name="key_name" size="50"><br/>
  <label>name</label><input type="text" name="name" size="50"><br/>
  <label>image</label><input type="file" name="image" size="50"><br/>
  <label>thumb</label><input type="file" name="thumb" size="50"><br/>
  <input type="submit" name="submit" value="Submit">
</form>
I'm then trying to fetch the BlobInfo objects for each of those files uploaded:
def post(self):
    image_upload_files =
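A minimal sketch of a handler that fetches both BlobInfo lists, assuming the webapp2 blobstore upload handler and the field names from the form above; the entity handling and redirect target are placeholders:

from google.appengine.ext.webapp import blobstore_handlers

class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        image_upload_files = self.get_uploads('image')  # list of BlobInfo for the 'image' field
        thumb_upload_files = self.get_uploads('thumb')  # list of BlobInfo for the 'thumb' field
        image_key = image_upload_files[0].key()
        thumb_key = thumb_upload_files[0].key()
        # store image_key / thumb_key on an entity here, then redirect
        self.redirect('/')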

Google App Engine - About how much quota does a single datastore put use?

Submitted by 99封情书 on 2019-12-01 08:03:09
Question: The latency for a datastore put is about 150ms - http://code.google.com/status/appengine/detail/datastore/2010/03/11#ae-trust-detail-datastore-put-latency. About how much CPUTime is used by a single datastore put with a data size of 100 bytes, into an entity that has only 1 property and no indexes? Also, does anyone know roughly how much extra CPUTime overhead doing this datastore put through the task queue would add? I plan to do some testing with this later today to figure it out, but if
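A minimal sketch of one way such a test could be written, assuming the quota API that shipped with the old Python SDK (google.appengine.api.quota) and a placeholder model with a single unindexed property:

from google.appengine.api import quota
from google.appengine.ext import db

class Payload(db.Model):
    data = db.BlobProperty()  # BlobProperty is never indexed

def timed_put():
    start = quota.get_request_cpu_usage()   # megacycles used by this request so far
    Payload(data=db.Blob('x' * 100)).put()  # a ~100-byte datastore put
    return quota.get_request_cpu_usage() - start  # megacycles consumed by the put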

How to best handle data stored in different locations in Google BigQuery?

Submitted by 心不动则不痛 on 2019-12-01 07:31:16
My current workflow in BigQuery is as follows: (1) query data in a public repository (stored in the US), (2) write it to a table in my repository, (3) export a CSV to a cloud bucket, (4) download the CSV to the server I work on, and (5) work with it on the server. The problem I have now is that the server I work on is located in the EU, so I have to pay quite some fees for transferring data between my US bucket and my EU server. I could go ahead and locate my bucket in the EU, but then I would still have the problem of transferring data from the US (BigQuery) to the EU (bucket). So I could also
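For reference, a minimal sketch of steps (1)-(3) of that workflow with the Python BigQuery client (google-cloud-bigquery); the project, dataset, table and bucket names are placeholders, and this does not by itself resolve the US-to-EU transfer question:

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# (1) + (2): query the US-hosted public data and write the result into my own table.
dest = bigquery.TableReference.from_string("my-project.my_dataset.my_results")
job_config = bigquery.QueryJobConfig()
job_config.destination = dest
job_config.write_disposition = bigquery.WriteDisposition.WRITE_TRUNCATE
client.query(
    "SELECT * FROM `bigquery-public-data.some_dataset.some_table` LIMIT 1000",
    job_config=job_config,
).result()

# (3): export that table as CSV into a Cloud Storage bucket.
client.extract_table(dest, "gs://my-bucket/my_results.csv").result()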