google-cloud-datastore

Getting entity's id inside a transaction

Submitted by 别等时光非礼了梦想 on 2019-12-11 03:47:46
Question: I have a Datastore transaction in which I create an entity (a user), letting Datastore generate the ID for me. I would then like to use that ID to create another entity (of another kind). This is possible with the regular Datastore 'save' API: datastore.save(user).then(() => { userId = user.key.id // returns the entity's ID generated by Datastore }). However, it does not seem possible when using a transaction: transaction.save(user).then(() => { console.log(user.key.id) }). The above …
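One common workaround is to pre-allocate the key so the numeric ID is known before the transaction commits. A minimal sketch using the Python google-cloud-datastore client (the question itself uses the Node.js client; kind names and properties here are illustrative):

from google.cloud import datastore

client = datastore.Client()

with client.transaction():
    # Ask Datastore for a complete key up front; the numeric ID is
    # available immediately instead of only after the commit.
    user_key = client.allocate_ids(client.key("User"), 1)[0]

    user = datastore.Entity(key=user_key)
    user.update({"name": "example"})

    profile = datastore.Entity(key=client.key("Profile"))
    profile["user_id"] = user_key.id  # usable inside the transaction

    client.put_multi([user, profile])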

Workaround of GAE's 30 subqueries limitation

Submitted by 安稳与你 on 2019-12-11 03:39:21
Question: I'm writing a news application and I want to let my users choose their favourite news sources from a list that contains dozens (~60) of sources (Guardian, Times, ...). I have a News entity with an indexed property "source", and I'm looking for an approach that will let me bypass the limitation of 30 subqueries imposed by App Engine, which prevents me from using the IN and EQUALS filters to get all the news belonging to a big list of sources. Is there any workaround for this limitation?
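One workaround is to split the source list into chunks of at most 30 and run one IN query per chunk asynchronously, merging the results in memory. A minimal ndb sketch under that assumption (the News model fields are illustrative):

from google.appengine.ext import ndb

class News(ndb.Model):
    source = ndb.StringProperty()
    published = ndb.DateTimeProperty()

def news_for_sources(sources, limit=50, chunk=30):
    # One IN query per chunk of at most 30 sources, run in parallel.
    futures = [News.query(News.source.IN(sources[i:i + chunk])).fetch_async(limit)
               for i in range(0, len(sources), chunk)]
    results = []
    for future in futures:
        results.extend(future.get_result())
    # Merge and sort in memory, then trim to the requested limit.
    results.sort(key=lambda n: n.published, reverse=True)
    return results[:limit]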

Google App Engine ndb: find the position of an order query list

Submitted by 江枫思渺然 on 2019-12-11 03:38:49
Question: On Google App Engine's ndb, I used the following to retrieve all entities sorted by their grade: ranks = Member.query().order(-Member.grade). Then I would like to know the position of a specific member: i = 0 for rank in ranks: if rank.account == 'abc': position = i break i += 1. My question: is there an equivalent ndb operation to find the position of a specific entity? Thanks. Answer 1: I believe it could be done in two steps. Retrieve the entry whose account is 'abc': target = …
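The two-step approach the answer sketches can be completed with a count of the entities that rank above the target. A minimal sketch, assuming a Member model with account and grade properties and ignoring ties:

from google.appengine.ext import ndb

class Member(ndb.Model):
    account = ndb.StringProperty()
    grade = ndb.IntegerProperty()

def rank_of(account):
    target = Member.query(Member.account == account).get()
    if target is None:
        return None
    # Position = number of members with a strictly higher grade
    # (0-based; ties are not handled in this sketch).
    return Member.query(Member.grade > target.grade).count()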

datetime.datetime.now() returns old value

Submitted by 若如初见 on 2019-12-11 03:37:46
Question: I'm looking up a Datastore entry in Python by matching dates. What I want is to pick the entry for "today" on each day. But for some reason, when I upload my code to the GAE server it works for just one day, and on the next day it still returns the same value. E.g. when I upload my code and execute it on 07-01-2014 it returns the value for 07-01-2014, but the next day, on 08-01-2014, it still returns 07-01-2014. If I redeploy the same code and execute it again it moves on to 08-01-2014, but will …
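Behaviour like this is usually caused by evaluating the date at module import time (or in a default argument), so the value is frozen for the lifetime of the instance and only refreshes on redeploy. A minimal sketch of the pattern and the per-request fix, with an illustrative Entry model:

import datetime

from google.appengine.ext import ndb

class Entry(ndb.Model):
    date = ndb.DateProperty()

# Evaluated once, when the module is first imported on an instance,
# and then reused by every request until the instance is recycled:
TODAY_AT_IMPORT = datetime.date.today()

def todays_entry():
    # Evaluate "today" per request instead of at import time.
    today = datetime.date.today()
    return Entry.query(Entry.date == today).get()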

GAE Transaction in root entity

Submitted by 人盡茶涼 on 2019-12-11 03:28:25
Question: I'm new to GAE and I have some questions about transactions with the Datastore. For example, I have a user entity, which is created when the user adds my app on Facebook. I get some properties with the Facebook API, but I want to add a username for the user, and it needs to be unique. So in the transaction scope I call this method: def ExistsUsernameToDiferentUser(self, user, username): query = User.all() query.filter("username", username) query.filter("idFacebook != ", user.idFacebook) …
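A global query like this cannot guarantee uniqueness inside a transaction, because non-ancestor queries are not transactional. A common alternative is to reserve the username as the key name of a dedicated entity, so key uniqueness does the work. A minimal ndb sketch of that idea (model and function names are illustrative):

from google.appengine.ext import ndb

class UniqueUsername(ndb.Model):
    # No properties needed; the key name itself is the username.
    pass

@ndb.transactional
def claim_username(username):
    key = ndb.Key(UniqueUsername, username.lower())
    if key.get() is not None:
        return False              # already taken
    UniqueUsername(key=key).put()
    return True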

Writing to a PDF from inside a GAE app

Submitted by 流过昼夜 on 2019-12-11 03:23:11
Question: I need to read several megabytes (raw text strings) out of my GAE Datastore, write them all to a new PDF file, and then make the PDF file available for the user to download. I am well aware of the sandbox restrictions that prevent you from writing to the file system. I am wondering if there is a crafty way of creating a PDF in-memory (or a combination of memory and the blobstore) and then storing it somehow so that the client side (browser) can actually pull it down as a file and save it …
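One way to do this is to render the PDF into an in-memory buffer and stream it back with download headers. A minimal sketch assuming ReportLab is bundled with the app and a webapp2 handler (both assumptions, not stated in the excerpt):

import io

import webapp2
from reportlab.pdfgen import canvas

class PdfHandler(webapp2.RequestHandler):
    def get(self):
        buf = io.BytesIO()
        pdf = canvas.Canvas(buf)
        pdf.drawString(72, 720, "Text pulled from the Datastore")
        pdf.save()

        # Send the bytes straight from memory; nothing ever touches
        # the (read-only) file system.
        self.response.headers['Content-Type'] = 'application/pdf'
        self.response.headers['Content-Disposition'] = (
            'attachment; filename="report.pdf"')
        self.response.out.write(buf.getvalue())

app = webapp2.WSGIApplication([('/report.pdf', PdfHandler)])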

Google Datastore - Not Seeing 1 Write per Second per Entity Group Limitation

Submitted by 别说谁变了你拦得住时间么 on 2019-12-11 03:12:49
Question: I've read a lot about strong vs. eventual consistency, using ancestor / entity groups, and the 1 write per second per entity group limitation of Google Datastore. However, in my testing I have never hit the exception "Too much contention on these datastore entities. please try again." and I am trying to understand whether I'm misunderstanding these concepts or missing a piece of the puzzle. I'm creating entities like so: func usersKey(c appengine.Context) *datastore.Key { return datastore.NewKey(c …
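The 1 write/sec figure is a guideline on sustained throughput rather than a hard gate on every second write, so a serial test loop rarely triggers it; contention generally needs many concurrent transactional writes to the same group. A minimal ndb sketch (in Python rather than the Go of the excerpt) of the kind of hot entity group that does surface the error under load; names are illustrative:

from google.appengine.ext import ndb

class Counter(ndb.Model):
    value = ndb.IntegerProperty(default=0)

@ndb.transactional
def increment(parent_key):
    # Every call rewrites the same entity group. Run this from many
    # concurrent requests, sustained above roughly 1 write/sec, and
    # the "too much contention" error starts to appear; a slow serial
    # loop almost never triggers it.
    key = ndb.Key(Counter, 'only', parent=parent_key)
    counter = key.get() or Counter(key=key)
    counter.value += 1
    counter.put()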

geoSpatial & Location based search in google appengine python

Submitted by 独自空忆成欢 on 2019-12-11 03:01:16
Question: I want to achieve something like the map-drag search on Airbnb (https://www.airbnb.com/s/Paris--France?source=ds&page=1&s_tag=PNoY_mlz&allow_override%5B%5D=). I am saving the data like this in the Datastore: user.lat = float(lat) user.lon = float(lon) user.geoLocation = ndb.GeoPt(float(lat), float(lon)), and whenever I drag the map or zoom in or out I get the following parameters in my controller: def get(self): """ This is an ajax function. It gets the place name, north_east, and south_west …
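Because Datastore allows inequality filters on only one property, a common approach for a bounding-box search is to query on latitude and narrow longitude in memory (or switch to geohash-style prefix queries for large datasets). A minimal ndb sketch under that assumption; the model and parameter names are illustrative:

from google.appengine.ext import ndb

class UserLocation(ndb.Model):
    lat = ndb.FloatProperty()
    lon = ndb.FloatProperty()

def users_in_bounds(south_west, north_east, limit=200):
    sw_lat, sw_lon = south_west
    ne_lat, ne_lon = north_east
    # Only one property may carry inequality filters, so latitude is
    # filtered in the query and longitude is filtered in memory.
    candidates = UserLocation.query(
        UserLocation.lat >= sw_lat,
        UserLocation.lat <= ne_lat).fetch(limit)
    return [u for u in candidates if sw_lon <= u.lon <= ne_lon]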

Entity Group - deciding on how to group

Submitted by ☆樱花仙子☆ on 2019-12-11 02:54:29
Question: I've read throughout the Internet that the Datastore has a limit of 1 write per second per entity group. Most of what I read indicates a "write to an entity", which I would understand as an update. Does the 1 write per second also apply to adding entities to the group? A simple case would be a Thread where multiple posts can be added by different users. The way I see it, it's logical to have the Thread be the ancestor of the Posts, thus forming a wide entity group. If the answer to my …
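Adding a child entity is a write to the same entity group, so a busy Thread ancestor shares the throughput guideline with its Posts. If strong ancestor-query consistency is not required, one alternative is to link posts with a KeyProperty instead of an ancestor, so each post lives in its own group. A minimal ndb sketch of that trade-off (model names are illustrative):

from google.appengine.ext import ndb

class Thread(ndb.Model):
    title = ndb.StringProperty()

class Post(ndb.Model):
    # A KeyProperty reference instead of an ancestor: every Post is
    # its own entity group, so many users can post concurrently
    # without sharing one group's write throughput. The cost is that
    # the query below is only eventually consistent.
    thread = ndb.KeyProperty(kind=Thread)
    body = ndb.TextProperty()
    created = ndb.DateTimeProperty(auto_now_add=True)

def recent_posts(thread_key, limit=20):
    return (Post.query(Post.thread == thread_key)
                .order(-Post.created)
                .fetch(limit))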

When loading data from Datastore into BigQuery with the command line tool, what determines the inclusion of subfields?

Submitted by 此生再无相见时 on 2019-12-11 02:52:12
Question: While using the command-line tool to load from Datastore into BigQuery, I've noticed the following strange behaviour. When I specify which fields to include using the projection_fields option, there is one rather complex nested field whose subfields are not all included. I can determine no pattern in the selection of subfields. Strangely, if I don't specify projection_fields (i.e. include all fields), all subfields are included. (At least I have to assume so, because one of these subfields is …