google-cloud-datastore

Google App Engine Versioning in the Datastore

Submitted by 戏子无情 on 2019-12-03 05:58:26
Question: Google App Engine has the concept of app versions, i.e., you can have multiple versions of your app running concurrently and accessible at different subdomains, for instance http://1.my-app-name.appspot.com and http://2.my-app-name.appspot.com. What aspects of the app are actually "versioned" by this? Is it only the Python + static files codebase? Does the datastore have the concept of "versions"? If not, then what happens when I update the definition of a Google App Engine model? Thanks!
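All versions share one datastore: deploying version 2 does not give it its own copy of your entities, only its own copy of the code and static files. A minimal sketch (the Note model is hypothetical, not from the question) showing the same entity being visible from every deployed version:

    import os
    from google.appengine.ext import db

    class Note(db.Model):  # hypothetical model, for illustration only
        text = db.StringProperty()
        written_by_version = db.StringProperty()

    # The same handler code deployed as version "1" and version "2" writes to
    # the single datastore that all versions of the app share.
    note = Note(text='hello',
                written_by_version=os.environ.get('CURRENT_VERSION_ID', 'dev'))
    note.put()

    # Any other version sees the entity, not just the version that wrote it.
    for n in Note.all():
        print n.text, n.written_by_version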

Fetching just the Key/id from a ReferenceProperty in App Engine

Submitted by 两盒软妹~` on 2019-12-03 05:57:20
Question: I could use a little help in AppEngine land... Using the [Python] API I create relationships like this example from the docs: class Author(db.Model): name = db.StringProperty() class Story(db.Model): author = db.ReferenceProperty(Author) story = db.get(story_key) author_name = story.author.name As I understand it, that example will make two datastore queries: one to fetch the Story and then one to dereference the Author in order to access the name. But I want to be able to fetch the id, so do
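The usual way to read the referenced key without triggering the second fetch is get_value_for_datastore() on the model class; a short sketch based on the models quoted above:

    from google.appengine.ext import db

    class Author(db.Model):
        name = db.StringProperty()

    class Story(db.Model):
        author = db.ReferenceProperty(Author)

    story = db.get(story_key)  # story_key obtained elsewhere, as in the question

    # Returns the raw db.Key stored in the property without dereferencing the
    # Author, so no second datastore query is issued.
    author_key = Story.author.get_value_for_datastore(story)
    author_id_or_name = author_key.id_or_name()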

What is the purpose of ancestors in the google app engine datastore?

Submitted by 删除回忆录丶 on 2019-12-03 05:49:48
I have been playing around with Google App Engine and its datastore recently and created a data model and relationships using reference properties. However, I am unclear about the concept of ancestors with respect to the datastore. What is their purpose and why should I use them? How do they relate to reference properties of datastore entities? Another benefit of entity groups/ancestors is to create islands of strong consistency (as opposed to eventual consistency). For instance, you could have a project and its tasks. Without ancestors, you could 'close' a task, go back to the tasks list screen for the
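A rough sketch of how parents and ancestor queries fit together (the Project/Task models are assumptions for illustration, echoing the project-and-tasks example above):

    from google.appengine.ext import db

    class Project(db.Model):
        title = db.StringProperty()

    class Task(db.Model):
        title = db.StringProperty()
        closed = db.BooleanProperty(default=False)

    project = Project(title='Launch')
    project.put()

    # parent= places the Task in the Project's entity group.
    task = Task(parent=project, title='Write docs')
    task.put()

    # An ancestor query is strongly consistent: a task closed just before this
    # query runs is guaranteed to show up as closed in the result.
    open_tasks = Task.all().ancestor(project).filter('closed =', False).fetch(100)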

Appengine filter inequality and ordering fails

Submitted by a 夏天 on 2019-12-03 05:43:44
Question: I think I'm overlooking something simple here; I can't imagine this is impossible to do. I want to filter by a datetime attribute and then order the result by a ranking integer attribute. When I try to do this: query.filter("submitted >=", thisweek).order("ranking") I get the following: BadArgumentError: First ordering property must be the same as inequality filter property, if specified for this query; received ranking, expected submitted Huh? What am I missing? Thanks. Answer 1: The datastore isn
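The error means the datastore insists that the first sort order match the property used in the inequality filter. Two common workarounds, sketched against a hypothetical Item model with submitted and ranking properties:

    # Option 1: order on the inequality property first, then on ranking
    # (results come back grouped by submitted date, then by ranking).
    q1 = Item.all().filter('submitted >=', thisweek).order('submitted').order('ranking')

    # Option 2: filter in the datastore and sort by ranking in memory,
    # practical when the filtered result set is reasonably small.
    q2 = Item.all().filter('submitted >=', thisweek)
    results = sorted(q2.fetch(1000), key=lambda e: e.ranking)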

How to sort responses in Objectify?

Submitted by 醉酒当歌 on 2019-12-03 05:40:59
I'm currently building an app for deployment to GAE, using Objectify 3.1. I am getting strange results when attempting to do a query with an order() clause. My domain: public class InvoiceLineItem { private int units; private BigDecimal unitCost; private BigDecimal extendedCost; private String description; @Parent Key<Invoice> invoice; } I am attempting to gather all of the InvoiceLineItems associated with a given Invoice using the following: ofy().query(InvoiceLineItem.class).ancestor(invoiceKey).list(); In my test case, this works just fine, returning 2 rows as expected. However, when I

Schema-less design guidelines for Google App Engine Datastore and other NoSQL DBs

Submitted by 半世苍凉 on 2019-12-03 02:47:43
Coming from a relational database background, as I'm sure many others are, I'm looking for some solid guidelines for setting up / designing my datastore on Google App Engine. Are there any good rules of thumb people have for setting up these kinds of schema-less data stores? I understand some of the basics such as denormalizing since you can't do joins, but I was wondering what other recommendations people had. The particular simple example I am working with concerns storing searches and their results. For example I have the following two models defined in my Google App Engine app using Python
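As a hedged illustration of the denormalization point (the model names are assumptions, not the two models from the original post), a search and its results can be folded into one entity so that a single get returns everything needed to render the page:

    from google.appengine.ext import db

    # Instead of a Search entity joined to many SearchResult entities,
    # store the results inline with the search itself.
    class SavedSearch(db.Model):
        term = db.StringProperty()
        created = db.DateTimeProperty(auto_now_add=True)
        result_urls = db.StringListProperty()    # denormalized result list
        result_titles = db.StringListProperty()  # parallel list, same order

    search = SavedSearch(term='app engine',
                         result_urls=['http://example.com/a'],
                         result_titles=['Result A'])
    search.put()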

What is the correct way to get the previous page of results given an NDB cursor?

Submitted by ℡╲_俬逩灬. on 2019-12-03 02:04:32
I'm working on providing an API via GAE that will allow users to page forwards and backwards through a set of entities. I've reviewed the section about cursors on the NDB Queries documentation page, which includes some sample code that describes how to page backwards through query results, but it doesn't seem to be working as desired. I'm using GAE Development SDK 1.8.8. Here's a modified version of that example that creates 5 sample entities, gets and prints the first page, steps forward to and prints the second page, and attempts to step backwards and print the first page again: import
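For reference, the documented pattern keeps a forward query and a reversed query and feeds the reversed one a cursor.reversed(); a minimal sketch along those lines, using a hypothetical Item model (whether it behaves as desired is exactly what the question is about):

    from google.appengine.ext import ndb

    class Item(ndb.Model):  # hypothetical model for illustration
        rank = ndb.IntegerProperty()

    forward = Item.query().order(Item.rank)
    reverse = Item.query().order(-Item.rank)

    # Page 1, then page 2 going forward.
    page1, cur1, more1 = forward.fetch_page(2)
    page2, cur2, more2 = forward.fetch_page(2, start_cursor=cur1)

    # Step back to page 1: reverse the cursor and run the reversed query,
    # then flip the results back into forward order.
    prev_page, prev_cur, prev_more = reverse.fetch_page(2, start_cursor=cur1.reversed())
    prev_page.reverse()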

How do I query in GQL using the entity key

Submitted by 倖福魔咒の on 2019-12-03 01:50:37
Question: How do I write a query against the entity key using GQL in the Google App Engine Data Viewer? In the viewer, the first column (Id/Name) displays as name=_1, and in the detail view it shows the key as Decoded entity key: Programme: name=_1 Entity key: agtzcG9................... This query does not work: SELECT * FROM Programme where name = '_1' Answer 1: You can use the entity's key to retrieve it: SELECT * FROM Programme where __key__ = KEY('agtzcG9...................') And, you should be able to
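The same lookup can be done from application code; a hedged sketch using the old db API, assuming the key name really is the literal string '_1' shown in the viewer:

    from google.appengine.ext import db

    # Build the key from kind and key name instead of the encoded string,
    # then fetch the entity directly (no GQL needed).
    key = db.Key.from_path('Programme', '_1')
    programme = db.get(key)

    # GQL equivalent against the encoded key shown in the viewer:
    #   SELECT * FROM Programme WHERE __key__ = KEY('agtzcG9...')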

How do dynamic backends start in Google App Engine

Submitted by [亡魂溺海] on 2019-12-03 01:13:21
Can we start a dynamic backend programmatically? Meanwhile, when a backend is starting, how can I handle the request by falling back on the application (I mean app.appspot.com)? When I stop a backend manually in the admin console and send a request to it, it does not start "dynamically". Dynamic backends come into existence when they receive a request, and are turned down when idle; they are ideal for work that is intermittent or driven by user activity. Resident backends run continuously, allowing you to rely on the state of their memory over time and perform complex initialization. http://code
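Since a dynamic backend spins up when it receives traffic, the usual way to start one programmatically is to address a request at it, for example with a task queue task targeted at the backend. A rough sketch, assuming a backend named 'worker' is defined in backends.yaml:

    from google.appengine.api import backends, taskqueue

    # A task with target='worker' is routed to the backend; if the dynamic
    # backend has no running instance, App Engine starts one to handle it.
    taskqueue.add(url='/work/process', target='worker', params={'job_id': '123'})

    # Alternatively, build the backend's URL and send it a request directly.
    worker_url = backends.get_url('worker') + '/work/process'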

Improve throughput of ndb query over large data

Submitted by 大兔子大兔子 on 2019-12-03 00:45:29
I am trying to perform some data processing in a GAE application over data that is stored in the Datastore. The bottleneck is the throughput at which the query returns entities, and I wonder how to improve the query's performance. What I do in general: everything works in a task queue, so we have plenty of time (the 10-minute deadline). I run a query over the ndb entities in order to select which entities need to be processed. As the query returns results, I group entities in batches of, say, 1000 and send them to another task queue for further processing. The stored data is going to be large
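A minimal sketch of that batching pattern (the helper names are assumptions): a keys-only query with a larger batch size, handing each batch of keys off to the task queue via deferred:

    from google.appengine.ext import deferred, ndb

    def process_batch(keys):
        # Runs in its own task queue task; downstream processing is hypothetical.
        entities = ndb.get_multi(keys)
        # ... process entities ...

    def fan_out(query, batch_size=1000):
        # keys_only avoids pulling full entities out of the query; batch_size
        # cuts down the number of round trips to the datastore.
        batch = []
        for key in query.iter(keys_only=True, batch_size=batch_size):
            batch.append(key)
            if len(batch) >= batch_size:
                deferred.defer(process_batch, batch)
                batch = []
        if batch:
            deferred.defer(process_batch, batch)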