google-cloud-datastore

App Engine: Few big scripts or many small ones?

Submitted by ♀尐吖头ヾ on 2019-11-28 14:34:35
I am working on a website that I want to host on App Engine, with backend scripts written in Python. Say a user can register on my website and have a user profile. The user profile is fairly extensive and has more than 50 different ndb properties (just for the sake of an example). If the user wants to edit his records (which he can, to some extent), he may do so through my website, which sends a request to the App Engine backend. The way the profile is sectioned, about 5 to 10 properties typically fall into a small subsection or container of the page. On the server side I would have
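The subsection-update idea can be sketched in plain Python, independent of ndb. The subsection and field names below are illustrative assumptions, not from the question: validate that the submitted keys belong to one subsection, then apply only those.

```python
# Minimal sketch of a per-subsection partial update (no ndb involved).
# Subsection and field names are hypothetical, for illustration only.
SUBSECTIONS = {
    "contact": {"email", "phone", "city"},
    "display": {"nickname", "avatar_url"},
}

def apply_subsection_update(profile, subsection, updates):
    """Apply only fields that belong to the named subsection; reject others."""
    allowed = SUBSECTIONS[subsection]
    unknown = set(updates) - allowed
    if unknown:
        raise ValueError("fields not in subsection %r: %s"
                         % (subsection, sorted(unknown)))
    result = dict(profile)  # copy; a real handler would mutate an ndb entity
    result.update(updates)
    return result

profile = {"email": "a@example.com", "phone": "555-0100", "nickname": "al"}
updated = apply_subsection_update(profile, "contact", {"phone": "555-0199"})
```

With a real ndb entity the same whitelist check would precede a single `put()`, so one request updates one subsection's 5-10 properties at a time.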

Datastore: Multiple writes against an entity group inside a transaction exceeds write limit?

Submitted by 三世轮回 on 2019-11-28 14:20:49
I'm familiar with Datastore's one-write-per-second limit (OK, maybe 5) for entity groups. How does that square with transactions? The docs seem to indicate that I can make multiple modifications inside a transaction, for example adding several descendant entities: "A single transaction can modify multiple entities in a single group, or add new entities to the group by making the new entity's parent an existing entity in the group." https://cloud.google.com/appengine/docs/java/datastore/transactions

Dan Cornilescu: Yes, you can do multiple write operations per entity group inside the same
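The practical consequence is that the throughput guideline applies per commit to an entity group, not per entity: batching N descendant writes into one transaction consumes one "write slot", not N. A back-of-the-envelope helper (plain Python; the 500-mutations-per-commit figure is the documented Datastore limit, but treat it as an assumption to verify against current docs):

```python
import math

MUTATIONS_PER_COMMIT = 500  # documented Datastore cap on writes per commit

def commits_needed(num_entities, batch=MUTATIONS_PER_COMMIT):
    """Each committed transaction against a single entity group counts once
    toward the ~1 write/sec/group throughput guideline, regardless of how
    many entities it contains (up to the per-commit mutation limit)."""
    return math.ceil(num_entities / batch)

# e.g. 1200 descendant entities batched into transactions of 500:
# 3 commits, so roughly 3 seconds of sustained throughput -- not 1200.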

Best option for Google App Engine Datastore and external database?

Submitted by ε祈祈猫儿з on 2019-11-28 14:08:05
I need to get an App Engine app talking to and sharing data with an external database. The best option I can come up with is outputting the external database's data to an XML file, then processing it in my App Engine app and storing it in the datastore. However, the data being shared is sensitive, such as login details, so outputting it to an XML file is not exactly a great idea. Is it possible for the App Engine app to query the database directly? Or is there a secure option for using XML files? I'm using Python/Django, and the external database will be hosted on another

Any issues using multiple GAE app versions to get multiple apps to share the same datastore?

Submitted by 我们两清 on 2019-11-28 14:06:44
According to the research I've done (see for example this gae issue and this stack overflow question ), it is not possible to share one datastore across two applications, and most folks recommend using either the RemoteAPI or using multiple "versions" of the same application, where each version is really an entirely different application. According to GoogleAppEngine Issue 1300 , allowing multiple GAE applications to share the same datastore has been "accepted" which presumably means that this feature may be officially supported some day. I'm hesitant to use the RemoteAPI because I suspect

Doing a “IN Array” query on google app engine datastore with golang

Submitted by 喜夏-厌秋 on 2019-11-28 13:43:58
Is there a way to do a query with `ids []int64` on the Datastore? I've tried the following, to no avail.

Errors out:

```go
q := datastore.NewQuery("Category").Filter("Id IN", ids)
```

Just gets me all the categories in the Datastore:

```go
for _, id := range ids {
    q.Filter("Id =", id)
}
```

After icza's answer:

```go
var keys []*datastore.Key
for _, id := range ids {
    keys = append(keys, datastore.NewKey(c, "Category", "", id, nil))
}
categories := make([]Category, len(keys))
err := datastore.GetMulti(c, keys, categories)
if err != nil {
    return nil, err
}
```

Generally "IN" filters are not supported by the Datastore. The
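The workaround in the answer (fetch by key instead of filtering) can be modeled in plain Python: since "IN" isn't a supported filter, the N ids become N direct key lookups done as one batch. The dict below stands in for the datastore; it is an illustration of the access pattern, not the API.

```python
# Plain-Python model of the GetMulti workaround: instead of an unsupported
# "Id IN [...]" filter, build the keys and fetch each entity directly.
# `store` is just a dict keyed by numeric id -- a stand-in for the Datastore.
store = {1: {"name": "books"}, 2: {"name": "music"}, 5: {"name": "games"}}

def get_multi(ids):
    """Fetch entities by key; missing ids come back as None, mirroring how a
    by-key batch get behaves compared to a filter query."""
    return [store.get(i) for i in ids]

categories = get_multi([1, 5, 7])
```

The key point is that a by-key batch get returns a slot per requested id (including misses), whereas a query only returns matches; code consuming the result must handle the `None`/missing case.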

What database does Google use?

Submitted by 牧云@^-^@ on 2019-11-28 13:06:53
Question: Is it Oracle or MySQL, or something they have built themselves?

Answer 1: Bigtable ("Bigtable: A Distributed Storage System for Structured Data"). Bigtable is a distributed storage system (built by Google) for managing structured data that is designed to scale to a very large size: petabytes of data across thousands of commodity servers. Many projects at Google store data in Bigtable, including web indexing, Google Earth, and Google Finance. These applications place very different demands on Bigtable, both in

Using gcloud-python in GAE

Submitted by 空扰寡人 on 2019-11-28 12:36:04
I've got a bunch of little Raspberry Pis running some Python code which saves directly to the Datastore (skipping GAE) using the gcloud-python datastore package. This works great. I now want to present the data via web and mobile clients using Google App Engine. On my MacBook I installed GAE using the installer and gcloud via pip. I can write a simple Python script and execute it directly from the terminal, and it writes to and reads from the datastore via gcloud just fine. However, when I try to incorporate that same code into GAE, it fails. Based on my research, I expect

Google App Engine Datastore - Testing Queries fails

Submitted by 不打扰是莪最后的温柔 on 2019-11-28 12:33:19
I am currently trying to test a piece of my code that runs a query on the datastore before putting in a new entity, to ensure that duplicates are not created. The code I wrote works fine in the context of the app, but the tests I wrote for that method are failing. It seems that I cannot access data put into the datastore through queries in the context of the testing package. One clue might lie in the output from goapp test, which reads: "Applying all pending transactions and saving the datastore." This line prints out after both the get and put methods are called (I verified this with log

What does Google classify as a datastore write operation in Google App Engine?

Submitted by 倾然丶 夕夏残阳落幕 on 2019-11-28 11:14:31
Since GAE moved to the new pricing model at the start of last week, I have been wrestling with exceeding my quota of Datastore read and write operations. I'm not sure whether Google counts all updates in a single put as one write, or whether every column update is counted as a separate write. If the latter is true, could I get around this by having one update function that updates the 6 columns passed in its parameters, or will I still get charged for 6 updates? Here is my existing code, used to update a player's score (rating) and the other details at the same time. At the moment I always populate name, email,
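Under the legacy App Engine billing model the billed unit was the put(), not the column: one put() that changes 6 properties is a single entity write plus index writes for each modified indexed property. A rough estimator follows; the per-index constants are assumptions recalled from the old pricing docs, so verify them against current pricing before relying on the numbers.

```python
def estimated_write_ops(modified_indexed_props, new_entity=False):
    """Rough write-op estimate under the legacy GAE Datastore billing model.
    ASSUMED costs (check current pricing docs before trusting these):
      new entity:      2 ops + 2 per indexed property value
      updated entity:  1 op  + 4 per modified indexed property value
    Either way, a single put() touching 6 properties is billed as one
    entity write plus index writes -- not as six separate writes."""
    if new_entity:
        return 2 + 2 * modified_indexed_props
    return 1 + 4 * modified_indexed_props
```

So batching the 6 column changes into one put() is cheaper than 6 separate puts regardless of the exact constants; making rarely-queried properties unindexed reduces the cost further.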

Importing Entities Into Local GCP Datastore Emulator

Submitted by 孤人 on 2019-11-28 10:27:36
Question: I was able to export entities into a storage bucket without much difficulty with this command:

```shell
gcloud datastore export --kinds="KIND1,KIND2" --namespaces="NAMESPACE1,NAMESPACE2" gs://${BUCKET}
```

And according to the docs, importing can be done like this:

```shell
gcloud datastore import gs://${BUCKET}/[PATH]/[FILE].overall_export_metadata
```

or like this:

```shell
curl \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  https://datastore.googleapis.com/v1/projects/$