google-cloud-datastore

Google Datastore: combine (union) multiple sets of entity results to achieve an OR condition

Question: I am working with NodeJS on Google App Engine with the Datastore database. Because Datastore does not support the OR operator, I need to run multiple queries and combine the results. I am planning to run multiple queries and then combine the results into a single array of entity objects. I have a single query working already. Question: What is a reasonably efficient way to combine two (or more) sets of entities returned by Datastore, including de-duplication? I believe this…
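
A minimal sketch of the merge-and-deduplicate idea, written in Python with the google-cloud-datastore client (the same pattern carries over to the Node.js client); the Task kind and status property are illustrative assumptions, not the asker's schema:

    from google.cloud import datastore

    client = datastore.Client()

    def fetch_matching(kind, prop, value):
        # One equality-filtered query; Datastore has no OR, so run several.
        query = client.query(kind=kind)
        query.add_filter(prop, '=', value)
        return list(query.fetch())

    # Emulate: status == 'new' OR status == 'open'
    results = (fetch_matching('Task', 'status', 'new')
               + fetch_matching('Task', 'status', 'open'))

    # Entity keys are unique, so keying a dict on them de-duplicates in O(n).
    merged = list({entity.key: entity for entity in results}.values())

Deduplicating on the key rather than on property values is the safer choice, since two distinct entities can hold identical data.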

Copy an entity in Google App Engine datastore in Python without knowing property names at 'compile' time

In a Python Google App Engine app I'm writing, I have an entity stored in the datastore that I need to retrieve, make an exact copy of (with the exception of the key), and then put back in. How should I do this? In particular, are there any caveats or tricks I need to be aware of so that I get a copy of the sort I expect and not something else? ETA: Well, I tried it out and I did run into problems. I would like to make my copy in such a way that I don't have to know the names of the properties when I write the code. My thinking was to do this: #theThing = a…
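
A widely circulated recipe for the old db API handles exactly this by reading the property descriptors off the model class, so no property names are hard-coded; a sketch (Python 2, as the db API requires):

    from google.appengine.ext import db

    def clone_entity(e, **extra_args):
        """Return an unsaved copy of e, minus its key; extra_args overrides."""
        klass = e.__class__
        props = dict((k, v.__get__(e, klass))
                     for k, v in klass.properties().iteritems())
        props.update(extra_args)
        return klass(**props)

    # copy = clone_entity(the_thing)
    # copy.put()  # the copy receives a fresh key on put()

One caveat often mentioned with this recipe: properties with auto_now/auto_now_add timestamps are refreshed on the copy rather than carried over, which may be the sort of problem the asker ran into.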

Fetching a random record from the Google App Engine Datastore?

Question: I have a datastore with around 1,000,000 entities in a model. I want to fetch 10 random entities from it. I am not sure how to do this; can someone help? Answer 1: Assign each entity a random number and store it in the entity. Then query for ten records whose random number is greater than (or less than) some other random number. This isn't totally random, however, since entities with nearby random numbers will tend to show up together. If you want to beat this, do ten queries based around ten…
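
A sketch of Answer 1's scheme in Python ndb (the Item kind and rand property are illustrative assumptions):

    import random
    from google.appengine.ext import ndb

    class Item(ndb.Model):
        rand = ndb.FloatProperty()

    # At write time, stamp every entity with a random number in [0, 1).
    Item(rand=random.random()).put()

    # To fetch one pseudo-random entity, pick a pivot and query past it.
    pivot = random.random()
    entity = Item.query(Item.rand >= pivot).get()
    if entity is None:
        # The pivot landed above every stored value; wrap around.
        entity = Item.query(Item.rand < pivot).get()

For ten entities, repeat with ten independent pivots (as the answer suggests) rather than taking ten results past a single pivot, which would cluster entities with nearby random numbers together.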

App Engine BadValueError On Bulk Data Upload - TextProperty being construed as StringProperty

Question: bulkloader.yaml:

    transformers:
    - kind: ExampleModel
      connector: csv
      property_map:
      - property: __key__
        external_name: key
        export_transform: transform.key_id_or_name_as_string
      - property: data
        external_name: data
      - property: type
        external_name: type

model.py:

    class ExampleModel(db.Model):
        data = db.TextProperty(required=True)
        type = db.StringProperty(required=True)

Everything seems to be fine, yet when I upload I get this error: BadValueError: Property data is 24788 bytes long; it must be 500 or…
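
The error arises because, without an import hint, the bulkloader hands the CSV column to the model as a plain string, and string values are capped at 500 bytes. The commonly cited fix is an import_transform that wraps the value in db.Text so it lands in the TextProperty unrestricted:

    - property: data
      external_name: data
      import_transform: db.Text

import_transform is the bulkloader.yaml counterpart of the export_transform already used for the key above.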

Google Cloud DataStore automatic indexing

I'm working on a project whose indexes are already deployed to Google Cloud Datastore. We are missing index.yaml; is it possible to recreate index.yaml automatically? Answer 1: You can view the deployed version of index.yaml if it was uploaded together with your (standard environment) application code, typically with the default service/module. There is a Diagnose column in the table on the developer console's Versions page, with a Tools drop-down menu for each service version row and a Source option under it. Selecting that option directs you to the StackDriver page for the service, where you can see…
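
Another recovery path worth noting: the local development server automatically adds any index definitions your queries require to index.yaml, so running the app under dev_appserver.py and exercising its query paths can regenerate the file from scratch. A regenerated entry looks like this (the kind and property names below are made-up placeholders):

    indexes:
    - kind: Picture
      properties:
      - name: owner
      - name: created
        direction: desc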

GAE ndb design, performance and use of repeated properties

Say I have a picture gallery and a picture could potentially have 100k+ fans. Which ndb design is more efficient?

    class picture(ndb.Model):
        fanIds = ndb.StringProperty(repeated=True)
        # ... [other picture properties]

or

    class picture(ndb.Model):
        # ... [other picture properties]

    class fan(ndb.Model):
        pictureId = ndb.StringProperty()
        fanId = ndb.StringProperty()

Is there any limit on the number of items you can add to an ndb repeated property, and is there any performance hit with storing a large number of items in a repeated property? If it is less efficient to use repeated properties, what is their intended…
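
For scale: there is no explicit cap on the item count of a repeated property itself, but the entity as a whole must stay under roughly 1 MB and each value adds index entries (which are also capped per entity), so 100k+ fans cannot fit in the first design. A sketch of the second (fan-out) design in idiomatic ndb, with a KeyProperty swapped in for the string id as one possible refinement:

    from google.appengine.ext import ndb

    class Picture(ndb.Model):
        # ... other picture properties ...
        title = ndb.StringProperty()

    class Fan(ndb.Model):
        # One small entity per (picture, fan) pair; this scales past the
        # ~1 MB per-entity cap that limits the repeated-property design.
        picture_key = ndb.KeyProperty(kind=Picture)
        fan_id = ndb.StringProperty()

    # All fans of a picture:
    # Fan.query(Fan.picture_key == picture.key).fetch()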

How to upload data in bulk to the appengine datastore? Older methods do not work

Question: This should be a fairly common requirement, and a simple process: upload data in bulk to the appengine datastore. However, none of the older solutions mentioned on stackoverflow (links below*) seem to work anymore. The bulkloader method, which was the most reasonable solution for uploading to the datastore with the DB API, doesn't work with the NDB API. And now the bulkloader method seems to have been deprecated, and the old links, which are still present in the docs, lead to the wrong page.
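
One approach that still works is batching writes through the google-cloud-datastore client library; a minimal sketch (the ExampleModel kind and the rows iterable are assumptions standing in for whatever the CSV parsing produces):

    from google.cloud import datastore

    client = datastore.Client()
    BATCH = 500  # a single Datastore commit is limited to 500 mutations

    pending = []
    for row in rows:  # rows: an iterable of dicts parsed from your source file
        entity = datastore.Entity(key=client.key('ExampleModel'))
        entity.update(row)
        pending.append(entity)
        if len(pending) == BATCH:
            client.put_multi(pending)
            pending = []
    if pending:
        client.put_multi(pending)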

Secure Google Cloud Functions http trigger with auth

Question: I am trying out Google Cloud Functions today following this guide: https://cloud.google.com/functions/docs/quickstart I created a function with an HTTP trigger, and was able to perform a POST request to trigger the function to write to Datastore. I was wondering if there's a way I can secure this HTTP endpoint? Currently it seems that it will accept a request from anywhere/anyone. When googling around, I see most results talk about securing things with Firebase. However, I am not using the…
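
Absent Firebase, one simple option is to require a pre-shared secret on every request and reject everything else; a minimal sketch for a Python HTTP function (the API_KEY variable and X-Api-Key header name are illustrative assumptions, and IAM-based invoker permissions are another option Cloud Functions now offers):

    import os
    from flask import abort

    def handler(request):
        # Reject any request that does not carry the pre-shared secret.
        expected = os.environ.get('API_KEY')
        if not expected or request.headers.get('X-Api-Key') != expected:
            abort(403)
        # ... authorized: write to Datastore here ...
        return 'OK'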

How to browse local Java App Engine datastore?

Question: It seems there is no equivalent of Python App Engine's _ah/admin for the Java implementation of Google App Engine. Is there a manual way I can browse the datastore? Where are the files to be found on my machine? (I am using the App Engine plugin with Eclipse on OS X.) Answer 1: http://googleappengine.blogspot.com/2009/07/google-app-engine-for-java-sdk-122.html: "At long last, the dev appserver has a data viewer. Start your app locally and point your browser to http://localhost:8888/_ah/admin"…

Best practice to query large number of ndb entities from datastore

Question: I have run into an interesting limit with the App Engine datastore. I am creating a handler to help us analyze some usage data on one of our production servers. To perform the analysis I need to query and summarize 10,000+ entities pulled from the datastore. The calculation isn't hard; it is just a histogram of the items that pass a specific filter on the usage samples. The problem I hit is that I can't get the data back from the datastore fast enough to do any processing before hitting the query…
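
A common mitigation is to page through the results with cursors instead of one monolithic fetch; a minimal sketch, assuming an ndb query over the usage-sample kind and a caller-supplied bucketing function:

    from google.appengine.ext import ndb

    def histogram(query, bucket_fn, page_size=1000):
        # Accumulate bucket counts one page at a time, so no single RPC
        # has to return all 10,000+ entities at once.
        counts = {}
        cursor = None
        while True:
            page, cursor, more = query.fetch_page(page_size,
                                                  start_cursor=cursor)
            for entity in page:
                bucket = bucket_fn(entity)
                counts[bucket] = counts.get(bucket, 0) + 1
            if not more:
                return counts

Each fetch_page call is a bounded round trip, and the returned cursor lets a deferred task or a later request resume where the previous page left off.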