google-cloud-datastore

Are datastore indexes same across multiple namespaces?

 ̄綄美尐妖づ submitted on 2019-12-02 04:02:19
Question: I'm developing a multi-tenant SaaS system and storing all the data in Datastore. I have a separate namespace for every client, but the same set of "kinds" in all namespaces. My question: if I build a custom index for one entity kind, will that index be served across all namespaces, or do I also have to specify the namespace somewhere? This is one of my custom indices:

    - kind: loginTrack
      ancestor: no
      properties:
      - name: logDate
      - name: username
      - name: timeStamp

I deployed …

Understanding “CancellationException: Task was cancelled” error while doing a Google Datastore query

孤者浪人 submitted on 2019-12-02 03:52:10
I'm using Google App Engine v1.9.48. During some of my datastore queries I randomly get a "CancellationException: Task was cancelled" error, and I'm not really sure what exactly is causing it. From other Stack Overflow posts I vaguely understand that this has to do with timeouts, but not entirely what triggers them. I'm not using any TaskQueues, if that helps. Below is the stack trace:

    java.util.concurrent.CancellationException: Task was cancelled.
        at com.google.common.util.concurrent.AbstractFuture.cancellationExceptionWithCause(AbstractFuture.java:1126)
        at com…
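
This exception typically surfaces when an internal asynchronous datastore future is cancelled because the request's deadline expired, so the usual mitigations are raising the datastore call deadline or retrying the query. A library-free retry sketch (illustrative only, not the App Engine API; `TimeoutError` here stands in for whatever timeout exception the client actually raises):

```python
import time

def retry(fn, attempts=3, backoff=0.01):
    """Retry a callable with exponential backoff on timeouts (sketch only)."""
    for i in range(attempts):
        try:
            return fn()
        except TimeoutError:
            if i == attempts - 1:
                raise  # out of attempts: propagate the timeout
            time.sleep(backoff * (2 ** i))
```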

Is it guaranteed that numeric auto-incremented ID of new entity always bigger than existing IDs?

亡梦爱人 submitted on 2019-12-02 02:26:14
Question: Is it guaranteed that the auto-incremented ID of a new entity is always bigger than existing IDs? Basically, I want to periodically dump entities (e.g. Comment) into big blobs in a background task as they get created by customers. So if there are 100 entities right now, I'll store them in a blob and create a helper entity for that blob, like:

    class BlobRange {
        long fromId;  // Comment.id
        long toId;    // Comment.id
        String blobKey;
    }

Next time the background task would find the biggest BlobRange.toId and would fetch new …

Google Developer Console Internal Error with Datastore

ぃ、小莉子 submitted on 2019-12-02 01:20:23
I still use the legacy dev console most of the time because whenever I've tried the newer console it has always given an "Internal Error" when trying to view datastore entities or the Queries section. I was hoping the new, more colorful dev console would finally fix this problem, but it gives the same internal error. The only thing I don't get errors for is looking at Datastore Indexes; that seems to work fine. But Dashboard, Query, and Settings all fail to load with the same "Internal Error". Please let me know if there's anything I can do to fix this. I'm very worried that you're going to …

Google Datastore Emulator using Java (Not using GAE)

前提是你 submitted on 2019-12-02 01:19:39
I am using Google Cloud's Datastore Client Library for Java to access Cloud Datastore. Note: I am not using App Engine to deploy my application; I'm just running a local application for development purposes. Following the example, I can read/write to Cloud Datastore:

    Datastore datastore = DatastoreOptions.defaultInstance().service();
    KeyFactory keyFactory = datastore.newKeyFactory().setKind("MyKind");
    Key key = keyFactory.newKey();
    Entity entity = datastore.get(key);

I want to be able to write to a local Datastore emulator instance instead. Following the guide here, I run gcloud beta …
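
For context, the emulator is just a local HTTP endpoint, so clients are pointed at it rather than at the live service. A minimal sketch of the environment-variable route (the host/port and project id below are assumptions; the Java client can alternatively be pointed at the emulator host through its DatastoreOptions builder):

```python
import os

# Assumes the emulator is already running locally, e.g. started with:
#   gcloud beta emulators datastore start
# Client libraries that support the emulator read its address from this
# environment variable instead of calling the live Datastore service.
os.environ["DATASTORE_EMULATOR_HOST"] = "localhost:8081"  # assumed port
os.environ["DATASTORE_PROJECT_ID"] = "my-local-project"   # hypothetical id
```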

In Google App Engine, how to check input validity of Key created by urlsafe?

爱⌒轻易说出口 submitted on 2019-12-02 01:08:16
Question: Suppose I create a key from a user-supplied websafe URL string:

    key = ndb.Key(urlsafe=some_user_input)

How can I check whether some_user_input is valid? My experiments show that the statement above throws a ProtocolBufferDecodeError ("Unable to merge from string.") if some_user_input is invalid, but I could not find anything about this in the API documentation. Could someone kindly confirm this, and point me to a better way of checking user-input validity than catching the exception? Thanks a …
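
For background, an ndb urlsafe key is a websafe-base64 encoding of the key's underlying protobuf, which is why bad input only fails at decode time; the common answer is exactly the pattern the poster found, wrapping ndb.Key(urlsafe=...) in try/except and catching ProtocolBufferDecodeError (plus TypeError for non-string input). A library-free sketch of a cheap pre-filter, for illustration only:

```python
import base64

def looks_like_urlsafe_key(s):
    """Cheap pre-check: reject input that is not even websafe base64.

    Illustration only -- a string can pass this check and still fail
    inside ndb.Key(urlsafe=...), so the try/except is still required.
    """
    try:
        padded = s + "=" * (-len(s) % 4)  # restore stripped '=' padding
        base64.b64decode(padded, altchars="-_", validate=True)
        return True
    except (TypeError, ValueError):
        return False
```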

Apache Beam Google Datastore ReadFromDatastore entity protobuf

本小妞迷上赌 submitted on 2019-12-02 00:13:21
I am trying to use Apache Beam's Google Datastore API to ReadFromDatastore:

    p = beam.Pipeline(options=options)
    (p
     | 'Read from Datastore' >> ReadFromDatastore(gcloud_options.project, query)
     | 'reformat' >> beam.Map(reformat)
     | 'Write To Datastore' >> WriteToDatastore(gcloud_options.project))

The object that gets passed to my reformat function is of type google.cloud.proto.datastore.v1.entity_pb2.Entity. It is in protobuf format, which is hard to modify or read. I think I can convert an entity_pb2.Entity to a dict with

    entity = dict(google.cloud.datastore.helpers._property_tuples(entity_pb))

But for …
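
Once the entity's properties are in a plain dict (via the private helper mentioned above, or a public converter if the installed google-cloud-datastore version provides one), the reformat step itself is ordinary dict manipulation. A minimal sketch with hypothetical property names:

```python
def reformat(props):
    """Sketch of a reformat step over a property dict (hypothetical fields)."""
    out = dict(props)  # copy so the input mapping is not mutated
    if "username" in out:
        out["username"] = out["username"].lower()  # example transform
    return out
```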

Datastore fetch vs fetch(keys_only=True) then get_multi

邮差的信 submitted on 2019-12-01 23:20:27
I am fetching 100+ entities from Datastore using the query below:

    return entity.query(ancestor=ancestorKey).filter(entity.year == myStartYear).order(entity.num).fetch()

This was taking a long time (on the order of a few seconds) to load. Trying to find an optimal approach, I created exactly 100 entities and found that it takes anywhere between 750 ms and 1000 ms to fetch those 100 entities on the local server, which is of course a lot. I am not sure how to get around a single-line fetch to make it more efficient! In a desperate attempt to optimize, I tried removing the order part, and still got the same …
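
For reference, the alternative named in the title first runs the query with keys_only=True (cheap: no entity payloads) and then materializes the entities with get_multi, which under ndb can be served from its caches. A toy, library-free sketch of the shape of that two-step pattern (the dicts stand in for the datastore and ndb's cache; all names are hypothetical):

```python
store = {1: "entity-1", 2: "entity-2", 3: "entity-3"}  # stands in for Datastore
cache = {1: "entity-1", 2: "entity-2"}                 # stands in for ndb's cache

def fetch_keys_only():
    # Stands in for query(...).fetch(keys_only=True): keys, no payloads.
    return sorted(store)

def get_multi(keys):
    # Serve each key from the cache when possible, else hit the store.
    return [cache.get(k, store[k]) for k in keys]

entities = get_multi(fetch_keys_only())
```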