google-cloud-datastore

How to connect a GAE app and a GCE app to the same datastore locally?

Submitted by 早过忘川 on 2019-12-06 03:23:15
I am running into an issue similar to this one. I have a GAE app and a GCE app that seem to work fine in the cloud, but I am having trouble getting my local environment set up so that both of them access the same datastore. I have set up the local datastore as described in the link above, except my code looks like this (I had to build it this way in order to get it working in the cloud):

```java
print("Connecting to datastore - " + datasetId);
// Get credentials from GCE if not local.
Credential computeEngineCredential = DatastoreHelper.getComputeEngineCredential();
if (computeEngineCredential !=
```
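One common way to point two apps at the same local datastore is to drive the connection off a single environment variable that both processes read. The helper below is a minimal pure-Python sketch of that idea; the variable name `DATASTORE_HOST`, the port, and the URL shape are illustrative assumptions, not the asker's actual configuration or the real client library's API:

```python
import os

def datastore_endpoint(dataset_id, default_host="https://datastore.googleapis.com"):
    """Return the endpoint both the GAE and GCE app should use.

    If DATASTORE_HOST is set (e.g. to http://localhost:8080 while the
    local dev datastore is running), both apps talk to the same local
    instance; otherwise they fall back to the live service.
    """
    host = os.environ.get("DATASTORE_HOST", default_host)
    return "%s/datasets/%s" % (host, dataset_id)

# Local run: export DATASTORE_HOST=http://localhost:8080 for BOTH apps.
os.environ["DATASTORE_HOST"] = "http://localhost:8080"
print(datastore_endpoint("my-project"))  # http://localhost:8080/datasets/my-project
```

The point is that neither app hard-codes "local vs cloud"; the environment decides, so the GAE and GCE processes cannot drift apart.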

multiple filters vs OR , ndb query

Submitted by 不羁的心 on 2019-12-06 03:07:26
What is the difference between these queries?

With consecutive filters:

```python
qry1 = Account.query()                    # Retrieve all Account entities
qry2 = qry1.filter(Account.userid >= 40)  # Filter on userid >= 40
qry3 = qry2.filter(Account.userid < 50)   # Filter on userid < 50 as well
```

Using ndb.OR:

```python
qry = Article.query(ndb.OR(Account.userid >= 40, Account.userid < 50))
```

Using ndb.AND:

```python
qry = Article.query(ndb.AND(Account.userid >= 40, Account.userid < 50))
```

The first query does an AND: only the entities that match both inequalities will be returned by the query. The second query does an OR: entities that
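To see why the two combinators give different results, here is a plain-Python simulation of the filter semantics over a list of userids (ndb itself is not involved; this only mirrors the logic):

```python
userids = [10, 40, 45, 50, 90]

# Chained filters combine with AND: both inequalities must hold.
and_result = [u for u in userids if u >= 40 and u < 50]

# ndb.OR: an entity matches if either inequality holds.
or_result = [u for u in userids if u >= 40 or u < 50]

print(and_result)  # [40, 45]
print(or_result)   # [10, 40, 45, 50, 90]
```

Note that every integer is either >= 40 or < 50, so the OR form matches every entity, which is rarely what is intended with a pair of range bounds.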

Understanding Cost Estimate for Google Cloud Platform MicroServices Architecture Design

Submitted by ♀尐吖头ヾ on 2019-12-06 02:45:27
I'm redesigning a monolithic application into a microservices architecture and am hoping to host the entire solution on Google Cloud Platform (GCP). I'm having a very hard time understanding their cost breakdown, and am concerned that my costs will be uncontrollable after I build it. This is a personal project, but I'm hoping it will have many users after launch, so I want to get the underlying architecture right while keeping initial costs reasonable. Here is my architecture:

Microservices 1-4 (4 API services in total):
- Runs on App Engine
- Exposes a REST API

Google Datastore authentication issue - C#

Submitted by 北战南征 on 2019-12-06 02:41:45
I'm trying to connect to Google Datastore on my account with a service account credentials file (which I've created according to the documentation), but I'm hitting an authentication error while trying to insert an entity:

Grpc.Core.RpcException: Status(StatusCode=Unauthenticated, Detail="Exception occured in metadata credentials plugin.")

My code is:

```csharp
var db = DatastoreDb.Create("myprojectid");
Entity entity = new Entity
{
    Key = db.CreateKeyFactory("mykindname").CreateIncompleteKey()
};
var keys = await db.InsertAsync(new[] { entity });
```

The GOOGLE_APPLICATION_CREDENTIALS variable

Google App Engine - getting count of records that match criteria over 1000

Submitted by 让人想犯罪 __ on 2019-12-06 02:26:53
Question: I've read in multiple places that GAE lifted the 1000-record limit on queries and counts; however, I can only seem to get a count of up to 1000 records. I won't be pulling more than 1000 entities at a time, but the requirements are such that I need a count of the matching records. I understand you can use cursors to "paginate" through the dataset, but cycling through everything just to get a count seems a bit much. Presumably when they said they "lifted" the limit, it was the hard limit - you
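One way past the old 1000-entity ceiling is a keys-only query that pages with a cursor and only tallies batch sizes, so no more than one page is ever held in memory. The sketch below simulates that loop in plain Python; `fake_fetch_page` and the integer cursor are stand-ins for ndb's real `qry.fetch_page(page_size, start_cursor=...)`, which this does not call:

```python
def count_all(fetch_page, page_size=1000):
    """Count matches by paging; only the running total is kept in memory."""
    total, cursor, more = 0, None, True
    while more:
        batch, cursor, more = fetch_page(page_size, cursor)
        total += len(batch)
    return total

# A fake datastore of 2500 matching keys to stand in for the real query.
KEYS = list(range(2500))

def fake_fetch_page(page_size, cursor):
    start = cursor or 0
    batch = KEYS[start:start + page_size]
    next_cursor = start + len(batch)
    return batch, next_cursor, next_cursor < len(KEYS)

print(count_all(fake_fetch_page))  # 2500
```

With a keys-only projection each page is cheap, so this is the standard workaround when an exact count above the page size is required.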

How to update an NDB Model's Schema [closed]

Submitted by Deadly on 2019-12-06 02:25:31
Closed. This question needs details or clarity and is not currently accepting answers. Closed 4 years ago. I have seen the solution to this question using App Engine's older DB Datastore API, but cannot find a solution for the newer NDB API. What is the best way to add migration support, so that I am able to migrate from an old version of a schema to a new one? Would it be best to write a migration script, and how would this work? Something like migrating a schema like this in (Note that
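A common NDB pattern for this is a batch "map" job: query entities of the old shape, rewrite the changed fields, and put() them back, ideally so the job is safe to re-run. The sketch below shows the shape of such a migration over plain dicts; `migrate_entity`, the `schema_version` field, and the name-splitting rule are illustrative assumptions, not part of ndb or of the asker's models:

```python
def migrate_entity(entity):
    """Example migration: split a single 'name' field into first/last."""
    if entity.get("schema_version", 1) >= 2:
        return entity  # already migrated; safe to run the job repeatedly
    first, _, last = entity.pop("name").partition(" ")
    entity.update(first_name=first, last_name=last, schema_version=2)
    return entity

old = [{"name": "Ada Lovelace"}, {"name": "Alan Turing", "schema_version": 1}]
migrated = [migrate_entity(dict(e)) for e in old]
print(migrated[0]["first_name"], migrated[0]["last_name"])  # Ada Lovelace
```

In a real NDB job the loop body would fetch a page of entities, apply a function like this, and `ndb.put_multi` the batch; the version guard is what makes interrupted runs restartable.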

Google App Engine how to track httpsession destroy

Submitted by 老子叫甜甜 on 2019-12-06 02:15:37
Does anybody know how to track HttpSession destruction on GAE? I've found that HttpSessionListener doesn't work properly on GAE and the sessionDestroyed method is never called. To be more specific: I store some information in the database when a user logs in to the application, and if a user is inactive for some time I need to remove this info from the db. That would be easy if sessionDestroyed were invoked when such an event happens. For now I have a cron job that runs each minute, queries all data of this kind, determines in memory which entries are inactive, and removes them. But this is very
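Since sessionDestroyed is unreliable on GAE, the cron approach the asker describes reduces to: store a last-activity timestamp with each login record and sweep anything older than the timeout. A minimal sketch of that sweep in plain Python (the 30-minute timeout and the record shape are assumptions for illustration):

```python
import time

SESSION_TIMEOUT = 30 * 60  # 30 minutes, in seconds

def sweep_inactive(records, now=None):
    """Keep only login records whose last activity is within the timeout."""
    now = time.time() if now is None else now
    return {sid: r for sid, r in records.items()
            if now - r["last_active"] <= SESSION_TIMEOUT}

records = {
    "s1": {"user": "alice", "last_active": 1000.0},  # idle for 1900 s
    "s2": {"user": "bob", "last_active": 2500.0},    # idle for 400 s
}
print(sorted(sweep_inactive(records, now=2900.0)))  # ['s2']
```

The key design point is that the timeout decision is made from data the app already writes (the timestamp), so no servlet-container callback is needed; the cron job only has to make this sweep a cheap indexed query rather than an in-memory scan.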

App engine NDB: how to access verbose_name of a property

Submitted by 有些话、适合烂在心里 on 2019-12-06 02:13:20
Question: suppose I have this code:

```python
class A(ndb.Model):
    prop = ndb.StringProperty(verbose_name="Something")

m = A()
m.prop = "a string value"
```

Now of course if I print m.prop, it will output "a string value", even though prop is in fact a StringProperty instance. So verbose_name can't be accessed the "normal" way, i.e. m.prop._verbose_name. I read the code and found a way to access it: m._properties["prop"]._verbose_name. It works, but it looks hacky. So tell me, is there another way to do it? Note: I'm
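The "hacky" feeling comes from going through the instance: a Property is a descriptor on the class, so instance access yields the stored value while class access yields the property object itself. The toy descriptor below reproduces that behaviour in plain Python (it imitates ndb's mechanism; it is not ndb code, and whether `A.prop._verbose_name` works on a given ndb version should be checked against its source):

```python
class StringProperty(object):
    """Toy data descriptor imitating how an ndb property stores values."""

    def __init__(self, verbose_name=None):
        self._verbose_name = verbose_name
        self._name = None

    def __set_name__(self, owner, name):
        self._name = name  # remember the attribute name ("prop")

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self  # class access -> the property object itself
        return obj.__dict__.get(self._name)  # instance access -> the value

    def __set__(self, obj, value):
        obj.__dict__[self._name] = value

class A(object):
    prop = StringProperty(verbose_name="Something")

m = A()
m.prop = "a string value"
print(m.prop)                # a string value
print(A.prop._verbose_name)  # Something
```

So reading the attribute off the class rather than the instance gives the metadata without reaching into `_properties`.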

HAS ANCESTOR and HAS DESCENDANT clauses in google cloud datastore

Submitted by こ雲淡風輕ζ on 2019-12-06 02:10:52
I'm studying the Google Cloud Datastore GQL grammar, specifically the HAS ANCESTOR and HAS DESCENDANT comparison operators. Given the following Person entities:

- Amy
- Fred, parent = Amy
- Laura, parent = Amy
- Paul
- Agnes
- ...

would the GQL queries below produce the same output?

```sql
SELECT * FROM Person WHERE key_name = 'Fred' HAS ANCESTOR KEY('Person', 'Amy')

SELECT * FROM Person WHERE KEY('Person', 'Amy') HAS DESCENDANT key_name = 'Fred'
```

If so, I don't understand why the HAS DESCENDANT clause exists. Thanks in advance!

Alfred Fuller: These two GQL queries should produce identical results: SELECT * FROM
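Both operators test the same relationship read from opposite ends: a key's ancestors are the prefixes of its path. The sketch below models keys as tuples of (kind, name) pairs to show this; it mirrors the datastore's key model rather than using any datastore API, and follows the convention that an ancestor query also matches the ancestor key itself:

```python
def has_ancestor(key, ancestor):
    """A key's ancestors are the prefixes of its path.

    Datastore ancestor queries also match the ancestor key itself,
    so a plain (not strictly proper) prefix test is used here.
    """
    return key[:len(ancestor)] == ancestor

def has_descendant(ancestor, key):
    # HAS DESCENDANT is the same relationship with the operands swapped.
    return has_ancestor(key, ancestor)

amy = (("Person", "Amy"),)
fred = (("Person", "Amy"), ("Person", "Fred"))
paul = (("Person", "Paul"),)

print(has_ancestor(fred, amy))    # True
print(has_descendant(amy, fred))  # True: the same test, roles swapped
print(has_ancestor(paul, amy))    # False
```

This is why the two queries in the question should return the same rows: they evaluate the same prefix test with the operands reversed.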

How do you implement cascading delete in Objectify?

Submitted by 我们两清 on 2019-12-06 01:29:54
Question: I have the following hierarchy: GrandParent --> Parent --> Child. Parent and Child use @Parent Ref<GrandParent> and @Parent Ref<Parent> to create their parent relationships. I am trying to come up with a good way to do a cascading delete for GrandParent. Of course I could load all the children, generate keys from them, and delete by key, but this seems terribly inefficient. Is there something where I could query by parent and turn the query results into a list of keys without having to do the full
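The usual answer is a keys-only ancestor query: fetch just the keys under the GrandParent and delete those, never loading the entity bodies. The plain-Python sketch below models keys as path tuples to show the shape of that cascade; it is not Objectify code, and the helper names are invented for illustration:

```python
def cascading_delete(store, root):
    """Delete `root` and everything whose key path starts with it."""
    doomed = [k for k in store if k[:len(root)] == root]  # keys-only "query"
    for k in doomed:
        del store[k]
    return len(doomed)

store = {
    (("GrandParent", 1),): "gp",
    (("GrandParent", 1), ("Parent", 1)): "p",
    (("GrandParent", 1), ("Parent", 1), ("Child", 1)): "c",
    (("GrandParent", 2),): "other gp",
}
print(cascading_delete(store, (("GrandParent", 1),)))  # 3
print(len(store))  # 1
```

Because only keys cross the wire, the cost is one ancestor query plus one batch delete, regardless of how large the entities themselves are.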