google-cloud-datastore

How do I perform a large query in Google App Engine?

风流意气都作罢 · Submitted on 2019-12-13 02:39:16
Question: I have a collection of apps totaling almost 1 million users. I am now adding a push notification system using Google Cloud Messaging (GCM) to create alerts. My database contains an entity with the GCM ID and app name (e.g. "myApp1"). Now I want to send a GCM message to all users of "myApp1". The Objectify documentation does not describe the .limit() function well, though. For example, from the GCM demo app:

    List<RegistrationRecord> records = ofy().load().type(RegistrationRecord.class).limit(10).list();

will
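Independently of how the records are loaded, GCM's multicast endpoint accepts at most 1,000 registration IDs per request, so a million-user send has to be batched. The sketch below shows only the batching step in plain Python; the `token-N` IDs are made-up placeholders and the actual send call is omitted.

```python
# Sketch: split a large list of GCM registration IDs into batches of at most
# 1,000 (the multicast limit). Only the chunking logic is shown; the IDs
# below are hypothetical placeholders, not real tokens.

def chunk(ids, size=1000):
    """Yield successive batches of at most `size` registration IDs."""
    for start in range(0, len(ids), size):
        yield ids[start:start + size]

# Example: 2,500 hypothetical registration IDs -> batches of 1000, 1000, 500.
reg_ids = [f"token-{i}" for i in range(2500)]
batches = list(chunk(reg_ids))
print([len(b) for b in batches])  # [1000, 1000, 500]
```

Each batch would then be passed to one multicast send; the same chunk size also works as a `.limit()` value when paging the Objectify query itself.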

JDO unique fields in Google App Engine

孤街浪徒 · Submitted on 2019-12-13 02:27:21
Question: According to this, Google App Engine's JDO implementation does not support the JDO @Unique annotation. Is this still the case? For example, I have this class:

    @PersistenceCapable
    public class User {
        @PrimaryKey
        @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
        private Key key;
        @Persistent
        private String email;
        @Persistent
        private String sessionToken;
        ...
    }

Obviously the key is unique, but I also want email and sessionToken to be unique. If @Unique is not supported, what's the best way to
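A common Datastore workaround for the missing @Unique support is to store a marker entity whose key *name* is the unique value, and insert it transactionally only if absent; key names are unique by construction. The sketch below models this in plain Python, with a dict standing in for the Datastore and `claim_unique()` as a hypothetical helper, not a real API.

```python
# Sketch of the key-name uniqueness pattern: the unique value (e.g. an email)
# becomes the key name of a marker entity, so two users can never claim the
# same one. The dict is an in-memory stand-in for a Datastore kind such as
# "UniqueEmail"; in real code the check-and-insert runs in one transaction.

_store = {}  # key name -> owner

def claim_unique(value, owner):
    """Atomically claim `value`; return False if it is already taken."""
    if value in _store:          # real code: transactional get-then-put
        return False
    _store[value] = owner
    return True

print(claim_unique("alice@example.com", "user-1"))  # True: first claim wins
print(claim_unique("alice@example.com", "user-2"))  # False: already taken
```

The same pattern covers sessionToken: one marker kind per unique field, claimed in the same transaction that writes the User entity.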

GCP - get all entities from the Datastore

≡放荡痞女 · Submitted on 2019-12-13 02:27:09
Question: I am trying to get all data entities from the Datastore. Going through the Google docs, I found something similar: Query Projections (link to the docs). This is the code I used to get all entities from the Datastore:

    def do_the_query_projection(self, kind_name):
        query = self.client.query(kind=kind_name)
        query.projection = ['attr_1', 'attr_2', 'attr_3']
        # create a list to store f, m, r
        f, m, r = [], [], []
        for task in query.fetch():
            f.append(task['attr_1'])
            m.append(task['attr_2'])
            r.append(task['attr
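Note that a projection query only returns the listed (indexed) properties; to get *all* entities with all properties, a plain `query.fetch()` iterated to exhaustion is usually enough, since the client pages through results automatically. The sketch below shows explicit cursor-style paging against a stand-in `fetch_page()` function, since no Datastore is available here; with the real client this would be `query.fetch(start_cursor=cursor, limit=page_size)`.

```python
# Sketch of explicit page-by-page fetching. fetch_page() is an illustrative
# stand-in that returns (entities, next_cursor); DATA stands in for the
# entities of one kind.

DATA = [{'attr_1': i} for i in range(7)]

def fetch_page(cursor, page_size):
    page = DATA[cursor:cursor + page_size]
    end = cursor + len(page)
    return page, (end if end < len(DATA) else None)

def fetch_all(page_size=3):
    results, cursor = [], 0
    while cursor is not None:
        page, cursor = fetch_page(cursor, page_size)
        results.extend(page)
    return results

print(len(fetch_all()))  # 7 entities, fetched across 3 pages
```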

Creating indexes on existing entity properties

你说的曾经没有我的故事 · Submitted on 2019-12-13 02:10:42
Question: When I started my project, I thought there was no need to create indexes on certain fields of my entities, but to generate certain daily reports and statistics we now need indexes on some fields of existing entities. As explained in the post Retroactive indexing in GAE Datastore, the only way is to first change these properties from unindexed to indexed, then retrieve and write all the entities again. My question is: if I take a backup from Datastore Admin and restore after changing

How to structure Google Datastore (App Engine) web traffic model?

瘦欲@ · Submitted on 2019-12-13 00:51:13
Question: Simple task: keep track of web traffic (hits) so that I can graph the number of hits per day for the last 30 days. Current Datastore model (2 fields): 1) Website ID, 2) Timestamp of hit. Problem: I'm using Google App Engine's Datastore, so I don't have the ability to do a GROUP BY or COUNT. Can anyone offer a simple way to structure my Google Datastore database to achieve this task? Returning all of the hits and then grouping them in my code seems like a performance hog. Any ideas? Answer 1: I
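The usual Datastore answer is to pre-aggregate at write time: instead of grouping raw hit entities when rendering the graph, keep one counter per (website ID, day) and increment it on each hit. The sketch below models that bucketing in plain Python; in Datastore the bucket key would name a counter entity (typically sharded to absorb write contention), and `Counter` is only an in-memory stand-in.

```python
# Sketch of per-day pre-aggregation: bucket each hit under a
# "website_id:YYYY-MM-DD" key and increment a counter, so the 30-day graph
# is 30 cheap key lookups instead of a scan over every hit entity.
from collections import Counter
from datetime import datetime

daily_hits = Counter()  # stand-in for per-day counter entities

def record_hit(website_id, timestamp):
    day_key = f"{website_id}:{timestamp:%Y-%m-%d}"
    daily_hits[day_key] += 1  # real code: transactional / sharded increment

record_hit("site-1", datetime(2019, 12, 12, 23, 59))
record_hit("site-1", datetime(2019, 12, 13, 0, 1))
record_hit("site-1", datetime(2019, 12, 13, 8, 30))
print(daily_hits["site-1:2019-12-13"])  # 2
```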

Move data from Google Cloud-SQL to Cloud Datastore

走远了吗. · Submitted on 2019-12-12 23:20:10
Question: I am trying to move my data from Cloud SQL to Cloud Datastore. There are a bit under 5 million entries in the SQL database. It seems I can only move about 100,000 entities per day before I get a quota error. I can't figure out which exact quota I'm exceeding, but I do use exponential backoff to make sure I'm not sending too fast. Eventually the backoff hits 5 minutes and the connection to the SQL server dies, but I don't think the writes-per-second quota is the problem. And I don't see any
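For reference, a capped exponential backoff schedule reaches the 5-minute wait described above after about ten attempts. The sketch below only computes that schedule (it does not sleep or talk to any API); the parameter values are illustrative.

```python
# Sketch of capped exponential backoff: delays double from base_delay up to
# max_delay. At a 300 s cap, doubling from 1 s hits the 5-minute ceiling by
# the tenth attempt, matching the wait described in the question.

def backoff_delays(base_delay=1.0, max_delay=300.0, attempts=10):
    """Return the capped delay (in seconds) before each retry attempt."""
    return [min(base_delay * (2 ** attempt), max_delay)
            for attempt in range(attempts)]

# In real retry code a random jitter is usually added to each delay so that
# parallel workers do not retry in lockstep.
print(backoff_delays())  # [1.0, 2.0, 4.0, ..., 256.0, 300.0]
```

Once the backoff plateaus at the cap while errors persist, the limit being hit is almost certainly a daily quota rather than a rate quota, which matches the ~100,000-entities-per-day ceiling observed.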

Error “Non-repeated field already set.” when loading from Datastore into BigQuery

主宰稳场 · Submitted on 2019-12-12 22:14:03
Question: [EDIT 20160426: This bug appears to have been fixed!] [EDIT 20160219: Updated this question again to reflect different error messages. See also the bug report I filed.] We have a Datastore table that contains a field category, of type Category, which is a custom class. The problem arises when we try to load this table into BigQuery (from a Datastore backup). The resulting table should contain (simplified): category.subfield1, category.subfield2, category.subfield3.subsubfield1

NDB using Users API to form an entity group

你离开我真会死。 · Submitted on 2019-12-12 22:08:40
Question: I'm trying to wrap my head around what seems to be a very simple use case, but I seem to be failing miserably. The goal of the exercise is to look up a set of records for the user who logs in with their Google Accounts username, within the High Replication Datastore, with strong consistency. My data looks like this:

    class Account(ndb.Model):
        owner = ndb.UserProperty()
        name = ndb.StringProperty()

    class Content(ndb.Model):
        content = ndb.StringProperty()

When I first create the account, I just
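The standard fix for strong consistency in the HR Datastore is to put each user's Content entities in one entity group, i.e. give them an ancestor key derived from the user, and then use ancestor queries, which are strongly consistent. The sketch below models keys as tuples and the Datastore as a list, since NDB is not runnable here; only the entity-group idea is shown.

```python
# Sketch of the entity-group pattern: each Content entity's key path starts
# with its Account ancestor, so an ancestor query returns exactly that
# user's records with strong consistency. Keys are modeled as tuples.

entities = []  # stand-in for stored Content entities: (key_path, payload)

def put_content(user_id, content_id, text):
    key_path = (("Account", user_id), ("Content", content_id))
    entities.append((key_path, text))

def ancestor_query(user_id):
    ancestor = ("Account", user_id)
    return [text for path, text in entities if path[0] == ancestor]

put_content("alice", 1, "first post")
put_content("alice", 2, "second post")
put_content("bob", 1, "other user")
print(ancestor_query("alice"))  # only alice's two entries
```

In NDB this corresponds to creating Content with `parent=account_key` and querying with `Content.query(ancestor=account_key)`; the trade-off is the ~1 write/second limit per entity group.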

Update one property of an entity in google cloud datastore python

最后都变了- · Submitted on 2019-12-12 21:25:38
Question: How do I update only one property of an entity in the Google Cloud Datastore without removing all the other properties?

    key = client.key('employee', ID)
    employee_to_deactivate = datastore.Entity(key)
    employee_to_deactivate.update({
        'active': False,
    })

This updates the active property to False, but removes all the other properties. Answer 1: You cannot update specific properties of an entity. All writes (inserts, updates) must include all properties that should be persisted. Whenever you need to do an
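The snippet above loses data because it writes a *fresh* entity containing only `active`. The answer implies a read-modify-write: fetch the existing entity, change the one property, and put the whole thing back. The sketch below uses a dict as a stand-in for the stored entity; with the real client the three steps would be `client.get(key)`, mutate, `client.put(entity)`.

```python
# Sketch of read-modify-write: load the full entity first so every other
# property survives the put. The `store` dict stands in for the Datastore.

store = {("employee", 42): {"name": "Ada", "role": "engineer", "active": True}}

def deactivate(key):
    entity = dict(store[key])   # 1. read the full entity
    entity["active"] = False    # 2. change only the property of interest
    store[key] = entity         # 3. write everything back together
    return entity

print(deactivate(("employee", 42)))  # name and role are preserved
```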

Google Data Studio connect to cloud datastore

*爱你&永不变心* · Submitted on 2019-12-12 19:09:32
Question: Is it possible to connect Cloud Datastore to Data Studio? I can only find the Cloud SQL and BigQuery connectors. Answer 1: Not directly, but the quickest workaround is to use this feature: https://cloud.google.com/bigquery/loading-data-cloud-datastore — load your Datastore backup into BigQuery, then connect that BigQuery dataset to Data Studio. Source: https://stackoverflow.com/questions/46110175/google-data-studio-connect-to-cloud-datastore