google-cloud-datastore

JPA join criteria with datastore in google app engine

Submitted by 大城市里の小女人 on 2020-01-15 11:05:31
Question: Suppose I have 2 JPA classes which model 2 entities in the datastore (Google App Engine), like these: @Entity public class Clazz { @Id @GeneratedValue(strategy = GenerationType.IDENTITY) private Key classKey; @Basic private String classId; @Basic private String className; @ManyToOne private Subject subject; } @Entity public class Subject { @Id @GeneratedValue(strategy = GenerationType.IDENTITY) private Key subjectKey; @Basic private String subjectId; @Basic private String subjectName; @OneToMany …
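
The datastore cannot execute joins, so a JPA criteria or JPQL join between Clazz and Subject fails server-side. A common workaround, sketched below, is to store the related entity's Key as a plain property and issue two key-based queries; the subjectKey field on Clazz and the DAO wrapper here are assumptions for illustration, not the asker's code:

```java
import com.google.appengine.api.datastore.Key;
import java.util.Collections;
import java.util.List;
import javax.persistence.EntityManager;

public class ClazzDao {
    private final EntityManager em;

    public ClazzDao(EntityManager em) {
        this.em = em;
    }

    /** Fetch the Subject, then the classes that reference its key; no join needed. */
    public List<Clazz> classesForSubject(Key subjectKey) {
        Subject subject = em.find(Subject.class, subjectKey);    // query 1: lookup by key
        if (subject == null) {
            return Collections.emptyList();
        }
        return em.createQuery(                                   // query 2: equality filter on a Key property
                "SELECT c FROM Clazz c WHERE c.subjectKey = :k", Clazz.class)
                .setParameter("k", subjectKey)
                .getResultList();
    }
}
```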

manual serialization / deserialization of AppEngine Datastore objects

Submitted by 泄露秘密 on 2020-01-15 10:17:14
Question: Is it possible to manually define the serialization logic used for the AppEngine Datastore? I assume Google uses reflection to do this generically. That works, but it proves quite slow. I'd be willing to write (and maintain) a fair amount of code to speed up the serialization/deserialization of datastore objects (I have large objects, and this consumes a significant share of the time). Answer 1: The datastore uses Protocol Buffers internally, and there is no way around it, as it's the …
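
Since the wire format is fixed, the usual way to cut the reflection cost is to bypass per-property mapping entirely: serialize the large object yourself and store the bytes in a single unindexed Blob property. A minimal sketch with the low-level Java API follows; the "Holder" and "payload" names are illustrative, and plain Java serialization stands in for whatever faster codec you would maintain:

```java
import com.google.appengine.api.datastore.*;
import java.io.*;

public class BlobStore {
    private final DatastoreService ds = DatastoreServiceFactory.getDatastoreService();

    public void save(String id, Serializable value) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(value);                    // your own codec goes here
        }
        Entity holder = new Entity("Holder", id);
        // One opaque value: the datastore never walks the object's fields.
        holder.setUnindexedProperty("payload", new Blob(bos.toByteArray()));
        ds.put(holder);
    }

    public Object load(String id) throws Exception {
        Entity holder = ds.get(KeyFactory.createKey("Holder", id));
        Blob blob = (Blob) holder.getProperty("payload");
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(blob.getBytes()))) {
            return in.readObject();
        }
    }
}
```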

Dataflow + Datastore = DatastoreException: I/O error

Submitted by 巧了我就是萌 on 2020-01-15 09:28:46
Question: I'm trying to write to Datastore from Dataflow using com.google.cloud.datastore. My code looks like this (inspired by the examples in [1]): public void processElement(ProcessContext c) { LocalDatastoreHelper HELPER = LocalDatastoreHelper.create(1.0); Datastore datastore = HELPER.options().toBuilder().namespace("ghijklmnop").build().service(); Key taskKey = datastore.newKeyFactory() .ancestors(PathElement.of("TaskList", "default")) .kind("Task") .newKey("sampleTask"); Entity task = Entity …
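
LocalDatastoreHelper starts a local test emulator, so creating it inside processElement (once per element, pointing workers at an emulator they cannot reach) is the likely cause of the I/O errors. Below is a sketch of the usual fix: build one real client per DoFn from the default options. Method names mirror the client version the question appears to use and are an assumption on my part:

```java
import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.datastore.Datastore;
import com.google.cloud.datastore.DatastoreOptions;
import com.google.cloud.datastore.Entity;
import com.google.cloud.datastore.Key;
import com.google.cloud.datastore.PathElement;

public class WriteTaskFn extends DoFn<String, Void> {
    private transient Datastore datastore;

    @Override
    public void startBundle(Context c) {
        // Build the real client once per bundle; project and credentials
        // come from the worker environment, not from a local emulator.
        datastore = DatastoreOptions.defaultInstance().toBuilder()
                .namespace("ghijklmnop")
                .build()
                .service();
    }

    @Override
    public void processElement(ProcessContext c) {
        Key taskKey = datastore.newKeyFactory()
                .ancestors(PathElement.of("TaskList", "default"))
                .kind("Task")
                .newKey(c.element());                 // one key per element
        datastore.put(Entity.builder(taskKey)
                .set("done", false)
                .build());
    }
}
```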

Google Datastore - What happens when you exceed the one write per second limit?

Submitted by 时光总嘲笑我的痴心妄想 on 2020-01-15 09:15:40
Question: I'm trying to create about 100,000 new entities (representing users) that share the same parent. I read that there is a limit of one entity write per second per entity group. I thought the request might time out, so I decided to use a push queue task to extend my time limit to ten minutes. I tried calling put() in a for loop inside the push queue task, but I still timed out (I only managed to write about 8,900 entities). I'm confused as to why I didn't get an error, since I tried to do multiple …
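
Two things usually matter here: batch the writes (one put() RPC can carry up to 500 entities) and reconsider the single shared parent, because batching does not lift the roughly one-write-per-second limit on an entity group. A sketch with the low-level Java API; the question itself is Python, so treat this as an illustration of the pattern rather than the asker's code:

```java
import com.google.appengine.api.datastore.*;
import java.util.ArrayList;
import java.util.List;

public class BulkCreate {
    public static void createUsers(int count) {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        List<Entity> batch = new ArrayList<>();
        for (int i = 0; i < count; i++) {
            Entity user = new Entity("User");   // root entity: no shared entity group to throttle
            user.setProperty("index", i);
            batch.add(user);
            if (batch.size() == 500) {          // datastore maximum per batch put()
                ds.put(batch);                  // one RPC instead of 500
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {
            ds.put(batch);
        }
    }
}
```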

updating an entity's property without retrieving the entity from the NDB

Submitted by 我只是一个虾纸丫 on 2020-01-15 04:45:07
Question: I would like to update one property of an entity that has many properties. If I understand correctly, whenever I retrieve the entity from the datastore with entity = key_of_entity.get() in order to later update one of its properties (entity.some_property += 1; entity.put()), am I charged for a read of every property of that entity? Since this entity has quite a few properties, reading it over and over again can get expensive. Is there any way to update an entity's property without having to do a read on …
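
Datastore reads are billed per entity fetched rather than per property, but a wide entity is still slow to load and must be rewritten whole. A common workaround is to move the hot, frequently-updated field into its own tiny entity that shares the big entity's identifier. The sketch below uses the low-level Java API; the "Stats" and "views" names are illustrative:

```java
import com.google.appengine.api.datastore.*;

public class Counters {
    public static void increment(String ownerId) {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        Key statsKey = KeyFactory.createKey("Stats", ownerId);
        Entity stats;
        try {
            stats = ds.get(statsKey);           // tiny entity: cheap and fast to read
        } catch (EntityNotFoundException e) {
            stats = new Entity(statsKey);       // first increment creates it
            stats.setProperty("views", 0L);
        }
        stats.setProperty("views", (Long) stats.getProperty("views") + 1);
        ds.put(stats);                          // the big entity is never touched
        // For concurrent increments, wrap the get/put in a transaction.
    }
}
```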

Updating in datastore not working in GAE 1.9.0

Submitted by 只谈情不闲聊 on 2020-01-14 10:42:07
Question: We have a PHP application running on GAE. It connects to Cloud Datastore using the Google PHP library (v0.6.7). In the last few days Google introduced a new version of App Engine, 1.9.0 (not officially released), which apparently was running fine, just as 1.8.9 was. However, we have been experiencing some issues related to Cloud Datastore. Sometimes, all operations that update entities are simply ignored. All the queries used to retrieve information work perfectly; however, if we want …

Google App Engine Datastore index cap

Submitted by 假如想象 on 2020-01-13 09:21:27
Question: Can somebody explain the 5,000-index cap in the Datastore in plain English? Does it mean that an indexed list property of a stored object cannot have more than 5,000 elements? Answer 1: The Datastore limits the number of index entries that a single entity can have; this limit is 5,000 entries per entity. You can test the limit easily in the Interactive shell with the following snippet: class Model(db.Model): x = db.ListProperty(int) entity = Model(x = range(5001)) entity.put() 'Too many …
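
The same experiment expressed through the low-level Java API, as a sketch (the exact exception text will differ from the Python error quoted above): a single indexed list property with more than 5,000 values pushes the entity past the per-entity index-entry cap, and the put() is rejected.

```java
import com.google.appengine.api.datastore.*;
import java.util.ArrayList;
import java.util.List;

public class IndexCapDemo {
    public static void main(String[] args) {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        Entity entity = new Entity("Model");
        List<Long> x = new ArrayList<>();
        for (long i = 0; i < 5001; i++) {
            x.add(i);                           // 5,001 values -> 5,001 index entries
        }
        entity.setProperty("x", x);             // setProperty() values are indexed
        ds.put(entity);                         // rejected: too many index entries
    }
}
```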

what's the practical difference between google datastore nosql and google bigquery sql?

Submitted by 主宰稳场 on 2020-01-13 03:04:30
Question: I want to know how to evaluate one tool against the other. My main concern is the following: In Google Datastore, we define a 'kind'. Each 'entity' has 'properties'. The datastore backend then uses those properties to index data for future queries. Querying itself follows almost the same ideas as SQL, though with different syntax, to filter data and find what we want. If you index every property, the index metadata can end up even bigger than the real data. Google BigQuery uses its own dialect of SQL, and it's fully …
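
The index-overhead point is controllable, though: only properties you actually filter or sort on need index rows. A sketch with the low-level Java API (the "Article" kind and its properties are illustrative) showing how an unindexed property stores data without adding any index metadata:

```java
import com.google.appengine.api.datastore.*;
import java.util.Date;

public class ArticleWriter {
    public static void save(String id, String body) {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        Entity article = new Entity("Article", id);
        article.setProperty("publishedAt", new Date());        // indexed: can filter and sort on it
        article.setUnindexedProperty("body", new Text(body));  // stored only: no index rows written
        ds.put(article);
    }
}
```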