google-app-engine

manual serialization / deserialization of AppEngine Datastore objects

Posted by 笑着哭i on 2020-01-15 10:17:09
Question: Is it possible to manually define the serialization logic used for the AppEngine Datastore? I assume Google uses reflection to do this generically; that works, but it proves quite slow. I'd be willing to write (and maintain) a fair amount of code to speed up serialization / deserialization of Datastore objects (I have large objects, and this accounts for a sizable share of the time).

Answer 1: The datastore uses Protocol Buffers internally, and there is no way around that, as it's the …
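The answer is cut off above, but given that reflection-based mappers (JDO/JPA) are the slow part, one common workaround is to drop to the low-level Datastore API and write the property mapping by hand. A minimal sketch of that idea; the Person kind and its fields are illustrative, not taken from the question:

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.EntityNotFoundException;
import com.google.appengine.api.datastore.Key;

public class PersonMapper {
    // Hand-written mapping: no reflection, just direct property access.
    static Entity toEntity(String name, long age) {
        Entity e = new Entity("Person");
        e.setProperty("name", name);
        e.setProperty("age", age);
        return e;
    }

    public static void main(String[] args) throws EntityNotFoundException {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        Key key = ds.put(toEntity("Alice", 30L));           // serialize + store
        Entity stored = ds.get(key);                        // fetch
        String name = (String) stored.getProperty("name");  // deserialize by hand
        System.out.println(name);
    }
}

Hand-written mapping trades maintenance effort for speed, which is exactly the trade the question proposes to make.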

Streaming data to Google Cloud Storage from PubSub using Cloud Dataflow

Posted by …衆ロ難τιáo~ on 2020-01-15 10:09:07
Question: I am listening to data from Pub/Sub in a streaming Dataflow pipeline. I then need to upload it to storage, process it, and upload it to BigQuery. Here is my code:

public class BotPipline {
    public static void main(String[] args) {
        DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
        options.setRunner(BlockingDataflowPipelineRunner.class);
        options.setProject(MY_PROJECT);
        options.setStagingLocation(MY_STAGING_LOCATION);
        options.setStreaming(true);
        …
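The code above is truncated, and the usual stumbling block with this pipeline is that TextIO in the old Dataflow 1.x SDK (the BlockingDataflowPipelineRunner era) could not write unbounded collections at all. A sketch of the equivalent pipeline in the current Apache Beam SDK, where a streaming write to GCS works once the stream is windowed; the topic and bucket names are placeholders:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.gcp.pubsub.PubsubIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.joda.time.Duration;

public class BotPipelineSketch {
    public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        p.apply(PubsubIO.readStrings().fromTopic("projects/my-project/topics/my-topic"))
         // File-based sinks need bounded input, so window the unbounded stream first.
         .apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))))
         // Windowed writes require an explicit shard count.
         .apply(TextIO.write().to("gs://my-bucket/output").withWindowedWrites().withNumShards(1));
        p.run();
    }
}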

Google Datastore - What happens when you exceed the one write per second limit?

Posted by 时光总嘲笑我的痴心妄想 on 2020-01-15 09:15:40
Question: I'm trying to create about 100,000 new entities (representing users) that all have the same parent. I read that there is a limit of one entity write per second per entity group. I thought the request might time out, so I decided to use a Push Queue task to extend the time available to ten minutes. I tried calling put() in a for loop inside the task, but it still timed out (it only managed to write about 8,900 entities). I'm confused as to why I didn't get an error, since I tried to do multiple …
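The question is truncated, but two points are worth making concrete. Giving all 100,000 users one shared parent puts them in a single entity group, so the roughly one-write-per-second group limit throttles the loop no matter how long the task runs; and calling put() once per entity spends most of the ten minutes on RPC round trips. A sketch of batched writes using the Java low-level API (the question itself appears to be Python, so take this as the shape of the fix rather than the original code); the kind and parent names are made up:

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.Key;
import com.google.appengine.api.datastore.KeyFactory;
import java.util.ArrayList;
import java.util.List;

public class BatchWriter {
    public static void main(String[] args) {
        DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
        Key parent = KeyFactory.createKey("Group", "users"); // hypothetical shared parent
        List<Entity> batch = new ArrayList<Entity>();
        for (int i = 0; i < 100000; i++) {
            Entity user = new Entity("User", parent);
            user.setProperty("index", i);
            batch.add(user);
            if (batch.size() == 500) { // the Datastore caps batch puts at 500 entities
                ds.put(batch);         // one RPC for the whole batch
                batch.clear();
            }
        }
        if (!batch.isEmpty()) ds.put(batch);
    }
}

Batching gets you to 500 entities per RPC; dropping the shared parent removes the per-group write limit entirely, which is usually the real fix here.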

Appengine app deployment is not uploading some folders

Posted by 删除回忆录丶 on 2020-01-15 08:44:26
Question: I have an app deployed on AppEngine. When I test the app locally everything works fine; I have run composer install and the vendor folder exists. But when I view the source of the deployed app, I can see that some folders were not uploaded. This is my folder structure on the local drive: [screenshot]. I deploy using this command:

gcloud app deploy --promote --stop-previous-version app.yaml

The deployed structure looks like this: [screenshot]. As you can see, only the dialpad_research folder was uploaded. My app.yaml file looks like this:

runtime: php55
…

How do I query a single field in AppEngine using JDO

Posted by 江枫思渺然 on 2020-01-15 08:30:28
Question: I've got a Product POJO that looks like this:

@PersistenceCapable(identityType = IdentityType.APPLICATION)
public class Product extends AbstractModel {
    @Persistent
    private String name;
    @Persistent
    private Key homePage;
    @Persistent
    private Boolean featured;

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public Key getHomePage() { return homePage; }
    public void setHomePage(Key homePage) { this.homePage = homePage; }
    public boolean isFeatured() { …
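The question is cut off before the query itself, but filtering on a single persistent field with JDO on App Engine looks roughly like the following; how you obtain the PersistenceManager (usually a PMF singleton) is assumed to happen elsewhere:

import java.util.ArrayList;
import java.util.List;
import javax.jdo.PersistenceManager;
import javax.jdo.Query;

public class ProductQueries {
    // Fetch all products whose single 'featured' field is true.
    @SuppressWarnings("unchecked")
    public static List<Product> findFeatured(PersistenceManager pm) {
        Query q = pm.newQuery(Product.class);
        q.setFilter("featured == featuredParam");
        q.declareParameters("Boolean featuredParam");
        try {
            List<Product> results = (List<Product>) q.execute(Boolean.TRUE);
            return new ArrayList<Product>(results); // copy before closing the query
        } finally {
            q.closeAll();
        }
    }
}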

Google App Engine Payload Object

Posted by 余生颓废 on 2020-01-15 08:01:12
Question: How do I send a class object in the payload of a task in Python? I want to pass an object in the parameters of a task. When I use simplejson, I get the error "Object is not serializable"; when I use pickle, I get a KeyValue error. How do I do this? This is the class I want to serialize:

class Matrix2D_icfg:
    name = ""
    indices = []
    value = {}

    def __init__(self, s):
        self.name = s
        self.indices = []

    def __getitem__(self, i):
        self.indices.append(i)
        if len(self.indices) == 2:
            (m, n) = self.indices
            self…
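For comparison, the Java App Engine runtime (used elsewhere on this page) sidesteps manual serialization of task payloads entirely with DeferredTask: the payload is any Serializable object, and the SDK handles the encoding. A sketch; the class name and field are invented for illustration:

import com.google.appengine.api.taskqueue.DeferredTask;
import com.google.appengine.api.taskqueue.QueueFactory;
import com.google.appengine.api.taskqueue.TaskOptions;

public class MatrixTask implements DeferredTask {
    private final String name; // fields only need to be Serializable

    public MatrixTask(String name) {
        this.name = name;
    }

    @Override
    public void run() {
        // Executes later, on the task queue, with the object's state restored.
        System.out.println("processing " + name);
    }

    public static void enqueue() {
        QueueFactory.getDefaultQueue().add(
                TaskOptions.Builder.withPayload(new MatrixTask("icfg")));
    }
}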

How to find the closest point in the DB using objectify-appengine

Posted by 别来无恙 on 2020-01-15 07:21:01
Question: I'm using objectify-appengine in my app. In the DB I store the latitude & longitude of places. At some point I'd like to find the closest place (from the DB) to a specific point. As far as I understand, I can't perform regular SQL-like queries. So my question is: what is the best way to do this?

Answer 1: You should take a look at GeoModel, which enables geospatial queries on Google App Engine. Update: Let's assume that you have, in your Objectify-annotated model class, a GeoPt property called …
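The answer is truncated just as it starts describing the GeoPt property, but the idea behind GeoModel is worth sketching: the Datastore has no native geo queries, so you narrow candidates with ordinary index filters (GeoModel does this with geohash cells) and then compute real distances in memory. A much simpler version of that narrow-then-refine pattern in plain Objectify, filtering on a latitude band; the Place entity and the 0.5-degree band are invented for illustration:

import static com.googlecode.objectify.ObjectifyService.ofy;

import com.googlecode.objectify.annotation.Entity;
import com.googlecode.objectify.annotation.Id;
import com.googlecode.objectify.annotation.Index;
import java.util.List;

public class NearestPlace {

    @Entity
    static class Place { // assumes ObjectifyService.register(Place.class) at startup
        @Id Long id;
        @Index double latitude; // indexed so we can range-filter on it
        double longitude;
    }

    // Haversine great-circle distance in kilometers.
    static double distanceKm(double lat1, double lng1, double lat2, double lng2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLng = Math.toRadians(lng2 - lng1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                   * Math.sin(dLng / 2) * Math.sin(dLng / 2);
        return 6371 * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }

    static Place closest(double lat, double lng) {
        // The Datastore permits inequality filters on only one property,
        // so narrow candidates by a latitude band, then refine in memory.
        List<Place> candidates = ofy().load().type(Place.class)
                .filter("latitude >=", lat - 0.5)
                .filter("latitude <=", lat + 0.5)
                .list();
        Place best = null;
        double bestDist = Double.MAX_VALUE;
        for (Place p : candidates) {
            double d = distanceKm(lat, lng, p.latitude, p.longitude);
            if (d < bestDist) { bestDist = d; best = p; }
        }
        return best; // null if no candidate fell inside the band
    }
}

Picking the band width well is the hard part, and it is precisely what GeoModel's geohashing automates.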
