google-cloud-pubsub

Delay message processing and delete before processing

Submitted by 偶尔善良 on 2019-12-04 01:40:52
I need to send push notifications for an action in a mobile app, but wait up to, say, 10 seconds in case the user undoes the action. Is it possible to delay the processing of a message published to a topic by 10 seconds? And then (sometimes, if the user does undo) delete the message before the 10 seconds elapse, so it is never processed?

It depends on whether you also write the subscribers. If you have control over your subscriber code: in your Pub/Sub messages, add a timestamp for when you want the message to be processed. In your clients (subscribers), have logic to acknowledge the message…
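The timestamp approach described above can be sketched as follows. A plain dict stands in for the google-cloud-pubsub Message object so the logic is self-contained; `cancelled_actions` stands in for a real shared store of undone action IDs, and the attribute names `process_after` and `action_id` are illustrative choices, not part of the Pub/Sub API.

```python
import time

cancelled_actions = set()  # stand-in for a shared store of undone action IDs
notifications_sent = []    # stand-in for the real push-notification call

def handle(message):
    """Decide what to do with one delayed-notification message.

    Returns "nack" (redeliver later) or "ack" (done), mirroring the
    Message.ack()/nack() calls a real subscriber callback would make.
    """
    process_after = float(message["attributes"].get("process_after", "0"))
    if time.time() < process_after:
        return "nack"          # too early: let Pub/Sub redeliver it later
    action_id = message["attributes"].get("action_id")
    if action_id in cancelled_actions:
        return "ack"           # user undid the action: drop it silently
    notifications_sent.append(message["data"])  # send the notification
    return "ack"
```

Note that nacking only delays redelivery by roughly the subscription's retry behavior, so the 10-second window is approximate rather than exact.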

Google Dataflow: running dynamic query with BigQuery+Pub/Sub in Python

Submitted by 一世执手 on 2019-12-03 21:51:46
What I would like to do in the pipeline:

Read from Pub/Sub (done)
Transform the data into a dictionary (done)
Take the value of a specified key from the dict (done)
Run a parameterized/dynamic query against BigQuery, where the WHERE clause should look like this: SELECT field1 FROM Table WHERE field2 = @valueFromP/S

The pipeline:

| 'Read from PubSub' >> beam.io.ReadFromPubSub(subscription='')
| 'String to dictionary' >> beam.Map(lambda s: data_ingestion.parse_method(s))
| 'BigQuery' >> <here is where I'm not sure how to do it>

The normal way to read from BQ would be: | 'Read' >> beam.io.Read(beam…
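One way to fill in the missing step (a sketch, not taken from the original post) is a per-element BigQuery lookup inside a `beam.DoFn`. The client is injected here so the lookup logic can run without a live connection; `DynamicQueryFn`, the table, and the field names are all illustrative, and the commented lines show the real google-cloud-bigquery parameterized-query calls this stands in for.

```python
# Sketch: run a parameterized BigQuery query for each Pub/Sub element.
# In a real pipeline this logic would live in a beam.DoFn's process()
# method, with the BigQuery client created once in setup().

class DynamicQueryFn:
    QUERY = "SELECT field1 FROM `project.dataset.Table` WHERE field2 = @value"

    def __init__(self, client):
        self.client = client  # google.cloud.bigquery.Client in production

    def process(self, element):
        # element is the dict parsed from the Pub/Sub message
        value = element["field2"]
        # With the real client this would be:
        #   job_config = bigquery.QueryJobConfig(query_parameters=[
        #       bigquery.ScalarQueryParameter("value", "STRING", value)])
        #   rows = self.client.query(self.QUERY, job_config=job_config).result()
        for row in self.client.query(self.QUERY, {"value": value}):
            yield row
```

Be aware that issuing one BigQuery query per message is expensive at high throughput; batching elements before querying is usually worth considering.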

Authenticating PubSub Push messages in AppEngine

Submitted by ∥☆過路亽.° on 2019-12-03 21:32:40
Is there a way to know for sure that a message received by App Engine is from the Google Pub/Sub service? Currently the Pub/Sub service gets a 302 on the URLs configured as "login: admin" in the App Engine app.yaml, so it keeps retrying. I would have expected this to behave like Tasks in App Engine and automatically authenticate to "login: admin" URLs.

The FAQ recommends that when setting up your Pub/Sub push subscription you include a shared secret token as a request parameter, which you then check for in your handler. If you additionally want to verify that the messages originated from Google Cloud…
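The shared-secret check suggested above can be sketched like this, assuming the push endpoint was registered with a `?token=...` query parameter when the subscription was created; the function and variable names are illustrative.

```python
import hmac

# Illustrative value: in practice this is a long random secret that you
# chose when creating the push subscription URL, stored in config.
EXPECTED_TOKEN = "replace-with-a-long-random-secret"

def is_authentic_push(query_params):
    """Check the shared-secret token on an incoming push request.

    query_params: dict of the request's query parameters.
    Uses a constant-time comparison to avoid timing side channels.
    """
    token = query_params.get("token", "")
    return hmac.compare_digest(token, EXPECTED_TOKEN)
```

The handler should return a 4xx status when the check fails so Pub/Sub does not treat the delivery as successful.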

How to create a Dataflow pipeline from Pub/Sub to GCS in Python

Submitted by 谁说胖子不能爱 on 2019-12-03 18:08:06
Question: I want to use Dataflow to move data from Pub/Sub to GCS. Basically, I want Dataflow to accumulate messages for a fixed amount of time (15 minutes, for example), then write that data as a text file into GCS when that time has passed. My final goal is to create a custom pipeline, so the "Pub/Sub to Cloud Storage" template is not enough for me, and I don't know Java at all, which made me start tweaking in Python. Here is what I have as of now (Apache Beam Python SDK 2.10.0):…
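The grouping Beam applies with `beam.WindowInto(window.FixedWindows(15 * 60))` can be illustrated in plain Python: every message timestamp maps to the start of its 15-minute window, and one output object is written per window. The helper names and the bucket/path format below are illustrative, not Beam or GCS APIs.

```python
WINDOW_SECONDS = 15 * 60  # 15-minute fixed windows

def window_start(epoch_seconds):
    """Start of the fixed 15-minute window containing this timestamp."""
    return epoch_seconds - (epoch_seconds % WINDOW_SECONDS)

def gcs_path_for(epoch_seconds, bucket="my-bucket"):
    """One object per window, e.g. gs://my-bucket/output-900.txt."""
    return "gs://%s/output-%d.txt" % (bucket, window_start(epoch_seconds))
```

In the real pipeline, the windowing transform does this assignment for you, and a write step (for example a `DoFn` that uses the GCS client, since the plain `WriteToText` sink is not designed for unbounded streams) emits one file per window pane.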

Unable to change location for firebase pubsub trigger

Submitted by ◇◆丶佛笑我妖孽 on 2019-12-02 17:38:21
Question: I have an App Engine app in asia-northeast1, but I could not specify the region for the Pub/Sub trigger.

This works: functions.region("asia-northeast1").https.onRequest(async (req, res)
This does not work: functions.region("asia-northeast1").pubsub.schedule('* 6-23 * * *')

It fails with the following error message: Error: HTTP Error: 400, Location must equal asia-northeast1 because the App Engine app that is associated with this project is located in asia-northeast1

Does anyone have any experience getting a Pub/Sub trigger to work in a region other than the default us-central1?

How to get the response delivered to Subscriber back to the Producer

Submitted by 冷暖自知 on 2019-12-02 09:52:17
I have implemented a model using Google Pub/Sub where the producer sends a message and the subscriber processes it and sends the response to a subscription. But how do I map the response back to the publisher that sent the request? Are there any filters that can be put on the subscription so that the response can be tracked, or is there another way of implementing this?

There is no way in Cloud Pub/Sub for the publisher to know that the subscriber processed the message. One of the main goals of the pub/sub paradigm is to decouple the publisher from the subscriber, and having this…
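A common workaround for request/reply over Pub/Sub (a pattern sketch, not a built-in feature) is for the publisher to attach a `correlation_id` attribute and the name of a reply topic; the subscriber publishes its result to that topic with the same `correlation_id`, and the publisher matches replies to pending requests. Dict-based stand-ins are used here instead of live Pub/Sub clients, and all names are illustrative.

```python
import uuid

pending = {}  # correlation_id -> original request (publisher side)

def make_request(payload, reply_topic="projects/p/topics/replies"):
    """Build a request message carrying a reply address and correlation id."""
    cid = str(uuid.uuid4())
    pending[cid] = payload
    return {"data": payload,
            "attributes": {"correlation_id": cid, "reply_to": reply_topic}}

def handle_reply(reply_message):
    """Publisher-side handler for messages arriving on the reply topic."""
    cid = reply_message["attributes"]["correlation_id"]
    request = pending.pop(cid, None)
    return request  # None means an unknown or duplicate correlation id
```

If strict request/reply semantics are central to the design, a point-to-point RPC mechanism is often a better fit than Pub/Sub.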

How scalable is Google Cloud Pub/Sub compared to Object Change Notifications

Submitted by 故事扮演 on 2019-12-02 08:44:31
As the title asks, how scalable is Google Cloud Pub/Sub compared to Object Change Notifications in Google Cloud Storage when using signed URLs to upload objects? How does each compare in terms of handling many objects uploaded in a short period of time? Will delivery be slower if many objects are uploaded quickly, for example 1000 objects/second? If neither is scalable, what other options are there? For my purposes, I need to upload an image, and then, when a notification is delivered to my Google App Engine app, make a write to my database. It is essential that the period…

Google Cloud Pub/Sub Push Messages - Empty POST

Submitted by 一个人想着一个人 on 2019-12-02 07:42:22
I have successfully set up a topic and subscription in the Google Cloud platform, and have verified my site with Google and added the domain to GCP. Whenever I send a test message from https://console.cloud.google.com/cloudpubsub/topics/subscription_sync, the endpoint I configured receives something, but the POST variable is empty. Here is the code I have so far in PHP; it just logs the POST variable (which later shows up in my logs as empty):

require_once 'EventPersister.class.php';
$eventPersister = new EventPersister(EventPersister::GOOGLE_WEBHOOKS);…
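The likely cause of the empty POST: Pub/Sub push delivers a JSON envelope in the raw request body (Content-Type: application/json), not form-encoded fields, so `$_POST` in PHP stays empty; the PHP-side fix is to read `json_decode(file_get_contents('php://input'), true)` instead. The same decoding can be sketched in Python; the function name is illustrative.

```python
import base64
import json

def parse_push_body(raw_body):
    """Decode a Pub/Sub push envelope and return (payload, attributes).

    The envelope looks like:
      {"message": {"data": "<base64>", "attributes": {...}},
       "subscription": "projects/.../subscriptions/..."}
    """
    envelope = json.loads(raw_body)
    message = envelope["message"]
    data = base64.b64decode(message.get("data", ""))  # payload is base64
    return data, message.get("attributes", {})
```

Note that the `data` field is base64-encoded, so decoding it is a second step after parsing the JSON.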

Listing the Pub/Sub subscription consumers

Submitted by 天大地大妈咪最大 on 2019-12-02 06:50:37
It is possible for a service to do long polling on a Pub/Sub subscription, which obviously requires a TCP connection to be constantly open between the Pub/Sub service and the client. Is there any way to find out whether a certain subscription has such a TCP connection open?

There is no way to list all consumers of a Pub/Sub subscription. This would be tough, as there are three different types of subscribers: push, pull, and streaming pull. Only the last one maintains an open connection to the server.

Source: https://stackoverflow.com/questions/50682481/listing-the-pub-sub-subscription-consumers