google-cloud-pubsub

Push into an endpoint URL (Pub/Sub API for Gmail)

ぐ巨炮叔叔 submitted on 2019-12-11 01:23:13
Question: I am working with Pub/Sub for the first time and it's quite confusing. I just want to receive push notifications on my MVC application whenever I receive an email on my Gmail account. I have set up the project ID (enabled the Pub/Sub API), created a topic with permissions (gmail-api-push@system.gserviceaccount.com), and added a subscriber to that topic, everything from console.cloud.google.com, as I don't think I need to set these up from my code every time. I am trying to set the delivery type to 'Push
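A minimal sketch of the Gmail side of this setup, assuming the push topic and subscription already exist in the Cloud Console and that an OAuth token with a Gmail scope has been saved to token.json; the project and topic names below are placeholders:

from google.oauth2.credentials import Credentials
from googleapiclient.discovery import build

# Previously authorized user credentials with a Gmail scope (token.json is assumed).
creds = Credentials.from_authorized_user_file(
    "token.json", scopes=["https://www.googleapis.com/auth/gmail.readonly"]
)
gmail = build("gmail", "v1", credentials=creds)

watch_request = {
    "topicName": "projects/my-project/topics/gmail-push",  # placeholder topic
    "labelIds": ["INBOX"],
}
# Gmail publishes a notification to the topic on every inbox change; Pub/Sub then
# pushes it to the subscription's HTTPS endpoint (the MVC application).
response = gmail.users().watch(userId="me", body=watch_request).execute()
print(response)  # contains historyId and expiration

The response includes a historyId and an expiration; the watch has to be renewed before it expires.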

How to fix AttributeError: 'module' object has no attribute 'Client' when running Python in Google Cloud Interactive Shell

你说的曾经没有我的故事 submitted on 2019-12-10 19:40:03
Question: I'm trying to run a Python script that simulates traffic sensors sending data in real time to Pub/Sub from my Google Cloud Shell. I'm getting this error:

Traceback (most recent call last):
  File "./send_sensor_data.py", line 87, in <module>
    psclient = pubsub.Client()
AttributeError: 'module' object has no attribute 'Client'

I tried running google.cloud.pubsub.__file__ and no duplicates exist. I've been searching everywhere and the popular consensus was to install the pubsub package into a virtual
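For context, a minimal sketch of publishing with the newer client surface, assuming google-cloud-pubsub 0.27 or later, where pubsub.Client() was replaced by separate publisher and subscriber clients; the project and topic names are placeholders:

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "traffic-sensors")

# publish() expects bytes and returns a future that resolves to the message ID.
future = publisher.publish(topic_path, b'{"sensor_id": 42, "speed": 61.5}')
print(future.result())

Older samples target the pre-0.27 API, so pinning an older library version is the other way out.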

Keep getting 'Error sending test message to Cloud PubSub…' with Google Cloud PubSub

試著忘記壹切 submitted on 2019-12-10 19:10:36
Question: I'm trying to set up Google's push Pub/Sub to my server to receive Gmail push notifications. I'm requesting the following scopes: https://mail.google.com/, https://www.googleapis.com/auth/cloud-platform, https://www.googleapis.com/auth/pubsub, https://www.googleapis.com/auth/gmail.modify, https://www.googleapis.com/auth/gmail.readonly. It works to create a topic, subscribe to that topic, and grant access to the Gmail API on that topic, but it fails when I'm trying to watch my inbox. I have followed this
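The usual cause of this error is that gmail-api-push@system.gserviceaccount.com does not have publish rights on the topic at the moment users.watch() runs. A sketch of granting that role with the Python client (the thread itself doesn't include this code); project and topic names are placeholders:

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "gmail-push")

# Add the Gmail push service account as a publisher on the topic.
policy = publisher.get_iam_policy(request={"resource": topic_path})
policy.bindings.add(
    role="roles/pubsub.publisher",
    members=["serviceAccount:gmail-api-push@system.gserviceaccount.com"],
)
publisher.set_iam_policy(request={"resource": topic_path, "policy": policy})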

Google Cloud Pub/Sub Retry Count

帅比萌擦擦* submitted on 2019-12-10 15:31:50
Question: We are moving from an unstable messaging queue service to Google's Pub/Sub in Node.js. It seems to work well, but we would like to include error handling. We would like to limit the number of retries for a particular message, say 10 times in our test environment and 100 times in production. Now if a message fails 10 times (in test), instead of it sitting in our queue and continuing to be processed and fail for 7 days, we would like to move it to a separate error queue and send ourselves an email. We
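The question concerns the Node.js client, but the pattern is language-neutral; here is a sketch in Python of one way to do it, tracking a retry counter in a message attribute and parking the message on an error topic once the limit is hit. Topic, subscription, and the handle() function are placeholders. Pub/Sub now also offers built-in dead-letter topics with a tracked delivery attempt, which may remove the need for this manual bookkeeping.

from google.cloud import pubsub_v1

MAX_RETRIES = 10
project = "my-project"
publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()
work_topic = publisher.topic_path(project, "work-queue")
error_topic = publisher.topic_path(project, "error-queue")
subscription = subscriber.subscription_path(project, "work-queue-sub")

def handle(data):
    # Placeholder for the real business logic.
    print(data)

def callback(message):
    retries = int(message.attributes.get("retries", "0"))
    try:
        handle(message.data)
        message.ack()
    except Exception:
        # Republish with an incremented counter and ack the original, since a plain
        # nack/redelivery would not carry the count. After MAX_RETRIES the message
        # goes to the error topic instead of back onto the work topic.
        target = error_topic if retries + 1 >= MAX_RETRIES else work_topic
        publisher.publish(target, message.data, retries=str(retries + 1))
        message.ack()

streaming_pull_future = subscriber.subscribe(subscription, callback=callback)
streaming_pull_future.result()  # keep the subscriber running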

How to stop a streaming pipeline in google cloud dataflow

非 Y 不嫁゛ submitted on 2019-12-10 15:16:51
Question: I have a streaming Dataflow job running that reads a Pub/Sub subscription. After a period of time, or maybe after processing a certain amount of data, I want the pipeline to stop by itself. I don't want my Compute Engine instances running indefinitely. When I cancel the job through the Dataflow console, it shows up as a failed job. Is there a way to achieve this? Am I missing something, or is that feature missing from the API? Answer 1: Could you do something like this? Pipeline pipeline = ...; ...
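The answer's snippet is Java; the analogous idea in the Beam Python SDK is a sketch like the following: run the streaming pipeline from the driver program, block for a bounded time, then cancel it programmatically. Runner flags, the subscription, and the wait time are placeholders.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # plus runner/project/region flags as needed
pipeline = beam.Pipeline(options=options)

(pipeline
 | "Read" >> beam.io.ReadFromPubSub(subscription="projects/my-project/subscriptions/my-sub")
 | "Process" >> beam.Map(print))  # placeholder transform

result = pipeline.run()
# Block for at most one hour (duration is in milliseconds), then stop the job.
result.wait_until_finish(duration=60 * 60 * 1000)
result.cancel()

Draining the job instead (for example with gcloud dataflow jobs drain JOB_ID) is the option to reach for when in-flight data should be flushed rather than discarded.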

Invalid push endpoint error during Google Pub/Sub subscription creation

心已入冬 submitted on 2019-12-10 14:18:45
Question: I've gone through all the prereqs on Google's site. Got and installed an SSL cert (from Let's Encrypt) on the server. Registered and verified the domain (yes, the https URL) in Google Search Console (like https://example.org). Added the domain to my API credentials' domain verification.

topic='projects/myproject/subscriptions/mytopic'
sub='projecs/myproject/subscription/mysub'
client.projects().topics().create(topic=topic, body={}).execute()
client.projects().subscriptions().create(name=sub,
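For comparison, a sketch (not the asker's code) of creating a push subscription with the google-cloud-pubsub client; the push endpoint must be an HTTPS URL on a verified domain, and the project, topic, subscription, and endpoint below are placeholders:

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()

topic_path = publisher.topic_path("my-project", "mytopic")
subscription_path = subscriber.subscription_path("my-project", "mysub")

publisher.create_topic(request={"name": topic_path})
subscriber.create_subscription(
    request={
        "name": subscription_path,
        "topic": topic_path,
        "push_config": {"push_endpoint": "https://example.org/pubsub/push"},
    }
)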

What's Google Cloud Pub/Sub latency?

烂漫一生 submitted on 2019-12-10 13:49:16
Question: I have an application that requires really low latency (a real-time game). Currently, in my solution, it takes less than 2 milliseconds for a message to route from the client front-end server to the destination server. Does anybody know how long it would take Google Cloud Pub/Sub to route a message from one server to another? Thank you! Answer 1: While Cloud Pub/Sub's end-to-end latency at the 99.9th percentile is sufficient for many applications--including some using it for real-time
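A rough way to get a number for your own setup, rather than relying on published percentiles, is to stamp the publish time in a message attribute and compare it with the wall clock on receipt. This sketch assumes the topic and subscription already exist and that the two ends share a synced clock; all names are placeholders.

import time
from google.cloud import pubsub_v1

project = "my-project"
publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()
topic = publisher.topic_path(project, "latency-test")
subscription = subscriber.subscription_path(project, "latency-test-sub")

def callback(message):
    sent = float(message.attributes["sent_at"])
    print(f"end-to-end latency: {(time.time() - sent) * 1000:.1f} ms")
    message.ack()

streaming_pull_future = subscriber.subscribe(subscription, callback=callback)

# Publish a handful of probes, one per second, while the subscriber listens.
for _ in range(10):
    publisher.publish(topic, b"ping", sent_at=str(time.time()))
    time.sleep(1)

streaming_pull_future.cancel()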

Beam / Dataflow Custom Python job - Cloud Storage to PubSub

走远了吗. submitted on 2019-12-10 10:25:43
Question: I need to perform a very simple transformation on some data (extract a string from JSON), then write it to Pub/Sub - I'm attempting to use a custom Python Dataflow job to do so. I've written a job which successfully writes back to Cloud Storage, but my attempts at even the simplest possible write to Pub/Sub (no transformation) result in an error:

JOB_MESSAGE_ERROR: Workflow failed. Causes: Expected custom source to have non-zero number of splits.

Has anyone successfully written to PubSub from
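A sketch of the shape such a pipeline could take with the Beam Python SDK, under some loud assumptions: WriteToPubSub only works in streaming pipelines and expects bytes payloads, and whether a bounded Cloud Storage source is allowed inside a streaming Dataflow job depends on the SDK and runner version. The bucket, topic, and JSON field names are placeholders.

import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # plus --runner/--project/--region as needed

with beam.Pipeline(options=options) as p:
    (p
     | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.json")
     | "ExtractField" >> beam.Map(lambda line: json.loads(line)["message"].encode("utf-8"))
     | "Write" >> beam.io.WriteToPubSub(topic="projects/my-project/topics/my-topic"))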

Subscriber.stopAsync() results in RejectedExecutionException

时光毁灭记忆、已成空白 submitted on 2019-12-08 17:40:36
My code is basically following the official tutorials, and the main purpose is to collect all messages from one subscription (Constants.UNFINISHEDSUBID) and republish them on another. But currently I'm facing a problem that I can't solve: in my implementation, calling subscriber.stopAsync() results in the following exception:

Mai 04, 2017 4:59:25 PM com.google.common.util.concurrent.AbstractFuture executeListener
SCHWERWIEGEND: RuntimeException while executing runnable com.google.common.util.concurrent.Futures$6@6e13e898 with executor java.util.concurrent.Executors
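The thread is about the Java client, but for comparison, a sketch of the same collect-and-republish flow with a clean shutdown in the Python client, where cancelling the streaming pull future plays the role of stopAsync(); project, subscription, and topic names are placeholders.

from concurrent.futures import TimeoutError
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()
source_sub = subscriber.subscription_path("my-project", "unfinished-sub")
target_topic = publisher.topic_path("my-project", "finished-topic")

def callback(message):
    # Forward the payload and attributes to the other topic, then ack the original.
    publisher.publish(target_topic, message.data, **dict(message.attributes))
    message.ack()

streaming_pull_future = subscriber.subscribe(source_sub, callback=callback)
try:
    # Pull for a bounded time, then stop cleanly.
    streaming_pull_future.result(timeout=30)
except TimeoutError:
    streaming_pull_future.cancel()
    streaming_pull_future.result()  # block until the shutdown completes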

Run synchronous pull in Google Cloud Pub/Sub with the Python client API

限于喜欢 submitted on 2019-12-08 16:55:24
Question: I can't find the returnImmediately flag in the Python client API. Is there any specific reason for that? Is there another way to pull queued messages synchronously from a subscription in Python? Answer 1: Google doesn't provide something like this, but you can easily work around it by implementing your own Queue:

from Queue import Queue
from google.cloud import pubsub

subscriber = pubsub.SubscriberClient()
topic = "projects/newproject-xxxxx/topics/tarunlalwani"
subscription_name = 'projects
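For what it's worth, later releases of the Python client expose a synchronous pull directly (the underlying pull request also carries a return_immediately option, now deprecated), so the hand-rolled queue may no longer be needed. A minimal sketch, with project and subscription names as placeholders:

from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "my-sub")

# Synchronous pull: returns up to max_messages that are currently available.
response = subscriber.pull(request={"subscription": subscription_path, "max_messages": 10})
for msg in response.received_messages:
    print(msg.message.data)

ack_ids = [msg.ack_id for msg in response.received_messages]
if ack_ids:
    subscriber.acknowledge(request={"subscription": subscription_path, "ack_ids": ack_ids})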