google-cloud-pubsub

Google Pub/Sub Subscriber not receiving messages after a while

久未见 submitted on 2019-12-02 03:28:55
Question: I have a simple Python script that uses Google Pub/Sub to detect new files in Google Cloud Storage. The script simply adds new messages to a queue, where another thread processes those messages:

    subscriber = pubsub.SubscriberClient()
    subscription_path = subscriber.subscription_path(
        project, subscription_name)
    subscriber.subscribe(subscription_path, callback=callback_fun)

    while True:
        if not message_queue:
            time.sleep(60)
            continue
        else:
            process_next_message(message_queue.pop())

Here, callback …
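
A likely culprit, and a minimal sketch of the fix (project, subscription, and callback names are placeholders): subscribe() returns a StreamingPullFuture, and if nothing blocks on it, a stream error can kill the subscriber silently while the main loop keeps sleeping. Blocking on the future re-raises such errors:

    from google.cloud import pubsub_v1

    def callback_fun(message):
        print(message.data)  # placeholder for the queueing logic
        message.ack()

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path("my-project", "my-subscription")
    streaming_pull_future = subscriber.subscribe(subscription_path, callback=callback_fun)

    try:
        streaming_pull_future.result()  # blocks; re-raises background stream errors
    except Exception:
        streaming_pull_future.cancel()  # shut the stream down cleanly
        raise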

Google Cloud Function - ImportError: cannot import name 'pubsub' from 'google.cloud' (unknown location)

被刻印的时光 ゝ submitted on 2019-12-02 02:34:48
Question: I am deploying a Google Cloud Function that will kick off other Google Cloud Functions using google.cloud.pubsub_v1, and I'm getting this error:

    ImportError: cannot import name 'pubsub' from 'google.cloud' (unknown location)

The beginning of my requirements.txt file looks like this:

    # Function dependencies, for example:
    # package>=version
    google-cloud-pubsub
    ....

The beginning of my main.py script looks like this:

    import base64
    import json
    from google.cloud import pubsub_v1
    publisher = pubsub_v1 …
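
A hedged sketch of the commonly suggested remedy (the exact version below is illustrative, not taken from the original thread): pin the dependency explicitly so the Cloud Functions builder installs a complete, consistent google.cloud namespace, and import the versioned module:

    # requirements.txt
    google-cloud-pubsub==1.0.2

    # main.py
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()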

How do I “create”/“assign” a logging handler for Google Cloud Pubsub?

孤街醉人 submitted on 2019-12-02 02:10:49
Question: Development from the previous thread found that the assumptions made when asking that question were off-topic (subprocess was actually not causing the problems), so I'm making a more focused post.

My error message:

    No handlers could be found for logger "google.cloud.pubsub_v1.subscriber._protocol.streaming_pull_manager"

My intent: pass on Google Pub/Sub message attributes as Python variables for re-use in later code.

My code:

    import time
    import logging
    from google.cloud import pubsub_v1

    project_id = "redacted"
    subscription_name = "redacted"

    def receive_messages_with_custom_attributes(project_id, …
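
A minimal sketch of the standard fix: that message means no handler is attached anywhere up the logger hierarchy, so install one on the root logger (or directly on the logger named in the error):

    import logging

    # Attach a default StreamHandler to the root logger.
    logging.basicConfig(level=logging.INFO)

    # Optionally quiet the chatty pubsub internals separately:
    logging.getLogger(
        "google.cloud.pubsub_v1.subscriber._protocol.streaming_pull_manager"
    ).setLevel(logging.WARNING)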

Unable to configure Google Cloud Pub/Sub push subscriber

南楼画角 submitted on 2019-12-01 23:23:45
Question: I have a Google Cloud project consisting of a Compute Engine instance which I want to configure as a push subscriber of the Cloud Pub/Sub service. I have set up an Apache web server with a self-signed certificate on the instance and have also made a DNS entry (abc.mydomain.com) which points to the instance, which has a static IP address. I am already a verified owner of the domain (mydomain.com) on Webmasters. Whenever I add the subscription from the Cloud Console, it fails with the error: "The subscription could not be added" and does not show any other useful information. Please help.

Answer 1 (Takashi Matsuo): …
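
One likely cause, plus a hedged sketch of creating the subscription programmatically (project, topic, and endpoint names are placeholders): Pub/Sub push endpoints must serve HTTPS with a certificate signed by a recognized CA, so the self-signed certificate described above would by itself make the endpoint unusable.

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    topic_path = subscriber.topic_path("my-project", "my-topic")
    subscription_path = subscriber.subscription_path("my-project", "my-push-sub")

    push_config = pubsub_v1.types.PushConfig(
        push_endpoint="https://abc.mydomain.com/push-handler")
    subscriber.create_subscription(
        subscription_path, topic_path, push_config=push_config)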

At what stage does Dataflow/Apache Beam ack a pub/sub message?

ⅰ亾dé卋堺 submitted on 2019-12-01 23:15:38
Question: I have a Dataflow streaming job with a Pub/Sub subscription as an unbounded source. I want to know at what stage Dataflow acks the incoming Pub/Sub message. It appears to me that the message is lost if an exception is thrown during any stage of the Dataflow pipeline. I'd also like to know the best practices for writing a Dataflow pipeline with a Pub/Sub unbounded source so that messages can be recovered on failure. Thank you!

Answer 1: The Dataflow Streaming Runner acks Pub/Sub messages received by a bundle after the bundle has succeeded and the results of the bundle (outputs, state mutations, etc.) have been …
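
One widely used mitigation, sketched below as an illustration rather than the answer's own code: catch exceptions inside the DoFn and divert failing elements to a dead-letter output, so a poison message never fails the bundle (and therefore never blocks acking). parse() stands in for hypothetical per-element logic.

    import apache_beam as beam

    class SafeParse(beam.DoFn):
        DEAD_LETTER = 'dead_letter'

        def process(self, element):
            try:
                yield parse(element)  # hypothetical per-element logic
            except Exception:
                # Route the raw element to the dead-letter output instead
                # of throwing and failing the whole bundle.
                yield beam.pvalue.TaggedOutput(self.DEAD_LETTER, element)

    # Usage: outputs = msgs | beam.ParDo(SafeParse()).with_outputs(
    #     SafeParse.DEAD_LETTER, main='parsed')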

Watch request in Gmail API doesn't work

血红的双手。 submitted on 2019-12-01 19:56:59
Question: I am trying to make a watch request using Python, as described in the Google API docs, but it does not work:

    request = {
        'labelIds': ['INBOX'],
        'topicName': 'projects/myproject/topics/mytopic'
    }
    gmail.users().watch(userId='me', body=request).execute()

I could not find a library or a package providing the gmail.users() function. How do I make a watch request using an access token?

Answer 1: Do it with the Gmail Python client (provided by Google), under the main function:

    request = {
        'labelIds': ['INBOX'],
        'topicName': 'projects/myprojects/topics/getTopic'
    }
    print(service.users().watch(userId='me', body=request).execute())

const …
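
A hedged sketch of building the service object the answer relies on, assuming you already hold a valid OAuth2 access token with a Gmail scope (the token value is a placeholder):

    from google.oauth2.credentials import Credentials
    from googleapiclient.discovery import build

    creds = Credentials(token="ya29.a0...")  # placeholder access token
    service = build('gmail', 'v1', credentials=creds)

    request = {
        'labelIds': ['INBOX'],
        'topicName': 'projects/myproject/topics/mytopic',
    }
    print(service.users().watch(userId='me', body=request).execute())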

Unable to publish messages to GCP Pub/Sub using the Python SDK when executed via cron inside a GKE pod

二次信任 submitted on 2019-12-01 09:56:13
Question: I am using the Python SDK to publish messages to GCP Pub/Sub. The code is running inside a Kubernetes pod on GKE.

    import pymysql
    import os
    import argparse
    import time
    from google.cloud import pubsub_v1

    entries = ['jelly']

    def publish_messages(project, topic_name):
        publisher = pubsub_v1.PublisherClient()
        topic_path = publisher.topic_path(project, topic_name)
        for n in entries:
            data = u'Message number {}'.format(n)
            data = data.encode('utf-8')
            publisher.publish(topic_path, data=data)
            print "Message %s …
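
One plausible failure mode, offered as an assumption rather than a confirmed diagnosis from this thread: publish() batches messages in the background and returns a future, so a short-lived cron process can exit before anything is actually sent. A minimal sketch (names are placeholders) that blocks until delivery:

    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path('my-project', 'my-topic')  # placeholders

    future = publisher.publish(topic_path, data=b'Message number jelly')
    # result() blocks until the server accepts the message (or raises).
    print('Published message id: %s' % future.result(timeout=60))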