google-cloud-pubsub

Google API: Gmail Service Push Notification (Watch) - User not authorized to perform this action

。_饼干妹妹 submitted on 2019-12-06 10:52:34
Question: I am trying to call watch() on a mailbox. I set up an IAM service account, and created a topic and subscription. I gave my service account full (owner) rights to the topic and subscription, but when calling execute on watch() I get the error: Google.Apis.Requests.RequestError Error sending test message to Cloud PubSub projects/projectid/topics/topicname : User not authorized to perform this action. [403] Errors [ Message[Error sending test message to Cloud PubSub projects/projectid/topics
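For context on the likely cause: Gmail does not publish watch() notifications as your own service account; it publishes through the Google-owned account gmail-api-push@system.gserviceaccount.com, which needs the Publisher role on the topic. A minimal sketch of that grant with the Python client library (google-cloud-pubsub 2.x request style), with the question's project and topic names as placeholders:

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("projectid", "topicname")  # placeholders

policy = publisher.get_iam_policy(request={"resource": topic_path})
policy.bindings.add(
    role="roles/pubsub.publisher",
    # Gmail sends watch() notifications via this Google-owned account, so
    # granting your own service account Owner on the topic is not enough.
    members=["serviceAccount:gmail-api-push@system.gserviceaccount.com"],
)
publisher.set_iam_policy(request={"resource": topic_path, "policy": policy})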

How to use existing PubSub Subscription with Google-Provided PubSub to BigQuery Dataflow Template

早过忘川 submitted on 2019-12-06 09:24:11
I am trying to set up a Dataflow job using the Google-provided template PubSub to BigQuery. I see an option to specify the Cloud Pub/Sub input topic, but I don't see any option to specify a Pub/Sub input subscription in the GCP console UI. If I provide the topic, the job automatically creates a subscription to read messages from it. The problem with this is that the job only sees messages published after the Dataflow job has started; anything published to the topic before that is ignored. I don't have any complex transformations to do in my job. So the google
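One workaround is to skip the template and run a small custom pipeline that reads the existing subscription directly with beam.io.ReadFromPubSub. A minimal sketch in Python, assuming placeholder project, subscription, and table names and a trivial one-column schema:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Sketch only: stream an existing subscription into BigQuery without the
# Google-provided template. All resource names below are placeholders.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (p
     | "ReadFromSubscription" >> beam.io.ReadFromPubSub(
           subscription="projects/my-project/subscriptions/my-subscription")
     | "Decode" >> beam.Map(lambda payload: {"payload": payload.decode("utf-8")})
     | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
           "my-project:my_dataset.my_table",
           schema="payload:STRING",
           write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))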

Securing PubSub push endpoints in node app engine?

我们两清 submitted on 2019-12-06 08:11:10
I'm using Pub/Sub to push messages into an App Engine app written in Node on the flexible environment. Is there a way I can limit my endpoints to traffic from Pub/Sub only? In the standard environment, App Engine has handlers that can define admin-only requests and secure endpoints; however, this functionality is not available in the flexible environment. Is it possible to set up firewall rules for Google requests only (the firewall appears to be application-wide, not per endpoint), is there a standard method to secure endpoints, or do I need to roll my own solution? Turns out Google has posted a
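One long-standing pattern, predating Pub/Sub push authentication, is a shared secret embedded in the push endpoint URL when the subscription is created. Sketched here in Python/Flask for consistency with the other examples (the same check ports directly to an Express handler); the route and token are placeholders:

from flask import Flask, abort, request

app = Flask(__name__)
# Placeholder: generate a long random value and keep it out of source control.
PUSH_TOKEN = "replace-with-a-long-random-secret"

# Register the subscription's push endpoint as
#   https://<your-app>.appspot.com/pubsub/push?token=<PUSH_TOKEN>
# so only callers that know the token can deliver messages.
@app.route("/pubsub/push", methods=["POST"])
def pubsub_push():
    if request.args.get("token") != PUSH_TOKEN:
        abort(403)
    envelope = request.get_json()  # standard Pub/Sub push envelope
    # ... process envelope["message"] here ...
    return ("", 204)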

Google-Cloud: Jetty ALPN/NPN has not been properly configured

南笙酒味 submitted on 2019-12-06 07:21:41
Question: I'm getting an exception while using Google Pub/Sub to list topics; my web application is running on Tomcat.

public static List<String> listTopics(GcpCredentials gcCredentials, String project) throws GCPException, IOException {
    List<String> topics = new ArrayList<>();
    TopicAdminClient client = getTopicClient(gcCredentials);
    ProjectName projectName = ProjectName.create(project);
    ListTopicsPagedResponse response = client.listTopics(projectName);
    for (Topic topic : response.iterateAll()) {
        topics.add

Efficient Google PubSub Publishing

瘦欲@ submitted on 2019-12-06 04:08:28
The docs for Pub/Sub state that the maximum payload after decoding is 10 MB. My question is whether it is advantageous to compress the payload at the publisher before publishing, to increase data throughput. This can be especially helpful if the payload has a high compression ratio, like a JSON-formatted payload.

If you are looking for efficiency on Pub/Sub, I would first concentrate on using the best API, and that's the gRPC one. If you are using the client libraries, the chance is high that they use gRPC anyway. Why gRPC? gRPC is binary, and your payload doesn't need to go through hoops to be
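On the compression half of the question: Pub/Sub does not compress for you, but nothing stops the publisher from gzipping the bytes and flagging that in a message attribute. A sketch with the Python client, assuming a JSON payload and placeholder names:

import gzip
import json
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "my-topic")  # placeholders

payload = json.dumps({"event": "signup", "user": "123"}).encode("utf-8")
compressed = gzip.compress(payload)  # JSON usually compresses well

# The 10 MB limit applies to the message you publish, so compressing
# client-side stretches your effective throughput; the attribute tells
# subscribers they must decompress before parsing.
future = publisher.publish(topic_path, compressed, content_encoding="gzip")
print(future.result())  # message ID once the publish succeeds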

Google PubSub python client returning StatusCode.UNAVAILABLE

自古美人都是妖i submitted on 2019-12-06 02:19:19
Question: I am trying to establish a long-running pull subscription to a Google Cloud Pub/Sub topic. I am using code very similar to the example given in the documentation, i.e.:

def receive_messages(project, subscription_name):
    """Receives messages from a pull subscription."""
    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(
        project, subscription_name)

    def callback(message):
        print('Received message: {}'.format(message))
        message.ack()

    subscriber
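UNAVAILABLE on a long-running streaming pull is usually transient, so a common mitigation is to treat a dead stream as restartable rather than fatal. A sketch of that pattern built around the quoted receive_messages setup; the backoff interval is arbitrary:

import time
from google.api_core import exceptions
from google.cloud import pubsub_v1

def listen_forever(project, subscription_name):
    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(project, subscription_name)

    def callback(message):
        print('Received message: {}'.format(message))
        message.ack()

    while True:
        future = subscriber.subscribe(subscription_path, callback=callback)
        try:
            future.result()  # blocks until the stream dies
        except exceptions.ServiceUnavailable:
            future.cancel()
            time.sleep(5)  # back off briefly, then reopen the stream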

Beam / Dataflow Custom Python job - Cloud Storage to PubSub

大憨熊 submitted on 2019-12-05 19:23:15
I need to perform a very simple transformation on some data (extract a string from JSON), then write it to PubSub - I'm attempting to use a custom Python Dataflow job to do so. I've written a job which successfully writes back to Cloud Storage, but my attempts at even the simplest possible write to PubSub (no transformation) result in an error: JOB_MESSAGE_ERROR: Workflow failed. Causes: Expected custom source to have non-zero number of splits. Has anyone successfully written to PubSub from GCS via Dataflow? Can anyone shed some light on what is going wrong here?

def run(argv=None):
    parser =
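One detail worth checking: beam.io.WriteToPubSub is only supported in streaming pipelines, so a batch GCS-to-Pub/Sub job can fail in surprising ways. A sketch of the whole flow with streaming enabled, assuming a Beam version that allows the bounded GCS read in streaming mode; the bucket, field, and topic names are placeholders:

import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # WriteToPubSub needs streaming

with beam.Pipeline(options=options) as p:
    (p
     | "ReadFromGCS" >> beam.io.ReadFromText("gs://my-bucket/input/*.json")
     | "ExtractString" >> beam.Map(
           lambda line: json.loads(line)["message"].encode("utf-8"))
     | "WriteToPubSub" >> beam.io.WriteToPubSub(
           topic="projects/my-project/topics/my-topic"))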

Dataflow pipeline and pubsub emulator

你离开我真会死。 submitted on 2019-12-05 14:56:46
I'm trying to set up my development environment. Instead of using Google Cloud Pub/Sub in production, I've been using the Pub/Sub emulator for development and testing. To achieve this I set the following environment variable:

export PUBSUB_EMULATOR_HOST=localhost:8586

This worked for the Python google-cloud-pubsub library, but when I switched to Java Apache Beam for Google Dataflow, the pipeline still points at production Pub/Sub. Is there a setting, environment variable, or method on the pipeline that I need to set so that the pipeline reads from the local Pub/Sub emulator? I found the

Insufficient tokens for quota 'administrator' and limit 'CLIENT_PROJECT-100s' of service 'pubsub.googleapis.com' for consumer 'project_number:#'

有些话、适合烂在心里 submitted on 2019-12-05 12:52:51
Question: I sometimes get the following error when creating a subscription: Insufficient tokens for quota 'administrator' and limit 'CLIENT_PROJECT-100s' of service 'pubsub.googleapis.com' for consumer 'project_number:'. Waiting it out does the trick, but I'd like to increase the quota. In the IAM & Admin section of the Google Cloud Console I can filter on the Pub/Sub API, but I can't find the limit... Answer 1: You are running up against the quota for administrative operations. In the Quotas page, under
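Since the quota refills over time ("waiting it out does the trick"), a code-side complement to raising the limit is exponential backoff around the administrative call. A sketch assuming the quota error surfaces as RESOURCE_EXHAUSTED; the names and retry parameters are placeholders:

import time
from google.api_core import exceptions
from google.cloud import pubsub_v1

def create_subscription_with_backoff(project, topic, subscription, retries=5):
    subscriber = pubsub_v1.SubscriberClient()
    topic_path = subscriber.topic_path(project, topic)
    sub_path = subscriber.subscription_path(project, subscription)
    delay = 1.0
    for _ in range(retries):
        try:
            return subscriber.create_subscription(name=sub_path, topic=topic_path)
        except exceptions.ResourceExhausted:
            time.sleep(delay)  # let the administrative quota refill
            delay *= 2  # exponential backoff
    raise RuntimeError("out of retries creating subscription")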

Google Pubsub: UNAVAILABLE: The service was unable to fulfill your request

烂漫一生 submitted on 2019-12-05 11:06:45
I am using the Java library to subscribe to a subscription from my code. Using sbt:

"com.google.cloud" % "google-cloud-pubsub" % "0.24.0-beta"

I followed this guide to write a subscriber: https://cloud.google.com/pubsub/docs/pull

val projectId = "test-topic"
val subscriptionId = "test-sub"

def main(args: Array[String]): Unit = {
  val subscriptionName = SubscriptionName.create(projectId, subscriptionId)
  val subscriber = Subscriber.defaultBuilder(subscriptionName, new PastEventMessageReceiver()).build()
  subscriber.startAsync()
  System.in.read()
}

class PastEventMessageReceiver extends