google-cloud-pubsub

io.grpc.StatusRuntimeException: PERMISSION_DENIED: User not authorized to perform this action

Submitted by 纵饮孤独 on 2020-04-05 06:48:12

Question: I am trying to implement a simple example of Spring Cloud Config + Spring Cloud Bus. I've implemented a client application and a config server application, and I've put the application.properties file in a separate repository. Everything looks correct on the server side, but on the client side I see this error when the application starts: org.springframework.cloud.stream.binder.BinderException: Exception thrown while building outbound endpoint at org.springframework.cloud.stream.binder

Spring dataflow and GCP Pub Sub

Submitted by 断了今生、忘了曾经 on 2020-03-04 16:41:22

Question: I'm building an event-driven microservice architecture that is supposed to be cloud agnostic (as much as possible). Since this is initially going into GCP and I don't want to spend a long time on configuration and all that, I was going to use GCP's Pub/Sub directly for the event queue and take care of other cloud implementations later, but then I came across Spring Cloud Data Flow, which seemed nice because these are Spring Boot microservices and I needed a way to orchestrate them. Does

Trigger Cloud Composer DAG with a Pub/Sub message

Submitted by 走远了吗. on 2020-02-25 04:13:14

Question: I am trying to create a Cloud Composer DAG that is triggered via a Pub/Sub message. There is the following example from Google, which triggers a DAG every time a change occurs in a Cloud Storage bucket: https://cloud.google.com/composer/docs/how-to/using/triggering-with-gcf However, at the beginning they say you can trigger DAGs in response to events, such as a change in a Cloud Storage bucket or a message pushed to Cloud Pub/Sub. I have spent a lot of time trying to figure out how that can be
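For reference, a minimal sketch of how Google's Cloud Storage example could be adapted to a Pub/Sub trigger: a background Cloud Function subscribed to the topic asks the Composer environment's Airflow webserver (experimental REST API) to start a DAG run. The webserver URL, DAG id and the token below are assumptions, and in practice the request has to be authenticated against the Identity-Aware Proxy that fronts the Airflow webserver, as Google's sample does.

```python
# Minimal sketch, not Google's full sample: a Pub/Sub-triggered Cloud Function
# that POSTs to the Airflow experimental API of a Composer environment.
import base64
import json
import requests

AIRFLOW_WEB_SERVER = 'https://example-tp.appspot.com'  # hypothetical Composer webserver URL
DAG_NAME = 'my_dag'                                     # hypothetical DAG id
IAP_TOKEN = 'eyJ...EXAMPLE'                             # hypothetical IAP/OpenID Connect token

def trigger_dag(event, context):
    """Entry point for a background Cloud Function triggered by Pub/Sub."""
    conf = base64.b64decode(event['data']).decode('utf-8') if 'data' in event else '{}'
    url = f'{AIRFLOW_WEB_SERVER}/api/experimental/dags/{DAG_NAME}/dag_runs'
    # Pass the Pub/Sub payload to the DAG run as its conf object.
    resp = requests.post(url,
                         headers={'Authorization': f'Bearer {IAP_TOKEN}'},
                         data=json.dumps({'conf': conf}))
    resp.raise_for_status()
```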

How do you access the message id from Google Pub/Sub using Apache Beam?

Submitted by 眉间皱痕 on 2020-02-24 12:21:19

Question: I have been testing Apache Beam using the 2.13.0 SDK on Python 2.7.16, pulling simple messages from a Google Pub/Sub subscription in streaming mode and writing them to a Google BigQuery table. As part of this operation, I'm trying to use the Pub/Sub message id for deduplication, but I can't seem to get it out at all. The documentation for the ReadFromPubSub method and the PubsubMessage type suggests that service-generated KVs such as id_label should be returned as part of the attributes property
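For context, a minimal sketch of reading with message attributes exposed, which is where the question expects the id to show up; the subscription path is a placeholder, and the id_label option on ReadFromPubSub is documented as supported only on the Dataflow runner.

```python
import apache_beam as beam
from apache_beam.io.gcp.pubsub import ReadFromPubSub
from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

# Hypothetical subscription path
SUBSCRIPTION = 'projects/my-project/subscriptions/my-subscription'

options = PipelineOptions()
options.view_as(StandardOptions).streaming = True

with beam.Pipeline(options=options) as p:
    (p
     # with_attributes=True yields PubsubMessage objects instead of raw bytes,
     # so each message's attributes dict can be inspected downstream.
     | 'Read' >> ReadFromPubSub(subscription=SUBSCRIPTION, with_attributes=True)
     | 'Inspect' >> beam.Map(lambda msg: print(msg.attributes)))
```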

Stream BigQuery table into Google Pub/Sub

Submitted by 独自空忆成欢 on 2020-02-03 07:23:43

Question: I have a Google BigQuery table and I want to stream the entire table into a Pub/Sub topic. What would be the easiest/fastest way to do it? Thank you in advance.

Answer 1: That really depends on the size of the table. If it's a small table (a few thousand records, a couple dozen columns) then you could set up a process to query the entire table, convert the response into a JSON array, and push it to Pub/Sub. If it's a big table (millions/billions of records, hundreds of columns) you'd have to export to file,
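A minimal sketch of the small-table approach described in the answer, using the google-cloud-bigquery and google-cloud-pubsub clients; the project, dataset, table and topic names are placeholders.

```python
import json
from google.cloud import bigquery, pubsub_v1

# Hypothetical project, table and topic names
PROJECT = 'my-project'
TABLE = 'my_dataset.my_table'
TOPIC = 'my-topic'

bq = bigquery.Client(project=PROJECT)
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, TOPIC)

# Query the whole table and publish each row as a JSON message.
# default=str keeps dates/timestamps serialisable; publish() returns a future.
for row in bq.query(f'SELECT * FROM `{PROJECT}.{TABLE}`').result():
    payload = json.dumps(dict(row), default=str).encode('utf-8')
    publisher.publish(topic_path, payload)
```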

How to publish to pub/sub with just an api key

Submitted by 僤鯓⒐⒋嵵緔 on 2020-01-25 07:18:30

Question: I need to publish messages to GCP Pub/Sub with a POST request, as the platform I'm using (Zoho) does not allow any of the GCP libraries. I'm not sure how to make the request in a simple way, as the normal authentication system seems complex. Is there an easy way to publish a message using, e.g., an API key? Alternatively, is there a simple way to create an API endpoint within GCP that I can then use to forward data on to the messaging system? I have used the Python client to publish to Pub/Sub,
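For reference, the Pub/Sub REST API does not accept a plain API key; a publish POST has to carry an OAuth 2.0 access token. A minimal sketch of what such a request looks like, with the project, topic and token as placeholders (the token would come from a service account or, for testing, from gcloud auth print-access-token):

```python
import base64
import json
import requests

# Hypothetical project/topic; the access token must come from a real credential,
# since Pub/Sub's REST API rejects API keys.
PROJECT = 'my-project'
TOPIC = 'my-topic'
ACCESS_TOKEN = 'ya29.EXAMPLE'  # e.g. from `gcloud auth print-access-token`

url = f'https://pubsub.googleapis.com/v1/projects/{PROJECT}/topics/{TOPIC}:publish'
# Message data must be base64-encoded in the request body.
body = {'messages': [{'data': base64.b64encode(b'hello from Zoho').decode('utf-8')}]}

resp = requests.post(url,
                     headers={'Authorization': f'Bearer {ACCESS_TOKEN}',
                              'Content-Type': 'application/json'},
                     data=json.dumps(body))
print(resp.status_code, resp.text)
```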

How to trigger a dataflow with a cloud function? (Python SDK)

Submitted by 北城余情 on 2020-01-25 06:49:27

Question: I have a Cloud Function that is triggered by Cloud Pub/Sub. I want the same function to trigger a Dataflow job using the Python SDK. Here is my code:

    import base64

    def hello_pubsub(event, context):
        if 'data' in event:
            message = base64.b64decode(event['data']).decode('utf-8')
        else:
            message = 'hello world!'
        print('Message of pubsub : {}'.format(message))

I deploy the function this way:

    gcloud beta functions deploy hello_pubsub --runtime python37 --trigger-topic topic1

Answer 1: You have to embed your pipeline
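A minimal sketch of what embedding the pipeline could look like, assuming apache-beam[gcp] is added to the function's requirements.txt; the project, region, bucket and sink below are placeholders, and the function simply builds the pipeline in code and submits it with the Dataflow runner.

```python
import base64

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def hello_pubsub(event, context):
    """Pub/Sub-triggered Cloud Function that submits a Dataflow job."""
    message = (base64.b64decode(event['data']).decode('utf-8')
               if 'data' in event else 'hello world!')

    options = PipelineOptions(
        runner='DataflowRunner',
        project='my-project',                # hypothetical
        region='europe-west1',               # hypothetical
        temp_location='gs://my-bucket/tmp',  # hypothetical
        job_name='triggered-from-cloud-function',
    )

    p = beam.Pipeline(options=options)
    (p
     | 'Create' >> beam.Create([message])
     | 'Write' >> beam.io.WriteToText('gs://my-bucket/output'))  # hypothetical sink
    p.run()  # submits the job to Dataflow; the function does not wait for it
```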

Apache Beam pipeline with PubSubIO error using Spark Runner PubsubUnboundedSource$PubsubReader.getWatermark(PubsubUnboundedSource.java:1030)

Submitted by 丶灬走出姿态 on 2020-01-25 06:42:08

Question: A Beam pipeline with PubSubIO runs fine with the Direct Runner and the Dataflow runner, but when I run it on the Spark Runner (a standalone Spark instance) I get a PubSubUnboundedSource error. This is the piece of code where I read from a GCP Pub/Sub subscription, parse the contents of the Pub/Sub message into an object with a DoFn, extract the event time from the object, and window the resulting PCollection into 20-second windows: // Take input from pubsub and make pcollections of

Does Google Pub/Sub provide queues or topics?

Submitted by 给你一囗甜甜゛ on 2020-01-24 00:28:07

Question: I am familiar with JMS and a novice with Google Pub/Sub. In JMS there are two options. Queue: only one consumer can accept a given message. Topic: each consumer accepts each message from the topic. I believe that Google Pub/Sub should support something like this, but a quick Googling didn't help me answer that question. Please point me to the corresponding part of the documentation.

Answer 1: As the name "Pub/Sub" indicates, Google Pub/Sub supports publish/subscribe semantics, which correspond to JMS topics.
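To make the mapping concrete, a small sketch (not part of the answer) using a recent google-cloud-pubsub client (2.x request-style calls): every subscription receives its own copy of each message, which gives JMS-topic behaviour, while several consumers sharing one subscription split the messages between them, which approximates a JMS queue. Project, topic and subscription names are placeholders.

```python
from google.cloud import pubsub_v1

PROJECT = 'my-project'  # hypothetical
publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()

topic_path = publisher.topic_path(PROJECT, 'orders')
publisher.create_topic(request={'name': topic_path})

# Two subscriptions -> JMS-topic behaviour: each subscription gets every message.
for name in ('billing', 'shipping'):
    sub_path = subscriber.subscription_path(PROJECT, name)
    subscriber.create_subscription(request={'name': sub_path, 'topic': topic_path})

# JMS-queue behaviour: run several workers against the *same* subscription,
# e.g. multiple processes each calling subscriber.subscribe(sub_path, callback=...),
# so each message is delivered to only one of them.
```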