google-cloud-pubsub

Google pubsub into HTTP triggered cloud function?

Submitted by 为君一笑 on 2020-01-23 08:12:13
Question: Is it possible to trigger an HTTP cloud function in response to a Pub/Sub message? When editing a subscription, Google makes it possible to push the message to an HTTPS endpoint, but to prevent abuse you have to prove that you own the domain in order to do this, and of course you can't prove that you own Google's own *.cloudfunctions.net domain, which is where the functions get deployed. The particular topic I'm trying to subscribe to is a public one, projects/pubsub-public-data/topics
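One workaround worth sketching (not from the original question; all names below are placeholders): create a pull subscription in your own project against the public topic and process the messages yourself, forwarding them wherever you need.

    from google.cloud import pubsub_v1

    # Placeholder names -- substitute your project and the public topic
    # (its full name is truncated in the question above).
    PROJECT_ID = "your-project"
    PUBLIC_TOPIC = "projects/pubsub-public-data/topics/<topic-name>"

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path(PROJECT_ID, "my-sub")

    # The subscription lives in *your* project, so no domain verification
    # is involved; it simply pulls from the public topic.
    subscriber.create_subscription(name=subscription_path, topic=PUBLIC_TOPIC)

    def callback(message):
        print(message.data)  # e.g. call your HTTP function from here
        message.ack()

    # Blocks until cancelled; messages are handled in the callback.
    subscriber.subscribe(subscription_path, callback=callback).result()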

Beam/Google Cloud Dataflow ReadFromPubsub Missing Data

Submitted by 守給你的承諾、 on 2020-01-23 03:32:07
Question: I have 2 Dataflow streaming pipelines (Pub/Sub to BigQuery) with the following code:

    class transform_class(beam.DoFn):

        def process(self, element, publish_time=beam.DoFn.TimestampParam, *args, **kwargs):
            logging.info(element)
            yield element

    class identify_and_transform_tables(beam.DoFn):
        # Adding publish timestamp.
        # Since I'm reading from a topic that consists of data from multiple tables,
        # the function here is to identify the tables and split them apart.

    def run(pipeline_args=None):
        # `save_main_session`
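For reference, here is a minimal, self-contained sketch of the same read-and-log pattern (the subscription name is a placeholder; this is not the asker's actual pipeline):

    import logging
    import apache_beam as beam
    from apache_beam.io.gcp.pubsub import ReadFromPubSub
    from apache_beam.options.pipeline_options import PipelineOptions

    class LogWithPublishTime(beam.DoFn):
        def process(self, element, publish_time=beam.DoFn.TimestampParam):
            # publish_time is the Pub/Sub publish timestamp attached by the source.
            logging.info("element=%s publish_time=%s", element, publish_time)
            yield element

    def run():
        options = PipelineOptions(streaming=True)
        with beam.Pipeline(options=options) as p:
            (p
             | "Read" >> ReadFromPubSub(
                   subscription="projects/<project>/subscriptions/<sub>")
             | "Log" >> beam.ParDo(LogWithPublishTime()))

    if __name__ == "__main__":
        logging.getLogger().setLevel(logging.INFO)
        run()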

Google PubSub - getting last message

Submitted by 浪子不回头ぞ on 2020-01-17 06:54:38
Question: I am beginning to work with Google App Engine and the services it offers. One service that I am particularly interested in is Cloud Pub/Sub. My plan is to have a Node.js socket.io server which subscribes to some Pub/Sub topic and, whenever the topic receives a publish, sends that publish to all sockets. At the other end, I would have a .NET/Java service that publishes messages to various topics. My question is this: is there a way to get the last message that was published to a topic?
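A subscription only receives messages published after it exists, so Pub/Sub has no built-in "give me the latest message" call for a fresh subscriber. A common workaround (a sketch with placeholder names, shown in Python; the same idea applies to the Node.js socket server) is to cache the most recent message in the server process itself:

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    subscription_path = subscriber.subscription_path("your-project", "your-sub")

    # In-process cache of the latest publish; hand this to newly
    # connected sockets before streaming live messages to them.
    last_message = {"data": None}

    def callback(message):
        last_message["data"] = message.data
        message.ack()

    subscriber.subscribe(subscription_path, callback=callback)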

How to log incoming messages in apache beam pipeline

Submitted by 为君一笑 on 2020-01-16 19:06:49
Question: I am writing a simple Apache Beam streaming pipeline, taking input from a Pub/Sub topic and storing it into BigQuery. For hours I thought I was not even able to read a message, as I was simply trying to log the input to the console:

    events = p | 'Read PubSub' >> ReadFromPubSub(subscription=SUBSCRIPTION)
    logging.info(events)

When I write this to text it works fine! However, my call to the logger never happens. How do people develop / debug these streaming pipelines? I have tried adding the
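The likely explanation (not spelled out in the truncated question): events is a PCollection, so logging.info(events) runs exactly once, at pipeline-construction time, and never sees any messages. To log each element you need a transform that runs on the workers, e.g. (a minimal sketch):

    import logging
    import apache_beam as beam

    def log_and_passthrough(element):
        logging.info("received: %s", element)  # runs per element, on the workers
        return element

    logged = events | 'Log events' >> beam.Map(log_and_passthrough)

Note that on Dataflow these log lines appear in the worker logs in Cloud Logging, not in your local console.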

Avro message for Google Cloud Pub-Sub?

Submitted by 一世执手 on 2020-01-15 10:23:16
Question: What is the best data format for publishing to and consuming from Pub/Sub? I am looking at the Avro message format because it is binary. The use case: real-time microservice applications publishing Avro messages to Pub/Sub. Given that Avro is best suited to batching up messages (with a schema attached to the binary payload) and then publishing them, would it be a suitable format for this use case involving microservices? Answer 1: Google Cloud
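For the single-message case the question is really about, Avro's "schemaless" encoding is the usual fit: the schema is not shipped with each message, so publisher and consumer must agree on it out of band. A sketch using the fastavro library (an assumption, not mentioned in the thread; the schema below is hypothetical):

    import io

    import fastavro
    from google.cloud import pubsub_v1

    # Hypothetical record schema for illustration.
    SCHEMA = fastavro.parse_schema({
        "type": "record",
        "name": "Event",
        "fields": [{"name": "foo", "type": "string"}],
    })

    def encode(record):
        # Schemaless writing: only the record bytes, no embedded schema.
        buf = io.BytesIO()
        fastavro.schemaless_writer(buf, SCHEMA, record)
        return buf.getvalue()

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path("your-project", "your-topic")
    publisher.publish(topic_path, encode({"foo": "bar"}))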

Consumer example for Google Pub/Sub in C++

Submitted by £可爱£侵袭症+ on 2020-01-15 09:59:42
Question: I am trying to play around with Google Pub/Sub and I need to integrate it into a C++ code base. As there is no native support for Google Pub/Sub in C++, I am using it through gRPC. Thus, I have generated the corresponding pubsub.grpc.pb.h, pubsub.grpc.pb.cc, pubsub.pb.h and pubsub.pb.cc files via protoc. Question part: because of the lack of documentation, it would be very helpful to have an example in C++. I have found an example for the publisher part, but not for the subscriber part. I tried to dive into
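A minimal synchronous-pull sketch against the stubs generated from google/pubsub/v1/pubsub.proto (the subscription name is a placeholder, and error handling is reduced to a bare status check):

    #include <grpcpp/grpcpp.h>
    #include "pubsub.grpc.pb.h"

    int main() {
        // Authenticated channel to the Pub/Sub endpoint.
        auto channel = grpc::CreateChannel("pubsub.googleapis.com",
                                           grpc::GoogleDefaultCredentials());
        auto stub = google::pubsub::v1::Subscriber::NewStub(channel);

        google::pubsub::v1::PullRequest request;
        request.set_subscription("projects/<project>/subscriptions/<sub>");
        request.set_max_messages(10);

        google::pubsub::v1::PullResponse response;
        grpc::ClientContext context;
        grpc::Status status = stub->Pull(&context, request, &response);
        if (!status.ok()) return 1;

        // Collect ack IDs while reading the payloads.
        google::pubsub::v1::AcknowledgeRequest ack;
        ack.set_subscription(request.subscription());
        for (const auto& received : response.received_messages()) {
            // received.message().data() holds the payload bytes.
            ack.add_ack_ids(received.ack_id());
        }
        if (ack.ack_ids_size() > 0) {
            grpc::ClientContext ack_context;
            google::protobuf::Empty empty;
            stub->Acknowledge(&ack_context, ack, &empty);
        }
        return 0;
    }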

Google PubSub Simultaneous Publish Requests

Submitted by 坚强是说给别人听的谎言 on 2020-01-14 19:29:07
Question: In Google Pub/Sub, the publish call from the client can be made asynchronously. Because of this, I would think it would be possible to have multiple publish requests triggered and sent to the server all at the same time, especially if the batch thresholds are too low. If this is true, how does the Pub/Sub client control the number of simultaneous publish requests that can be created? Is there a hard limit, or an error that can occur if too many requests are created? Is this the intended
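One concrete knob worth showing (a Python sketch; the thresholds are illustrative, not recommendations): the batching settings decide when a publish RPC is cut, so raising them reduces how many requests are created for a given message rate:

    from google.cloud import pubsub_v1

    # A request is sent when ANY of these thresholds is reached.
    batch_settings = pubsub_v1.types.BatchSettings(
        max_messages=500,       # ...this many messages, or
        max_bytes=1024 * 1024,  # ...this many bytes, or
        max_latency=0.05,       # ...this many seconds of waiting.
    )
    publisher = pubsub_v1.PublisherClient(batch_settings)

    topic_path = publisher.topic_path("your-project", "your-topic")
    futures = [publisher.publish(topic_path, f"message {i}".encode())
               for i in range(1000)]
    for f in futures:
        f.result()  # each future resolves to the server-assigned message ID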

Batching PubSub requests

Submitted by 点点圈 on 2020-01-13 20:23:30
Question: The Node.js example code for batching Pub/Sub requests looks like this:

    // Imports the Google Cloud client library
    const PubSub = require(`@google-cloud/pubsub`);

    // Creates a client
    const pubsub = new PubSub();

    /**
     * TODO(developer): Uncomment the following lines to run the sample.
     */
    // const topicName = 'your-topic';
    // const data = JSON.stringify({ foo: 'bar' });
    // const maxMessages = 10;
    // const maxWaitTime = 10000;

    // Publishes the message as a string, e.g. "Hello, world!" or JSON
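The sample is cut off above; presumably it goes on to configure batching on the topic handle. In the Node client that looks roughly like this (a sketch reusing the sample's variable names, not the original sample's exact code):

    // Batching options: a request is sent once either threshold is hit.
    const dataBuffer = Buffer.from(data);
    const batchPublisher = pubsub.topic(topicName, {
      batching: {
        maxMessages: maxMessages,     // flush after this many messages...
        maxMilliseconds: maxWaitTime, // ...or after this much time
      },
    });
    batchPublisher.publish(dataBuffer)
      .then((messageId) => console.log(`Message ${messageId} published.`));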

Listing the Pub/Sub subscription consumers

Submitted by 放肆的年华 on 2020-01-11 11:51:50
Question: It is possible for a service to long-poll a Pub/Sub subscription. That obviously requires a TCP connection to be constantly open between the Pub/Sub service and the client. Is there any way to find out whether a certain subscription has that TCP connection open? Answer 1: There is no way to list all consumers of a Pub/Sub subscription, no. This would be tough, as there are three different types of subscribers: push, pull, and streaming pull. Only the last one maintains an open connection to
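While consumers can't be listed, you can at least inspect the subscription's configuration to tell whether delivery is push or pull (a Python sketch with placeholder names):

    from google.cloud import pubsub_v1

    subscriber = pubsub_v1.SubscriberClient()
    path = subscriber.subscription_path("your-project", "your-sub")

    sub = subscriber.get_subscription(subscription=path)
    # An empty push_endpoint means Pub/Sub is not pushing anywhere;
    # delivery happens via pull or streaming pull (if anyone is connected).
    if sub.push_config.push_endpoint:
        print("push subscription ->", sub.push_config.push_endpoint)
    else:
        print("pull / streaming-pull subscription")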