google-cloud-pubsub

Dataflow Template Cloud Pub/Sub Topic vs Subscription to BigQuery

自闭症网瘾萝莉.ら Submitted on 2019-11-30 20:30:47
Question: I'm setting up a simple proof of concept to learn some of the concepts in Google Cloud, specifically Pub/Sub and Dataflow. I have a Pub/Sub topic named greeting, and I've created a simple cloud function that publishes a message to that topic:

```javascript
const escapeHtml = require('escape-html');
const { Buffer } = require('safe-buffer');
const { PubSub } = require('@google-cloud/pubsub');

exports.publishGreetingHTTP = async (req, res) => {
  let name = 'no name provided';
  if (req.query && req.query.name) {
    name // …
```

GCP - Verify ownership of a cloud function https endpoint for a PubSub push

微笑、不失礼 Submitted on 2019-11-30 12:52:14
Pretty sure there's no way to do this, but it would be great to reach out and see if anyone else has any ideas. What I'm trying to do is this: I have two microservices hosted on Google Cloud Platform as cloud functions. My first microservice does its work and fires a Pub/Sub message on topic [x]. I'd like to set my second microservice up as a push subscriber to topic [x]. I know I can do this by deploying the second cloud function with a subscription trigger, but I don't want to do that, as there's no decent way to acknowledge/reject the message (see this post: Google Cloud Functions to only Ack Pub/Sub on success). …

Google Cloud Functions to only Ack Pub/Sub on success

扶醉桌前 Submitted on 2019-11-30 08:55:26
We are using a cloud function triggered by Pub/Sub to ensure delivery of an e-mail. Sometimes the e-mail service takes a long time to respond, and our cloud function terminates before we get an error back. Since the message has already been acknowledged, our e-mail gets lost. The cloud function appears to be acking the Pub/Sub message automatically when it is invoked. Is there a way to delay the ack until our code completes successfully? Alternatively, is there a way to catch timeouts and re-queue the message for delivery? Something else we could try? I heard from Google support that …

Writing to Google Cloud Storage from PubSub using Cloud Dataflow using DoFn

淺唱寂寞╮ Submitted on 2019-11-30 06:25:10
Question: I am trying to write Google Pub/Sub messages to Google Cloud Storage using Google Cloud Dataflow. I know that TextIO/AvroIO do not support streaming pipelines. However, I read in [1], in a comment by the author, that it is possible to write to GCS in a streaming pipeline from a ParDo/DoFn. I constructed a pipeline by following their article as closely as I could. I was aiming for this behaviour: messages written out in batches of up to 100 to objects in GCS (one per window pane), under a path that corresponds to the time the message was published, in dataflow-requests/[isodate-time]/[paneIndex]. …

GKE: Pubsub messages between pods with push subscribers

偶尔善良 Submitted on 2019-11-29 15:52:16
Question: I am using a GKE deployment with multiple pods, and I need to send and receive messages between pods. I want to use Pub/Sub push subscribers. I found that for push I need to configure HTTPS access for the subscriber pods: In order to receive push messages, you need a publicly accessible HTTPS server to handle POST requests. The server must present a valid SSL certificate signed by a certificate authority and routable by DNS. You also need to validate that you own the domain (or have equivalent access to …
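Assuming the HTTPS requirement above is met (for example via an Ingress with a CA-signed certificate), a push subscription between pods might be wired up like this; the topic, subscription, host, and path names are invented for illustration:

```javascript
// Pure helper: build the push configuration for a subscription.
// Pub/Sub only accepts HTTPS push endpoints.
function pushConfigFor(host, path) {
  return { pushEndpoint: `https://${host}${path}` };
}

// Sketch: create the push subscription programmatically. The client is
// required lazily so the helper above stays testable without credentials.
async function createPodSubscription() {
  const { PubSub } = require('@google-cloud/pubsub');
  await new PubSub()
    .topic('pod-events')
    .createSubscription('pod-events-push', {
      pushConfig: pushConfigFor('events.example.com', '/pubsub/push'),
    });
}
```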

Google Cloud Pub/Sub API - Push E-mail

落花浮王杯 Submitted on 2019-11-29 09:56:21
I'm using Node.js to create an app that gets a push from Gmail each time an e-mail is received, checks it against a third-party database in a CRM, and creates a new field in the CRM if the e-mail is found there. I'm having trouble using Google's new Cloud Pub/Sub, which seems to be the only way to get push from Gmail without constant polling. I've gone through the instructions here: https://cloud.google.com/pubsub/prereqs but I don't understand how exactly this is supposed to work from an app on my desktop. It seems that Pub/Sub can connect to a verified domain, but I can't get it to connect …
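One way around the verified-domain requirement for a desktop app is a pull subscription, where the client connects out to Google instead of exposing a public endpoint. A sketch, assuming a subscription named gmail-notifications and the documented Gmail notification payload (a small JSON object with emailAddress and historyId in message.data):

```javascript
// Pure helper: the Node client hands message.data to us as a Buffer;
// Gmail notifications put a JSON document in it.
function parseGmailNotification(data) {
  return JSON.parse(data.toString());
}

// Sketch: stream messages over a pull subscription; no HTTPS endpoint needed.
function listenForMail() {
  const { PubSub } = require('@google-cloud/pubsub');
  const subscription = new PubSub().subscription('gmail-notifications');
  subscription.on('message', (message) => {
    const { emailAddress, historyId } = parseGmailNotification(message.data);
    console.log(`new mail for ${emailAddress}, history ${historyId}`);
    message.ack(); // ack only after the CRM lookup has succeeded
  });
}
```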

google cloud pubsub node.js client not compatible with google cloud functions

[亡魂溺海] Submitted on 2019-11-29 04:54:39
Architecture: we have an architecture using two Pub/Sub topic/subscription pairs. Topic T1 is triggered by a cron job periodically (every 5 minutes, for example); subscription S1 is the trigger for our cloud function. Topic T2 serves as a queue for background jobs that are published by one of our services; subscription S2 is read by the cloud function on each execution to service the queued background jobs. This allows us to control how frequently the background jobs are serviced, independent of when they are added to the queue. The cloud function (triggered by S1) reads messages from S2 by pulling …
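A pull of this kind is often done with the lower-level v1.SubscriberClient, which supports synchronous pull and explicit acknowledge calls. A sketch, with the subscription name background-jobs standing in for S2 (ackIdsOf is a helper added here for testability):

```javascript
// Pure helper: collect the ack IDs from a pull response.
function ackIdsOf(receivedMessages) {
  return (receivedMessages || []).map((m) => m.ackId);
}

// Sketch: synchronously pull up to 10 queued jobs, service them, then ack.
async function drainQueue(projectId) {
  const { v1 } = require('@google-cloud/pubsub');
  const client = new v1.SubscriberClient();
  const subscription = client.subscriptionPath(projectId, 'background-jobs'); // S2
  const [response] = await client.pull({ subscription, maxMessages: 10 });
  for (const received of response.receivedMessages) {
    // Service the queued background job here.
    console.log(received.message.data.toString());
  }
  const ackIds = ackIdsOf(response.receivedMessages);
  if (ackIds.length) {
    await client.acknowledge({ subscription, ackIds });
  }
}
```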

Google Cloud Pub/Sub Push Messages - Empty POST

你说的曾经没有我的故事 Submitted on 2019-11-28 11:59:03
Question: I have successfully set up a topic and subscription in the Google Cloud Platform, verified my site with Google, and added the domain to GCP. Whenever I try to send a test message from https://console.cloud.google.com/cloudpubsub/topics/subscription_sync, the endpoint I configured receives something, but the POST variable is empty. Here is the code I have so far in PHP; it just does simple logging of the POST variable (which later shows up in my logs as empty): require …
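A likely cause (an assumption, since the question is truncated): Pub/Sub push delivers a JSON document, not form-encoded fields, so PHP's $_POST stays empty and the raw request body (php://input) has to be read and decoded instead. The same envelope parsing, sketched in Node for comparison:

```javascript
// Parse a Pub/Sub push envelope: { message: { data, messageId, ... }, subscription }.
// message.data is base64-encoded; decode it to recover the original payload.
function parsePushEnvelope(rawBody) {
  const envelope = JSON.parse(rawBody);
  return {
    messageId: envelope.message.messageId,
    data: Buffer.from(envelope.message.data, 'base64').toString(),
  };
}
```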