google-cloud-pubsub

How to use the Google-provided template [Pub/Sub to Datastore]?

Submitted on 2019-12-25 00:17:53
Question: I want to use the Google-provided template that streams data from Pub/Sub to Datastore:
https://github.com/GoogleCloudPlatform/DataflowTemplates/blob/master/src/main/java/com/google/cloud/teleport/templates/PubsubToDatastore.java
I followed the steps described in this document: https://github.com/GoogleCloudPlatform/DataflowTemplates
This step passed: mvn clean && mvn compile
But at the next step, an error occurred:
[INFO] --- exec-maven-plugin:1.6.0:java (default-cli) @ google-cloud-teleport-java --- 2018-08
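The error excerpt cuts off before the cause, so it can't be diagnosed from this snippet alone. For illustration of the data path the template implements, here is a minimal sketch of the same Pub/Sub-to-Datastore flow as a plain Node.js subscriber, without Dataflow. This is not the template's code; the subscription name, entity kind, and JSON-payload assumption are all placeholders.

```javascript
// Hypothetical sketch: Pub/Sub-to-Datastore without Dataflow, as a
// plain subscriber. Names are placeholders, not the template's options.
const {PubSub} = require('@google-cloud/pubsub');
const {Datastore} = require('@google-cloud/datastore');

const datastore = new Datastore();
const subscription = new PubSub().subscription('my-sub'); // placeholder

subscription.on('message', async message => {
  const entity = {
    key: datastore.key(['Message', message.id]),     // placeholder kind
    data: JSON.parse(message.data.toString()),       // assumes JSON payloads
  };
  await datastore.save(entity); // upsert the entity
  message.ack();                // acknowledge only after the write succeeds
});
```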

SubscriberClient.pull - how to cancel a request?

Submitted on 2019-12-24 21:26:41
Question: Regarding this method:
public final PullResponse pull(SubscriptionName subscription, boolean returnImmediately, int maxMessages)
According to the documentation:
@param returnImmediately If this field set to true, the system will respond immediately even if there are no messages available to return in the Pull response. Otherwise, the system may wait (for a bounded amount of time) until at least one message is available, rather than returning no messages. The client may cancel the request if it does not wish to wait
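No answer is excerpted for this one. As an illustration of the two usual options, here is a minimal sketch in Node.js rather than Java (the underlying RPC is the same): a call timeout bounds how long the pull may wait, and in client versions where the unary call returns a cancellable (gax) promise, the request can be cancelled outright. All names are placeholders.

```javascript
// Hypothetical sketch: bound or cancel a unary pull. The {timeout}
// call option is in milliseconds; cancel() is available when the
// client library returns a cancellable (gax) promise.
const {v1} = require('@google-cloud/pubsub');

const client = new v1.SubscriberClient();

async function pullOnce() {
  const request = {
    subscription: client.subscriptionPath('my-project', 'my-sub'), // placeholders
    returnImmediately: false, // allow the server to wait for messages
    maxMessages: 10,
  };
  const pullPromise = client.pull(request, {timeout: 5000});
  // pullPromise.cancel(); // explicit cancellation, if your version supports it
  const [response] = await pullPromise;
  return response.receivedMessages;
}
```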

Write Pub/Sub data to GCS through Dataflow

Submitted on 2019-12-24 20:47:51
Question: I would like to consume data from Pub/Sub through a Dataflow streaming job and store it in GCS in hourly directories. What would be the best approach? I tried using WindowedFilenamePolicy, but it adds an additional group-by and slows down the write operation at write time. Dataflow buffers the data correctly but takes too long to write it to the temp bucket. Is there a best practice for such a fairly common case? Regards, Pari
Answer 1: Using the Google-Provided Dataflow Template for the streaming pipeline
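The answer excerpt cuts off, but it points at the Google-provided streaming template (Pub/Sub to Cloud Storage Text). A sketch of launching it from Node.js via the Dataflow REST API follows; the project, bucket, and topic are placeholders, and the hourly YYYY/MM/DD/HH directory pattern assumes the template's windowed filename policy resolves date placeholders in the output directory.

```javascript
// Hypothetical sketch: launch the hosted Pub/Sub-to-GCS-Text template.
// Names are placeholders; parameter keys follow the template's docs.
const {google} = require('googleapis');

async function launchPubsubToGcs() {
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  const dataflow = google.dataflow({version: 'v1b3', auth});
  await dataflow.projects.templates.launch({
    projectId: 'my-project',
    gcsPath: 'gs://dataflow-templates/latest/Cloud_PubSub_to_GCS_Text',
    requestBody: {
      jobName: 'pubsub-to-gcs-hourly',
      parameters: {
        inputTopic: 'projects/my-project/topics/my-topic',
        outputDirectory: 'gs://my-bucket/YYYY/MM/DD/HH/', // hourly directories (assumed pattern support)
        outputFilenamePrefix: 'output-',
        outputFilenameSuffix: '.txt',
      },
    },
  });
}

launchPubsubToGcs().catch(console.error);
```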

GAE - Node.js - Memory leak in a simple Pub/Sub app - Exceeded soft private memory limit

Submitted on 2019-12-24 20:15:36
Question: I wrote a simple App Engine app using Pub/Sub. Looking at the App Engine logs, I saw that memory usage constantly rises and drops, and this keeps recurring. When I looked at the logs, I got the error message below. Basically, what I am doing is: I have set up a cron task that triggers an Express route every minute, and the route publishes a message to Pub/Sub. For this simple task, I am seeing memory usage consistently increase from 89MB to 131MB, and on the next trigger it fails.
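The excerpt doesn't show the publishing code, but a common cause of this sawtooth-then-fail pattern is constructing a new Pub/Sub client (or topic object) on every request. A minimal sketch of the fix, assuming the @google-cloud/pubsub Node client; the route and topic names are placeholders.

```javascript
// Hypothetical sketch: create the client once per process, not per
// request, so repeated cron hits reuse the same gRPC connections.
const express = require('express');
const {PubSub} = require('@google-cloud/pubsub');

const app = express();
const pubsub = new PubSub();            // one client for the whole process
const topic = pubsub.topic('my-topic'); // placeholder topic name

app.get('/publish', async (req, res) => {
  try {
    const messageId = await topic.publish(Buffer.from('ping'));
    res.status(200).send(`Published ${messageId}`);
  } catch (err) {
    res.status(500).send(err.message);
  }
});

app.listen(process.env.PORT || 8080);
```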

Google Cloud Functions Cron Job Not Working

Submitted on 2019-12-24 19:49:36
Question: I am trying to set up a scheduled function in Firebase Cloud Functions. As a simple test, I have tried to recreate the sample shown on the documentation page:

    const functions = require('firebase-functions')
    exports.scheduledFunction = functions.pubsub
      .schedule('every 5 minutes')
      .onRun(context => {
        console.log('This will be run every 5 minutes!')
        return null
      })

However, when I run firebase serve --only functions, I get the following error: function ignored because the pubsub emulator does
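The error is cut off mid-sentence, but it indicates the local emulator is skipping the Pub/Sub-backed trigger. One way to exercise the handler locally is to invoke it directly; a sketch assuming the firebase-functions-test package, and that your version of it supports wrapping scheduled functions.

```javascript
// Hypothetical sketch: bypass the emulator and call the handler directly.
// Assumes the scheduled function above is exported from ./index.js.
const test = require('firebase-functions-test')();
const myFunctions = require('./index');

const wrapped = test.wrap(myFunctions.scheduledFunction);
wrapped({});    // invoke with an empty event payload
test.cleanup(); // tear down stubs created by firebase-functions-test
```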

Google Cloud Functions Cron Job for API Call

Submitted on 2019-12-24 19:49:03
Question: I am trying to set up a Firebase Cloud Function that regularly makes an API call to the Feedly API. However, it is not working and I'm not sure why. Here is the code:

    const functions = require('firebase-functions')
    const express = require('express')
    const fetch = require('node-fetch')
    const admin = require('firebase-admin')
    admin.initializeApp()
    const db = admin.firestore()
    const app = express()
    exports.getNewsArticles = functions.pubsub
      .schedule('every 5 minutes')
      .onRun(() => {
        app.get('
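The excerpt cuts off inside onRun, but registering an Express route there is the likely problem: a scheduled function receives no HTTP request, so an app.get(...) handler defined inside it never fires. A sketch of doing the work directly in onRun instead; the Feedly endpoint and Firestore collection name are placeholders.

```javascript
// Hypothetical sketch: call the API inside onRun and return the promise
// so Cloud Functions waits for completion. Endpoint and collection
// names are placeholders.
const functions = require('firebase-functions');
const fetch = require('node-fetch');
const admin = require('firebase-admin');

admin.initializeApp();
const db = admin.firestore();

exports.getNewsArticles = functions.pubsub
  .schedule('every 5 minutes')
  .onRun(async () => {
    const res = await fetch('https://cloud.feedly.com/v3/streams/contents?streamId=my-stream'); // placeholder
    const payload = await res.json();
    await db.collection('articles').add({fetchedAt: Date.now(), payload});
    return null;
  });
```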

Is it possible to define a schema for Google Pub/Sub topics like in Kafka with AVRO?

Submitted on 2019-12-24 15:54:42
Question: As far as I know, we can define Avro schemas on Kafka, and a topic defined with such a schema will only accept data that matches it. This is really useful for validating data structure before accepting it into the queue. Is there anything similar in Google Pub/Sub?
Answer 1: Kafka itself does not validate a schema, and topics therefore do not inherently have schemas beyond a pair of byte arrays plus some metadata. It's the serializer, part of the producing client, that performs the
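The answer excerpt is truncated, but its point is that validation happens in the producing client, not the broker. At the time of writing, Pub/Sub had no built-in schema support either, so the same approach applies: validate against an Avro schema client-side before publishing. A sketch using the avsc npm package; the record and topic names are placeholders.

```javascript
// Hypothetical sketch: client-side Avro validation before publishing,
// since the broker itself doesn't enforce a schema.
const avro = require('avsc');
const {PubSub} = require('@google-cloud/pubsub');

const type = avro.Type.forSchema({
  type: 'record',
  name: 'Click', // placeholder record
  fields: [
    {name: 'userId', type: 'string'},
    {name: 'ts', type: 'long'},
  ],
});

const topic = new PubSub().topic('clicks'); // placeholder topic

async function publish(event) {
  if (!type.isValid(event)) {
    throw new Error('event does not match the Avro schema');
  }
  await topic.publish(type.toBuffer(event)); // Avro binary payload
}
```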

Cloud Functions triggered by Cloud Pub/Sub: duplicate messages

Submitted on 2019-12-24 10:58:45
Question: I'm experimenting with using Cloud Functions as an async background worker triggered by Pub/Sub and doing somewhat longer work (on the order of minutes). The complete code is here: https://github.com/zdenulo/cloud-functions-pubsub My prototype inserts data into BigQuery and then waits a few minutes (to mimic a longer task). I publish 100 messages to the Pub/Sub topic (at 1-second intervals). It is emphasized that Pub/Sub can deliver the same message more than once, but I was surprised that from 10 to 40 out
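The excerpt is cut off, but those numbers suggest heavy redelivery, which gets worse when a function runs longer than the acknowledgement deadline. A common mitigation is to make the handler idempotent by deduplicating on the event ID; a sketch for a background function, using a Firestore collection as the dedup store (the collection name is a placeholder).

```javascript
// Hypothetical sketch: dedupe Pub/Sub-triggered executions on eventId.
// Firestore's create() fails if the document already exists, so only
// the first delivery of a given event does the real work.
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();

exports.worker = async (message, context) => {
  const marker = db.collection('processed-events').doc(context.eventId);
  try {
    await marker.create({at: Date.now()}); // throws if already processed
  } catch (err) {
    console.log(`Duplicate delivery of ${context.eventId}, skipping`);
    return;
  }
  // ... long-running work goes here ...
};
```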

Pub/Sub latency reaching minutes

Submitted on 2019-12-24 09:08:22
Question: I've been working on a project that uses the GCloud Pub/Sub platform on the Node.js flexible runtime, and for some reason it has shown some pretty severe latency that has been increasing over time. At first, only messages of a certain kind would sometimes experience heavy latency. However, as I've continued working over the past few days, all messages now experience latency of several minutes, regardless of type. It has reached the point where testing has become impossible
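The excerpt ends before any diagnosis, but steadily growing latency across all message types is a typical backlog symptom: the subscriber acknowledges more slowly than publishers produce, so delivery lag compounds. A hedged sketch of the subscriber-side knobs in the Node client; the subscription name and limits are placeholders.

```javascript
// Hypothetical sketch: cap in-flight messages and ack promptly so a
// backlog can drain instead of compounding delivery lag.
const {PubSub} = require('@google-cloud/pubsub');

const subscription = new PubSub().subscription('my-sub', { // placeholder name
  flowControl: {maxMessages: 50}, // don't buffer more than we can process
  ackDeadline: 30,                // seconds before an unacked message is redelivered
});

subscription.on('message', message => {
  handle(message)
    .then(() => message.ack())    // ack as soon as the work is done
    .catch(() => message.nack()); // let Pub/Sub redeliver on failure
});

async function handle(message) {
  // ... actual processing of message.data goes here ...
}
```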

Gmail API: Watch INBOX Label Only

Submitted on 2019-12-24 08:50:09
Question: I have the code below to watch a mailbox. As shown, labelIds contains only one label, INBOX, because I want to listen only for new messages. However, when I run this, it receives notifications every 30s or so, each with a different message ID, yet no change occurred in the INBOX; no new item was added or removed. How do I set my watch body to listen only to the INBOX for incoming messages?

    certificate = new X509Certificate2("file.p12", "password", X509KeyStorageFlags.Exportable);
    credential = new
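No answer is excerpted, but two details of users.watch are worth noting: notifications fire for mailbox history changes in general unless the watch is filtered, and restricting them takes labelIds together with labelFilterAction. A sketch using the Node googleapis client (the question's code is C#, but the request body is the same); the topic name is a placeholder.

```javascript
// Hypothetical sketch: restrict push notifications to INBOX changes by
// combining labelIds with labelFilterAction: 'include'.
const {google} = require('googleapis');

async function watchInbox(auth) { // auth: an authorized OAuth2/JWT client
  const gmail = google.gmail({version: 'v1', auth});
  const res = await gmail.users.watch({
    userId: 'me',
    requestBody: {
      topicName: 'projects/my-project/topics/gmail-push', // placeholder
      labelIds: ['INBOX'],
      labelFilterAction: 'include',
    },
  });
  return res.data; // contains historyId and expiration
}
```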