google-cloud-pubsub

TypeScript import @google-cloud/pubsub

妖精的绣舞 submitted on 2019-12-24 08:41:34
Question: I want to import a non-TypeScript module into a TypeScript project. This module does not ship its own declarations or @types declarations, so I created my own declarations for it. But when I declare the module in the declaration file, I get the following error: Invalid module name in augmentation. Module '@google-cloud/pubsub' resolves to an untyped module at './node_modules/@google-cloud/pubsub/src/index.js', which cannot be augmented. I'm using TypeScript 2.2.2. Here is the complete

How to set up a Google Pub/Sub subscription to call a Firebase function

不想你离开。 submitted on 2019-12-24 07:49:07
Question: I want to have a push subscription, but when I try to add this Firebase function https://us-central1-myproject-dev.cloudfunctions.net/api/conversation as the endpoint URL, it says: The supplied URL is not registered in the subscription's parent project. Please see documentation on domain ownership validation. The problem is that I can't verify that https://us-central1-myproject-dev.cloudfunctions.net is my domain, because of course it is not. Any suggestions? Answer 1: I found a solution in case anybody else has
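As an illustration of the starting point, here is a minimal sketch of creating a push subscription that targets an HTTPS endpoint with the google-cloud-pubsub Python client (request-style API of version 2.x assumed). The project, topic and subscription names are placeholders built around the question, and this by itself does not resolve the domain-verification error described above.

```python
# A minimal sketch, assuming google-cloud-pubsub >= 2.x. All names below are
# placeholders; creating the subscription still requires Pub/Sub to accept
# the push endpoint.
from google.cloud import pubsub_v1

project_id = "myproject-dev"
topic_id = "conversation"               # hypothetical topic name
subscription_id = "conversation-push"   # hypothetical subscription name
endpoint = "https://us-central1-myproject-dev.cloudfunctions.net/api/conversation"

subscriber = pubsub_v1.SubscriberClient()
topic_path = subscriber.topic_path(project_id, topic_id)
subscription_path = subscriber.subscription_path(project_id, subscription_id)

push_config = pubsub_v1.types.PushConfig(push_endpoint=endpoint)

with subscriber:
    subscription = subscriber.create_subscription(
        request={
            "name": subscription_path,
            "topic": topic_path,
            "push_config": push_config,
        }
    )
    print(f"Created push subscription: {subscription.name}")
```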

How to modify Google Cloud Pub/Sub subscription acknowledgement deadline for background Cloud Function

天大地大妈咪最大 submitted on 2019-12-24 07:17:36
Question: When deploying a background Cloud Function for Cloud Pub/Sub via: gcloud functions deploy function_name --runtime python37 --trigger-topic some_topic a subscription gets created automatically with a push endpoint (likely an App Engine standard endpoint, but those are claimed not to need domain verification: https://cloud.google.com/pubsub/docs/push#other-endpoints). For the generated subscription/endpoint there doesn't seem to be a way to register/verify the domain (https://www
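For the acknowledgement-deadline part of the question, the deadline on an existing subscription can be raised either with gcloud (gcloud pubsub subscriptions update SUBSCRIPTION --ack-deadline=600) or programmatically. Below is a minimal sketch with the Python client; the project and subscription names are placeholders and google-cloud-pubsub >= 2.x is assumed.

```python
# A minimal sketch, assuming google-cloud-pubsub >= 2.x. The subscription name
# is a placeholder; the subscription auto-created for a background function is
# typically named after the function and can be found in the Pub/Sub console.
from google.cloud import pubsub_v1
from google.protobuf import field_mask_pb2

project_id = "myproject"
subscription_id = "gcf-function_name-some_topic"  # hypothetical auto-created name

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(project_id, subscription_id)

subscription = pubsub_v1.types.Subscription(
    name=subscription_path,
    ack_deadline_seconds=600,  # 600 s is the maximum Pub/Sub allows
)
update_mask = field_mask_pb2.FieldMask(paths=["ack_deadline_seconds"])

with subscriber:
    updated = subscriber.update_subscription(
        request={"subscription": subscription, "update_mask": update_mask}
    )
    print(f"Ack deadline is now {updated.ack_deadline_seconds} seconds")
```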

ImportError: cannot import name 'pubsub_v1' from 'google.cloud' (unknown location)

白昼怎懂夜的黑 submitted on 2019-12-24 06:11:15
Question: I am trying to import pubsub_v1 in a Cloud Function, but when I try to deploy it on GCP the error in the title comes up. The requirements.txt file is in the same directory as the main.py file. Here is what is in requirements.txt: google-api-core==1.3.0 google-auth==1.5.1 google-cloud-core==0.28.1 google-cloud-storage==1.10.0 google-resumable-media==0.3.1 googleapis-common-protos==1.5.3 google-api-python-client==1.7.4 oauth2client==4.1.2 google-cloud-bigquery==1.5.0 google-cloud
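One thing that stands out in the dependency list above is that the Pub/Sub client library itself (google-cloud-pubsub, which provides google.cloud.pubsub_v1) is not pinned. A hedged sketch of what the fix might look like follows; the version number is only an example, not a recommendation.

```python
# A minimal sketch, assuming the missing dependency is the cause. Add the
# Pub/Sub client to requirements.txt (the version below is only an example):
#
#     google-cloud-pubsub==1.1.0
#
# After that, the import used in main.py should resolve inside the function:
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("myproject", "some_topic")  # placeholder names
future = publisher.publish(topic_path, b"hello from the Cloud Function")
print(future.result())  # prints the published message ID
```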

Google Dataflow write to multiple tables based on input

自作多情 submitted on 2019-12-24 01:17:38
Question: I have logs which I am trying to push to Google BigQuery, and I am trying to build the entire pipeline using Google Dataflow. The log structure varies and can be classified into four different types. In my pipeline I read logs from Pub/Sub, parse them, and write them to a BigQuery table. The table the logs need to be written to depends on one parameter in the logs. The problem is that I am stuck on how to change the table name for BigQueryIO.Write at runtime. Answer 1: You can use side outputs. https
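The answer points at side outputs, a Java SDK pattern. As a complementary illustration only, here is a minimal sketch of per-element table routing in the Apache Beam Python SDK, where WriteToBigQuery accepts a callable that chooses the destination table; the project, topic, dataset and field names are placeholders.

```python
# A minimal sketch of dynamic table routing with the Beam Python SDK.
# All resource and field names are placeholders.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def route_to_table(row):
    # Pick the destination table from a field in the parsed log record.
    return f"myproject:logs.{row['log_type']}"

options = PipelineOptions(streaming=True)
with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(topic="projects/myproject/topics/logs")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Write" >> beam.io.WriteToBigQuery(
            table=route_to_table,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```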

Google pubsub golang subscriber stops receiving new published message(s) after being idle for a few hours

痴心易碎 submitted on 2019-12-24 00:11:37
Question: I created a TOPIC in Google Pub/Sub, and created a SUBSCRIPTION inside the TOPIC with the following settings, then I wrote a puller in Go, using its Receive to pull and acknowledge published messages: package main import ( ... ) func main() { ctx := context.Background() client, err := pubsub.NewClient(ctx, config.C.Project) if err != nil { // do things with err } sub := client.Subscription(config.C.PubsubSubscription) err := sub.Receive(ctx, func(ctx context.Context, msg *pubsub.Message) { msg

Autoscaling GCE instance groups based on a Cloud Pub/Sub queue

删除回忆录丶 submitted on 2019-12-23 22:51:07
Question: Can GCE instance groups be scaled up/down based on Google Cloud Pub/Sub queue counts or other asynchronous task queues such as PSQ? Answer 1: Yes! The feature is now in alpha: https://cloud.google.com/compute/docs/autoscaler/scaling-queue-based Answer 2: I haven't tried this myself, but looking at the documentation it looks possible to set up autoscaling against Pub/Sub message queue counts. This page [0] explains how to set up the autoscaler to scale based on a standard metric provided by the Cloud

Google PubSub error [code=8a75]

雨燕双飞 submitted on 2019-12-23 12:20:14
Question: Today I started getting this error sporadically. The Google Pub/Sub error codes page only talks about HTTP error codes. Does anyone know about this error? ERROR Error: The service was unable to fulfill your request. Please try again. [code=8a75] Answer 1: This error code is retryable and can be safely expected. The recommended solution is to make your code automatically retry with backoff, or to use one of the official client libraries, which automatically retry on these errors with backoff. In
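To illustrate the retry-with-backoff suggestion, here is a minimal sketch using the retry helper bundled with the Python client libraries. The exception types, timing values and resource names are assumptions for the example, not a statement of which errors map to code=8a75.

```python
# A minimal sketch of retrying a transient Pub/Sub error with exponential
# backoff, assuming a recent google-cloud-pubsub. Names and timings are
# placeholders chosen for illustration.
from google.api_core import retry
from google.api_core.exceptions import DeadlineExceeded, ServiceUnavailable
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("myproject", "my-topic")

custom_retry = retry.Retry(
    initial=1.0,       # first backoff delay in seconds
    maximum=60.0,      # cap on the backoff delay
    multiplier=2.0,    # exponential growth factor
    deadline=300.0,    # give up after 5 minutes overall
    predicate=retry.if_exception_type(ServiceUnavailable, DeadlineExceeded),
)

future = publisher.publish(topic_path, b"payload", retry=custom_retry)
print(future.result())  # prints the published message ID
```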

How to send custom data with Cloud Pub/Sub when GCS object is uploaded via a Signed URL

China☆狼群 submitted on 2019-12-23 05:44:10
Question: I was able to set up Google Cloud Storage Cloud Pub/Sub notifications using: gsutil notification create -t [TOPIC_NAME] -m my-key:my-value -f json gs://[BUCKET_NAME] My App Engine servlet correctly gets a message every time an object is uploaded to GCS. I upload my object to GCS with a Signed URL. However, I'm not sure how to set custom key-value pairs from my client when uploading an object with the Signed URL. The above gsutil command lets you set a key:value pair, but it hard-codes it so
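One way this is commonly approached, offered here only as a hedged sketch rather than as the accepted answer to the question above, is to bake per-upload custom object metadata into a V4 signed URL via x-goog-meta-* headers, which the uploading client must then send with its PUT. The bucket, object and metadata names below are placeholders.

```python
# A minimal sketch, assuming google-cloud-storage is installed and the signer
# has permission on the bucket. The client must send the same x-goog-meta-*
# header when it PUTs to the URL; the metadata then travels with the object.
import datetime
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")            # placeholder bucket
blob = bucket.blob("uploads/example.bin")      # placeholder object name

signed_url = blob.generate_signed_url(
    version="v4",
    expiration=datetime.timedelta(minutes=15),
    method="PUT",
    headers={"x-goog-meta-my-key": "my-value"},  # custom metadata, per upload
)
print(signed_url)
```

Metadata attached this way should then show up in the object resource carried by a JSON-format notification, unlike the -m flag on gsutil notification create, which is fixed for the whole notification config.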

Dart Error: error: import of dart:mirrors is not supported in the current Dart runtime

。_饼干妹妹 submitted on 2019-12-23 04:34:20
Question: I'm currently trying to write some mobile code with Flutter. I'm trying to publish/subscribe data to GCP Cloud Pub/Sub using the gcloud Dart library. Here is the code for main.dart: import 'dart:io'; import 'package:googleapis_auth/auth_io.dart' as auth; import 'package:http/http.dart' as http; import 'package:gcloud/db.dart'; import 'package:gcloud/storage.dart'; import 'package:gcloud/pubsub.dart'; import 'package:gcloud/service_scope.dart' as ss; import 'package:gcloud/src/datastore_impl