google-cloud-pubsub

com.google.cloud.pubsub.spi.v1.Publisher.publish is not sending data to PubSub

馋奶兔 submitted on 2019-12-12 10:58:07
Question: The call to the newer com.google.cloud.pubsub.spi.v1.Publisher.publish(pubsubMessage).get() hangs forever. I'm not sure what the problem is. Code snippet: com.google.cloud.pubsub.spi.v1.Publisher publisher = Publisher.defaultBuilder(TopicName.parse("projects/" + projectId + "/topics/" + topicName)) .setChannelProvider(TopicAdminSettings .defaultChannelProviderBuilder() .setCredentialsProvider(FixedCredentialsProvider.create(ServiceAccountCredentials.fromStream(new
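A common way this failure mode surfaces is blocking on the publish future with no timeout, which hides whatever channel or credential misconfiguration is actually at fault. Independent of client version or language, a minimal sketch of the bounded-wait pattern, using a plain `concurrent.futures.Future` as a stand-in for the future that `publish()` returns:

```python
from concurrent.futures import Future, TimeoutError

# Stand-in for the future returned by publisher.publish(); it is never
# resolved here, which mimics a publish that hangs forever.
publish_future = Future()

try:
    # Bound the wait instead of calling .get()/.result() with no timeout.
    message_id = publish_future.result(timeout=0.1)
except TimeoutError:
    # Hang detected: surface it and go inspect the channel provider /
    # credentials configuration instead of blocking the caller forever.
    message_id = None
```

With a timeout in place, a misconfigured publisher fails fast and visibly rather than appearing to "not send data".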

Google Cloud Pub/Sub - INVALID ARGUMENT error in Push subscription

こ雲淡風輕ζ submitted on 2019-12-12 10:29:18
Question: I created a topic and then tried to create a subscription with the "Push" delivery type. When I use the domain https://www.omnea.com/push/google-handler (the link doesn't exist), it works. However, when I use the URL https://apidev2.omnea.org/push/google-handler, it throws an INVALID_ARGUMENT error. Both have the same certificate authority. The only difference I see here is the domain and sub-domain. Is there any other reason why I receive this error? Answer 1: I also had a hard time figuring out

Google Pub/Sub push message not working for IAP enabled app engine

拈花ヽ惹草 submitted on 2019-12-12 09:20:01
Question: I am testing a very basic Pub/Sub subscription. The push endpoint is set to an app I deployed as a Python Flex service in App Engine. The service is in a project with Identity-Aware Proxy (IAP) enabled, configured to allow through users authenticated with our domain. I do not see any of the push requests being processed by my app. When I turn off the IAP protection, the requests are processed; when I turn it back on, they are no longer processed. Answer 1: Note

Need help creating a Gmail Pub/Sub notification service to SpreadsheetApp (Google Apps Script)

对着背影说爱祢 submitted on 2019-12-12 09:09:59
Question: I wish I didn't have to repost this question, but my boss pushed it up to high priority and I need help sorting it out. I'm trying to use a Google Apps Script to pull Pub/Sub notifications from an address on my G Suite domain (currently I'm testing on mine). Basically, I'm trying to accomplish what is described in all this material: 1) a great GitHub project from Spencer Easton (instructional video); 2) the Pub/Sub API for Gmail; 3) notification help; 4) real-time notifications; 5) endpoint documentation

Google Cloud Pubsub authentication error from App Engine

依然范特西╮ submitted on 2019-12-12 08:58:35
Question: We're having trouble publishing messages to a Google Cloud Pub/Sub topic from Google App Engine. Using the Application Default Credentials works perfectly locally, but once deployed on App Engine it gives the following error: <HttpError 403 when requesting https://pubsub.googleapis.com/v1/projects/our-project-id/topics/our-topic:publish?alt=json returned "The request cannot be identified with a project. Please pass a valid API key with the request."> I would assume that it will use the

Delay message processing and delete before processing

淺唱寂寞╮ submitted on 2019-12-12 08:24:33
Question: I need the ability to send push notifications for an action in a mobile app, but to wait up to, say, 10 seconds for the user to undo the action. Is it possible to delay the processing of a message published to a topic by 10 seconds, and then (sometimes, if the user does undo) delete the message within those 10 seconds so it isn't processed? Answer 1: It depends on whether you also write the subscribers or not. If you have control over your subscribers' code: in your Pub/Sub messages, add a timestamp for
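The answer's approach, a publish-time timestamp attribute plus an "undo" store the subscriber consults, can be sketched as pure decision logic. The attribute names `msg_id` and `publish_ts` and the undo set are illustrative assumptions, not part of the Pub/Sub API:

```python
import time

def decide(attrs, undone_ids, delay_s=10.0, now=None):
    """Return what the subscriber should do with a delivered message:
    "drop"    - the user undid the action: ack without side effects;
    "nack"    - the undo window hasn't elapsed yet: let Pub/Sub redeliver;
    "process" - the window passed and no undo arrived.
    """
    now = time.time() if now is None else now
    if attrs["msg_id"] in undone_ids:
        return "drop"
    if now - float(attrs["publish_ts"]) < delay_s:
        return "nack"
    return "process"
```

A real subscriber callback would map "nack" to `message.nack()` (so the message comes back later) and both other outcomes to `message.ack()`, performing the side effect only in the "process" case.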

Kafka PubSub Connector: Jetty ALPN/NPN has not been properly configured

自闭症网瘾萝莉.ら submitted on 2019-12-12 03:58:33
Question: I am using kafka_2.11-0.10.2.1 and the Pub/Sub connector provided by Google here. All I want to do is push data from a Kafka topic to a Pub/Sub one using a standalone connector. I followed all the steps I should have: produced the cps-kafka-connector.jar and added the cps-sink-connector.properties file to Kafka's config directory. The file looks like this:
name=CPSConnector
connector.class=com.google.pubsub.kafka.sink.CloudPubSubSinkConnector
tasks.max=10
topics=kafka_topic
cps.topic=pubsub_topic

Google pubsub to Google cloud storage

↘锁芯ラ submitted on 2019-12-12 03:04:58
Question: Is it possible for a bucket in Cloud Storage to receive data/messages from Pub/Sub? If yes, then how? Currently I am publishing messages to Pub/Sub and I want to use the pull delivery type (for that I have to provide an endpoint URL for the bucket, which I couldn't find anywhere). I found this somewhere in their docs, but it didn't work. Answer 1: No, sorry. GCS only accepts uploads of complete files via HTTP. You could build a small app that takes incoming Pub/Sub messages and uploads them as separate GCS

Please migrate off JSON-RPC and Global HTTP Batch Endpoints - Dataflow Template

﹥>﹥吖頭↗ submitted on 2019-12-12 01:15:16
Question: I received an email with the title above as the subject; it says it all. I'm not directly using the specified endpoint (storage@v1). The project in question is a postback catcher that funnels data into BigQuery: App Engine > Pub/Sub > Dataflow > Cloud Storage > BigQuery. A related question here indicates Dataflow might be using it indirectly. I'm only using the Cloud Pub/Sub to GCS Text template. What is the recommended course of action if I'm relying on a template? Answer 1: I think the warning may come from

How do I “create”/“assign” a logging handler for Google Cloud Pubsub?

落花浮王杯 submitted on 2019-12-11 19:29:54
Question: Discussion in the previous thread showed that the assumptions behind the original question were off-topic (subprocess was not actually causing the problem), so I'm making a more focused post. My error message: No handlers could be found for logger "google.cloud.pubsub_v1.subscriber._protocol.streaming_pull_manager" My intent: pass Google Pub/Sub message attributes on as Python variables for reuse in later code. My code:
import time
import logging
from google.cloud import pubsub_v1
project_id =
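That warning comes from Python 2's logging module and means no handler is configured anywhere up the logger hierarchy, so the library's records have nowhere to go. A minimal sketch of "creating/assigning" a handler with only the standard library, attached to the exact logger named in the warning (calling `logging.basicConfig()` once at startup, which configures the root logger, would silence it just as well):

```python
import logging

# Attach a handler to the logger named in the warning so its records
# are actually emitted somewhere.
logger = logging.getLogger(
    "google.cloud.pubsub_v1.subscriber._protocol.streaming_pull_manager")
handler = logging.StreamHandler()
handler.setFormatter(
    logging.Formatter("%(name)s %(levelname)s %(message)s"))
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)
```

Once a handler exists, the streaming-pull manager's messages appear on stderr, which also makes subscriber-side problems (like the attribute-handling code here) much easier to debug.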