Is it possible to use Kafka with Google Cloud Dataflow?

Submitted by 南笙酒味 on 2019-12-25 05:35:09

Question


I have two questions:

1) I want to use Kafka in a Google Cloud Dataflow pipeline program. Is it possible to read data from Kafka in my pipeline?

2) I created an instance with BigQuery enabled; now I want to enable Pub/Sub. How can I do that?


Answer 1:


(1) As mentioned by Raghu, support for writing to and reading from Kafka was added to Apache Beam in mid-2016 with the KafkaIO package. You can check the package's documentation[1] to see how to use it.

(2) I'm not quite sure what you mean. Can you provide more details?

[1] https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/io/kafka/KafkaIO.html
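A minimal sketch of such a pipeline follows, based on the KafkaIO API linked above. The broker address, topic name, and trailing transform are illustrative placeholders, not details from the question:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Values;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KafkaReadExample {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    p.apply(KafkaIO.<String, String>read()
            .withBootstrapServers("broker-1:9092")      // placeholder broker address
            .withTopic("my-topic")                      // placeholder topic name
            .withKeyDeserializer(StringDeserializer.class)
            .withValueDeserializer(StringDeserializer.class)
            .withoutMetadata())                         // drop Kafka metadata, keep KV pairs
        .apply(Values.create());                        // keep only the message values

    p.run().waitUntilFinish();
  }
}
```

Running this against the Dataflow runner additionally requires the usual `--runner=DataflowRunner` and project/region pipeline options.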




Answer 2:


Kafka support was added to Dataflow (and Apache Beam) in mid-2016. You can read from and write to Kafka in streaming pipelines. See the JavaDoc for KafkaIO in Apache Beam.




Answer 3:


(2) As of April 27, 2015, you can enable Cloud Pub/Sub API as follows:

  1. Go to your project page on the Developer Console
  2. Click APIs & auth -> APIs
  3. Click More within Google Cloud APIs
  4. Click Cloud Pub/Sub API
  5. Click Enable API
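The console path above reflects the 2015-era Developer Console UI. The same result can be achieved from the gcloud CLI; the project ID below is a placeholder:

```shell
# Enable the Cloud Pub/Sub API for a project ("my-project" is a placeholder).
gcloud services enable pubsub.googleapis.com --project=my-project
```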


Source: https://stackoverflow.com/questions/29893342/is-it-possible-to-use-kafka-with-google-cloud-dataflow
