How to integrate Storm and Kafka [closed]


Question


I have worked with Storm and developed a basic program that uses a local text file as its input source. Now I have to work with streaming data arriving continuously from external systems. For this purpose, Kafka seems the best choice.

The problem is how to make my spout read streaming data from Kafka, i.e. how to integrate Storm with Kafka so that I can process the data coming from it.


Answer 1:


Look at KafkaSpout, which comes with the storm-kafka module.

It's a normal Storm spout implementation that reads from a Kafka cluster. All you need to do is configure the spout with parameters such as the list of brokers, the topic name, and so on. You can then chain its output to the corresponding bolts for further processing, as shown in the sketch after the configuration below.

From the KafkaSpout documentation, the configuration goes like this:

import com.google.common.collect.ImmutableList;
// The package names below assume the original storm-contrib storm-kafka module.
import storm.kafka.KafkaSpout;
import storm.kafka.SpoutConfig;

SpoutConfig spoutConfig = new SpoutConfig(
      ImmutableList.of("kafkahost1", "kafkahost2"), // list of Kafka brokers
      8,             // number of partitions per host
      "clicks",      // topic to read from
      "/kafkastorm", // root path in ZooKeeper where the spout stores consumer offsets
      "discovery");  // id of this consumer, used to store its offsets in ZooKeeper

KafkaSpout kafkaSpout = new KafkaSpout(spoutConfig);
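
Once the spout is built, you wire it into a topology like any other spout. Below is a minimal sketch, not part of the original answer: the bolt, the component names ("kafka-spout", "print-bolt"), the topology name, and the parallelism hints are all illustrative, and the backtype.storm packages assume a pre-1.0 Storm release matching the old storm-kafka API.

import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.topology.BasicOutputCollector;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.topology.base.BaseBasicBolt;
import backtype.storm.tuple.Tuple;
import com.google.common.collect.ImmutableList;
import storm.kafka.KafkaSpout;
import storm.kafka.SpoutConfig;

public class KafkaTopology {

    // A trivial terminal bolt that prints every message coming off the spout.
    public static class PrintBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            // Field 0 holds the deserialized Kafka message; its exact type
            // depends on the Scheme configured on the SpoutConfig.
            System.out.println(tuple.getValue(0));
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            // Terminal bolt: nothing is emitted downstream.
        }
    }

    public static void main(String[] args) {
        // Same spout configuration as shown above.
        SpoutConfig spoutConfig = new SpoutConfig(
                ImmutableList.of("kafkahost1", "kafkahost2"), 8,
                "clicks", "/kafkastorm", "discovery");
        KafkaSpout kafkaSpout = new KafkaSpout(spoutConfig);

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("kafka-spout", kafkaSpout, 4);  // parallelism hints are arbitrary
        builder.setBolt("print-bolt", new PrintBolt(), 2)
               .shuffleGrouping("kafka-spout");          // subscribe the bolt to the spout

        // Run in-process for testing; use StormSubmitter to deploy to a real cluster.
        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("kafka-topology", new Config(), builder.createTopology());
    }
}

From there, replacing PrintBolt with your own bolts gives you the processing pipeline the question asks about.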


Source: https://stackoverflow.com/questions/19782976/how-to-integrate-storm-and-kafka
