kafka-consumer-api

Join multiple Kafka topics by key

£可爱£侵袭症+ submitted on 2020-03-16 05:44:27
Question: How can I write a consumer that joins multiple Kafka topics in a scalable way? I have a topic that publishes events with a key, and a second topic that publishes other events, related to a subset of the first, with the same key. I would like to write a consumer that subscribes to both topics and performs some additional actions for the subset that appears in both topics. I can do this easily with a single consumer: read everything from both topics, maintain state locally, and perform the actions
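The single-consumer approach described above can be sketched as plain join-by-key logic. This is a minimal illustration, not the asker's code: the function names and the `(key, value)` record shape are assumptions standing in for what a real consumer subscribed to both topics would deliver.

```python
def join_by_key(records_a, records_b):
    """Join events from two topics by key using local state.

    records_a / records_b stand in for (key, value) pairs consumed
    from the first and second topic; shapes are illustrative.
    """
    state_a = {}
    joined = []
    # Buffer the first topic's events by key.
    for key, value in records_a:
        state_a[key] = value
    # Act only on second-topic events whose key was also seen
    # on the first topic.
    for key, value in records_b:
        if key in state_a:
            joined.append((key, state_a[key], value))
    return joined
```

To scale this beyond one consumer, the usual requirement is that both topics are partitioned by the same key with the same partition count (co-partitioning), so that one group member sees all events for a given key; Kafka Streams joins rely on the same property.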

reading only specific messages from kafka topic

♀尐吖头ヾ submitted on 2020-03-05 04:56:06
Question: Scenario: I am writing JSON object data into a Kafka topic, and while reading I want to read only a specific set of messages, based on a value present in the message. I am using the kafka-python library. Sample messages: {"flow_status": "completed", "value": 1, "active": "yes"} {"flow_status": "failure", "value": 2, "active": "yes"} Here I want to read only the messages having flow_status as completed. Answer 1: In Kafka it is not possible to do something like that. The consumer consumes messages one by one, one after the
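Since Kafka offers no server-side filtering on message content, the practical pattern is to deserialize each record and filter client-side. A minimal sketch of that filter, with `raw_messages` standing in for the record values a `KafkaConsumer` loop would yield:

```python
import json

def completed_only(raw_messages):
    """Yield only messages whose flow_status is 'completed'.

    raw_messages stands in for the JSON string/bytes values read
    from the topic; filtering happens entirely on the client.
    """
    for raw in raw_messages:
        msg = json.loads(raw)
        if msg.get("flow_status") == "completed":
            yield msg
```

In a real kafka-python loop you would apply the same predicate to `record.value` for each record returned by `poll()`, skipping (but still consuming) the non-matching messages.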

Kafka Java Consumer already closed

落花浮王杯 submitted on 2020-03-02 16:54:12
Question: I have just started using Kafka, and I am facing a small issue with the consumer. I have written a consumer in Java, and I get this exception: IllegalStateException: This consumer has already been closed. I get the exception on the following line: ConsumerRecords<String, String> consumerRecords = consumer.poll(1000); This started happening after my consumer crashed with some exception; when I tried running it again, it gave me this exception. Here is the complete code: package StreamApplicationsTest;
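The underlying rule is that a Kafka consumer instance is unusable after `close()`: any subsequent `poll()` raises IllegalStateException, so after a crash path that closed the consumer you must construct a fresh instance rather than reuse the old one. A small stand-in class (not the real Java KafkaConsumer) demonstrating the lifecycle and the fix:

```python
class StubConsumer:
    """Hypothetical stand-in for a Kafka consumer's close-then-poll
    rule; the real class in the question is Java's KafkaConsumer."""
    def __init__(self):
        self.closed = False

    def poll(self, timeout_ms):
        if self.closed:
            # Mirrors Java's IllegalStateException on a closed consumer.
            raise RuntimeError("This consumer has already been closed.")
        return []

    def close(self):
        self.closed = True

old = StubConsumer()
old.close()                 # e.g. a finally/shutdown path ran after a crash
error = None
try:
    old.poll(1000)          # polling the closed instance fails
except RuntimeError as exc:
    error = exc

fresh = StubConsumer()      # the fix: build a new consumer instance
records = fresh.poll(1000)
```

In the Java code this usually means the `close()` call (often in a `finally` block or shutdown hook) executed before the retry loop polled again; create the `KafkaConsumer` anew for each run instead of reusing the closed object.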

How to pass a dynamic topic name to @KafkaListener(topics) from an environment variable

烂漫一生 submitted on 2020-02-25 09:38:32
Question: I am writing a Kafka consumer. I want to pass an environment-variable topic name to @KafkaListener(topics = topic). import org.springframework.beans.factory.annotation.Autowired; import org.springframework.kafka.annotation.KafkaListener; import org.springframework.stereotype.Service; @Service public class KafkaConsumer { @Autowired private EnvProperties envProperties; private final String topic = envProperties.getTopic(); @KafkaListener(topics = "#{'${envProperties.getTopic()}'}", groupId =
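Two things in the snippet above cannot work as written: the field initializer `envProperties.getTopic()` runs before `@Autowired` injection (so `envProperties` is still null), and `${envProperties.getTopic()}` is a property placeholder, not a method call. The `topics` attribute of `@KafkaListener` does accept property placeholders, so one option is to read the topic straight from configuration. A sketch, assuming a hypothetical property `kafka.topic` (which Spring Boot can bind from an environment variable such as `KAFKA_TOPIC`):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaConsumer {

    // Topic and group id resolved from configuration at startup;
    // "kafka.topic" and "kafka.group-id" are illustrative names.
    @KafkaListener(topics = "${kafka.topic}", groupId = "${kafka.group-id}")
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}
```

Alternatively, SpEL can reference a bean directly, e.g. `topics = "#{envProperties.topic}"`, provided `envProperties` is the bean name and exposes a getter; either way, avoid reading injected fields in field initializers.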

kafka consumer seek is not working: AssertionError: Unassigned partition

杀马特。学长 韩版系。学妹 submitted on 2020-02-25 05:47:07
Question: The Kafka consumer con defined below works perfectly fine when I try to receive messages from my topic; however, it gives me trouble when I try to change the offset using the seek method or any of its variations, i.e. seek_to_beginning, seek_to_end. from kafka import KafkaConsumer, TopicPartition con = KafkaConsumer(my_topic, bootstrap_servers = my_bootstrapservers, group_id = my_groupid) p = con.partitions_for_topic(my_topic) my_partition = p.pop() tp = TopicPartition(topic = my_topic,
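In kafka-python, `seek()` asserts that the partition is already assigned to the consumer; with group subscription, assignment only happens lazily during `poll()`, so seeking right after construction raises "AssertionError: Unassigned partition". The usual fix is to `assign()` the partition explicitly before seeking (or to poll once so the group assignment completes). A stand-in class capturing that invariant, since a live broker is not available here:

```python
class StubSeekConsumer:
    """Hypothetical stand-in for kafka-python's rule: seek() only
    works on partitions that have been assigned, either via assign()
    or lazily after the first poll() when using subscribe()."""
    def __init__(self):
        self.assigned = set()

    def assign(self, partitions):
        self.assigned.update(partitions)

    def seek(self, partition, offset):
        # Mirrors kafka-python's AssertionError on unassigned partitions.
        assert partition in self.assigned, "Unassigned partition"
        return offset

tp = ("my_topic", 0)       # stand-in for TopicPartition(my_topic, 0)
con = StubSeekConsumer()
con.assign([tp])           # the fix: assign the partition before seeking
pos = con.seek(tp, 0)
```

With the real library, construct `KafkaConsumer()` without a topic argument, call `con.assign([TopicPartition(my_topic, my_partition)])`, and then `seek`/`seek_to_beginning` work; if you keep `subscribe`-style group membership instead, call `con.poll()` once before seeking so the assignment exists.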

Kafka Consumer in play framework using java

安稳与你 submitted on 2020-02-16 10:43:11
Question: I have searched thousands of sites for a Kafka consumer example in the Play Framework using Java, but I was not able to find any. Can anyone provide details on how to write a service which continuously consumes topics produced by Kafka? Thanks. Answer 1: Play is a web framework; the underlying actor system relies on Akka. The Akka Kafka API is called Alpakka, so I suspect you are searching with the wrong keywords. To combine both Akka and Play with Kafka, you can even use the Lagom Framework. Otherwise

Unable to set kafka spark consumer configs

久未见 submitted on 2020-02-16 06:47:19
Question: I am using spark-sql 2.4.x with the Kafka client. Even after setting the consumer configuration parameters max.partition.fetch.bytes and max.poll.records, they are not applied and the default values are shown, as below: Dataset<Row> df = sparkSession .readStream() .format("kafka") .option("kafka.bootstrap.servers", server1) .option("subscribe", TOPIC1) .option("includeTimestamp", true) .option("startingOffsets", "latest") .option("max.partition.fetch.bytes", "2097152") // default 1000
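The Spark Structured Streaming Kafka source only forwards consumer properties that carry the `kafka.` prefix in the option name; unprefixed Kafka settings like the two above are ignored, which is why the defaults still show. A corrected fragment of the reader above under that convention (variable names as in the question):

```java
Dataset<Row> df = sparkSession
    .readStream()
    .format("kafka")
    .option("kafka.bootstrap.servers", server1)
    .option("subscribe", TOPIC1)
    .option("includeTimestamp", true)
    .option("startingOffsets", "latest")
    // Kafka consumer properties must carry the "kafka." prefix,
    // otherwise the source silently drops them.
    .option("kafka.max.partition.fetch.bytes", "2097152")
    .option("kafka.max.poll.records", "1000")
    .load();
```

Note that a few consumer settings (e.g. group.id, enable.auto.commit, the deserializers) are managed by Spark itself and cannot be overridden this way.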

Does a kafka consumer machine need to run zookeeper?

无人久伴 submitted on 2020-02-05 04:23:05
Question: So my question is this: if I have a server running Kafka (and ZooKeeper), and another machine that is only consuming messages, does the consumer machine need to run ZooKeeper too, or does the server take care of it all? Answer 1: No. The role of ZooKeeper in Kafka is: broker registration (cluster membership), with a heartbeat mechanism to keep the list current; storing topic configuration: which topics exist, how many partitions each has, where the replicas are, who the preferred leader is, the list of ISRs for