kafka-consumer-api

How to specify consumer group in Kafka Spark Streaming using direct stream

时光总嘲笑我的痴心妄想 · submitted 2020-01-12 08:38:10
Question: How do I specify a consumer group id for Kafka Spark Streaming when using the direct stream API?

```java
HashMap<String, String> kafkaParams = new HashMap<String, String>();
kafkaParams.put("metadata.broker.list", brokers);
kafkaParams.put("auto.offset.reset", "largest");
kafkaParams.put("group.id", "app1");

JavaPairInputDStream<String, String> messages = KafkaUtils.createDirectStream(
        jssc,
        String.class, String.class,
        StringDecoder.class, StringDecoder.class,
        kafkaParams, topicsSet);
```

Though I have specified the …
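For context: the 0.8-style direct stream shown above uses Kafka's simple consumer API, so group.id is not used for partition assignment or offset tracking there. A minimal sketch of the spark-streaming-kafka-0-10 integration, where group.id is honored (brokers, jssc, and topicsSet are carried over from the question):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.spark.streaming.api.java.JavaInputDStream;
import org.apache.spark.streaming.kafka010.ConsumerStrategies;
import org.apache.spark.streaming.kafka010.KafkaUtils;
import org.apache.spark.streaming.kafka010.LocationStrategies;

// ...
Map<String, Object> kafkaParams = new HashMap<>();
kafkaParams.put("bootstrap.servers", brokers);
kafkaParams.put("key.deserializer", StringDeserializer.class);
kafkaParams.put("value.deserializer", StringDeserializer.class);
kafkaParams.put("group.id", "app1"); // honored by the new consumer API
kafkaParams.put("auto.offset.reset", "latest"); // "largest" is the old 0.8 spelling

JavaInputDStream<ConsumerRecord<String, String>> stream = KafkaUtils.createDirectStream(
        jssc,
        LocationStrategies.PreferConsistent(),
        ConsumerStrategies.<String, String>Subscribe(topicsSet, kafkaParams));
```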

How to create a separate Kafka listener for each topic dynamically in Spring Boot?

百般思念 · submitted 2020-01-12 07:39:29
Question: I am new to Spring and Kafka. I am working on a use case (using Spring Boot + Kafka) in which users are allowed to create Kafka topics at runtime, and the Spring application is expected to subscribe to these topics programmatically at runtime. What I know so far is that Kafka listeners are configured at design time, so topics need to be specified before startup. Is there a way to dynamically subscribe to Kafka topics in the Spring Boot Kafka integration? I referred to this: https://github.com/spring-projects/spring…
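One approach worth sketching (assuming a recent spring-kafka version; consumerFactory and topicName are illustrative names, not from the question): spring-kafka lets you build and start a listener container programmatically, so a container can be created whenever a new topic appears.

```java
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.listener.ContainerProperties;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;
import org.springframework.kafka.listener.MessageListener;

public class DynamicListenerFactory {

    // Creates and starts a listener container for a topic chosen at runtime.
    public KafkaMessageListenerContainer<String, String> startListener(
            ConsumerFactory<String, String> consumerFactory, String topicName) {
        ContainerProperties props = new ContainerProperties(topicName);
        props.setMessageListener((MessageListener<String, String>) rec ->
                System.out.println("Received from " + rec.topic() + ": " + rec.value()));
        KafkaMessageListenerContainer<String, String> container =
                new KafkaMessageListenerContainer<>(consumerFactory, props);
        container.start();
        return container; // keep a reference so the container can be stopped later
    }
}
```

Alternatively, @KafkaListener(topicPattern = "...") subscribes by regex, and pattern subscriptions pick up newly created matching topics on the consumer's next metadata refresh.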

How to specify a Java generic class dynamically

妖精的绣舞 · submitted 2020-01-11 11:58:34
Question: If I write a method that returns a generic class, how can I specify the type parameters of that generic class dynamically? For example:

```java
try {
    Class c = Class.forName(keytype);
    Class d = Class.forName(valuetype);
    KafkaConsumer<c, d> consumer = new KafkaConsumer<c, d>(PropertiesUtil.getPropsObj(configPath));
    return consumer;
} catch (ClassNotFoundException e) {
    e.printStackTrace();
}
```

But the code above does not compile. How can I achieve this?

Answer 1: Generic syntax is good at compile…
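To sketch where the answer is heading: Java generics are erased at compile time, so a Class object loaded at runtime cannot be used as a type argument. A common workaround (a sketch, reusing the PropertiesUtil and configPath from the question) is to make the factory method itself generic and pass runtime Deserializer instances, which carry the type information:

```java
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.Deserializer;

public class ConsumerFactory {

    // The caller picks K and V; the deserializers supply them at runtime.
    static <K, V> KafkaConsumer<K, V> createConsumer(String configPath,
                                                     Deserializer<K> keyDeserializer,
                                                     Deserializer<V> valueDeserializer) {
        return new KafkaConsumer<>(PropertiesUtil.getPropsObj(configPath),
                keyDeserializer, valueDeserializer);
    }
}
```

A caller would then write, for example, createConsumer(configPath, new StringDeserializer(), new LongDeserializer()) to get a KafkaConsumer<String, Long>.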

Kafka configuration min.insync.replicas not working

∥☆過路亽.° · submitted 2020-01-06 08:11:40
Question: It's early days for me in learning Kafka, and I am checking out every Kafka property and concept on my local machine. I came across the property min.insync.replicas, and here is my understanding; please correct me if I've misunderstood anything. Once a message is sent to a topic, the message must be written to at least min.insync.replicas number of followers. min.insync.replicas also includes the leader. If the number of available live brokers (indirectly, in-sync replicas) is less than the specified…
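One detail worth adding when testing this locally: min.insync.replicas is only enforced for producers that use acks=all. A sketch (the broker address and topic name are assumptions):

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class MinIsrDemo {
    public static void main(String[] args) {
        // With a topic created at replication.factor=3 and min.insync.replicas=2,
        // this send fails with a NotEnoughReplicas error once fewer than 2
        // replicas (leader included) are in sync. With acks=1 or acks=0,
        // min.insync.replicas is not checked at all.
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("acks", "all");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("test-topic", "key", "value"));
        }
    }
}
```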

Connect CometD client with Kafka producer

对着背影说爱祢 · submitted 2020-01-06 03:18:14
Question: Is it possible to connect a CometD client with a Kafka producer? Any suggestions? Currently I have a CometD client in Python which extracts data in real time from a Salesforce object. Now I want to push that data into a Kafka producer. Is it possible to do that, and how?

Answer 1: Solved. By using https://github.com/dkmadigan/python-bayeux-client to extract the events from Salesforce, I was able to push them into the Kafka broker.

Source: https://stackoverflow.com/questions/50615641/connect-cometd-client

Why doesn't offset get updated when messages are consumed in Kafka

家住魔仙堡 · submitted 2020-01-05 07:34:19
Question: I am implementing a Kafka consumer class to receive messages, and I want to get only the new messages each time. Therefore I set enable.auto.commit to true. However, the offset does not seem to change at all, even though the topic, consumer group, and partition have always been the same. Here is my consumer code:

```java
consumerConfig.put("bootstrap.servers", bootstrap);
consumerConfig.put("group.id", KafkaTestConstants.KAFKA_GROUP);
consumerConfig.put("enable.auto.commit", "true");
consumerConfig.put("auto…
```
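A point that often explains this symptom: auto-commit runs inside poll(), so a consumer that polls once and exits without a clean close() may never commit anything, and the next run re-reads the same records. A sketch reusing the consumerConfig above (the topic name is an assumption); close() commits the last polled offsets when auto-commit is enabled:

```java
import java.time.Duration;
import java.util.Collections;

import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

// ...
try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerConfig)) {
    consumer.subscribe(Collections.singletonList("test-topic"));
    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
    records.forEach(r -> System.out.printf("offset=%d value=%s%n", r.offset(), r.value()));
} // try-with-resources calls close(), which commits offsets under auto-commit
```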

Why does Kafka Consumer keep receiving the same messages (offset)

守給你的承諾、 · submitted 2020-01-04 06:09:46
Question: I have a SOAP web service that sends a Kafka request message and waits for a Kafka response message (e.g. consumer.poll(10000)). Each time the web service is called, it creates a new Kafka producer and a new Kafka consumer, and every time I call the web service the consumer receives the same messages (i.e. messages with the same offset). I am using Kafka 0.9 and have auto-commit enabled with an auto-commit frequency of 100 ms. For each ConsumerRecord returned by the poll() method, I process within…
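For a short-lived consumer like this, the 100 ms auto-commit timer may never get a chance to fire before the instance is discarded. A sketch of committing explicitly instead (the topic name and the process() handler are hypothetical):

```java
import java.util.Collections;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

// ...
consumerConfig.put("enable.auto.commit", "false");
try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consumerConfig)) {
    consumer.subscribe(Collections.singletonList("response-topic"));
    ConsumerRecords<String, String> records = consumer.poll(10000); // 0.9-era signature
    for (ConsumerRecord<String, String> record : records) {
        process(record); // hypothetical handler
    }
    consumer.commitSync(); // persist offsets before this short-lived consumer goes away
}
```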

How does Consumer.endOffsets work in Kafka?

无人久伴 · submitted 2020-01-03 17:17:36
Question: Assume I have a timer task running indefinitely which iterates over all the consumer groups in the Kafka cluster and outputs the lag, committed offset, and end offset for every partition of each group, similar to how the Kafka console consumer group script works, except that it covers all groups. Something like:

Single consumer - not working - doesn't return offsets for some of the provided topic partitions (e.g. 10 provided, 5 offsets returned):

```java
Consumer consumer;
static {
    consumer = createConsumer();
}
…
```
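For reference, endOffsets returns the log-end offset for each requested partition in one call; combined with the committed offsets it yields the lag. A sketch (the partitions collection is an assumption):

```java
import java.util.Collection;
import java.util.Map;

import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

// ...
static void printLag(Consumer<?, ?> consumer, Collection<TopicPartition> partitions) {
    Map<TopicPartition, Long> ends = consumer.endOffsets(partitions);
    for (TopicPartition tp : partitions) {
        OffsetAndMetadata committed = consumer.committed(tp); // null if the group never committed
        long end = ends.get(tp);
        long lag = committed == null ? end : end - committed.offset();
        System.out.printf("%s committed=%s end=%d lag=%d%n", tp, committed, end, lag);
    }
}
```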

kafka-console-consumer custom deserializer

99封情书 · submitted 2020-01-03 17:01:33
Question: I would like to use my custom value.deserializer with the kafka-console-consumer command-line tool, something like this:

```
./kafka-console-consumer --bootstrap-server kafka2:29092 \
    --property value.deserializer=My.Custom.KafkaDeserializer \
    --topic TEST
```

But it is unable to find my custom class:

```
Exception in thread "main" java.lang.ClassNotFoundException: My.Custom.KafkaDeserializer
```

How can I reference the appropriate jar file so that the script will recognize it?

Answer 1: As already said…
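A sketch of the usual fix (the jar path is an assumption): the kafka-run-class.sh launcher that the console tools delegate to preserves a pre-set CLASSPATH, so exporting the jar before invoking the tool makes the class visible:

```
export CLASSPATH=/path/to/my-deserializer.jar   # assumption: location of the custom jar
./kafka-console-consumer --bootstrap-server kafka2:29092 \
    --property value.deserializer=My.Custom.KafkaDeserializer \
    --topic TEST
```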

Spring Boot Kafka consumer - how to properly consume Kafka messages from Spring Boot

牧云@^-^@ · submitted 2020-01-03 01:55:52
Question: I'm developing a Spring Boot application which is supposed to consume Kafka messages, and I'm seeing a strange outcome: when I send messages using kafka-console-producer.sh, my consumer only detects and prints every other message. For example, in the Kafka console producer I would type "one" -> Enter -> "two" -> Enter -> "three" -> Enter, and in my Spring Boot consumer I would only see "two", "four", etc.

My ConsumeConfigFactory.java:

```java
import java.util.Properties;
import javax.annotation…
```
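For comparison (a sketch only, with the topic and group names assumed), the idiomatic spring-kafka consumer avoids a hand-rolled polling loop entirely:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class DemoConsumer {

    // spring-kafka manages the poll loop; with a single member in the group,
    // every record on the topic is delivered to this method.
    @KafkaListener(topics = "test-topic", groupId = "demo-group")
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}
```

One common cause of an every-other-message symptom is a second consumer running in the same group and owning half the partitions, so checking for a stray extra consumer instance is worthwhile before changing the code.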