kafka-consumer-api

Kafka Streams filtering: broker or consumer side?

Submitted by 故事扮演 on 2019-12-01 11:40:01
Question: I am looking into Kafka Streams. I want to filter my stream using a filter with very low selectivity (one in a few thousand). I was looking at this method: https://kafka.apache.org/0100/javadoc/org/apache/kafka/streams/kstream/KStream.html#filter(org.apache.kafka.streams.kstream.Predicate) But I can't find any evidence of whether the filter will be evaluated by the consumer (I really do not want to transfer a lot of GB to the consumer just to throw them away) or inside the broker (yay!). If it's evaluated …
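The filter is evaluated on the consumer side: Kafka Streams is a client library, and brokers never execute user predicates, so every record is fetched over the network before the predicate runs. A minimal sketch, using the current StreamsBuilder API (the 0.10.0 javadoc linked above used KStreamBuilder, but filter behaves the same; the topic names and the predicate here are made up):

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class FilterApp {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "filter-app");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> source = builder.stream("records-in");
            // The predicate runs here, inside the application process (the
            // consumer side): every record crosses the network before it is
            // evaluated, and non-matching records are simply dropped locally.
            source.filter((key, value) -> value.contains("rare-marker"))
                  .to("records-filtered");

            new KafkaStreams(builder.build(), props).start();
        }
    }

So with low selectivity the full volume is still transferred; the usual mitigations are to run the filtering application close to the brokers or to filter upstream at the producer.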

Spring Kafka consumer: seek offset at runtime?

Submitted by 依然范特西╮ on 2019-12-01 05:37:04
I am using the KafkaMessageListenerContainer to consume from a Kafka topic. I have application logic to process each record, which depends on other microservices as well. I am now manually committing the offset after each record is processed. But if the application logic fails, I need to seek back to the failed offset and keep processing it until it succeeds. For that I need to do a runtime manual seek to the last offset. Is this possible with the KafkaMessageListenerContainer yet? See Seeking to a Specific Offset. In order to seek, your listener must implement ConsumerSeekAware …
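A minimal sketch of the ConsumerSeekAware approach the answer points at (the topic name and the process() business logic are hypothetical; it assumes a spring-kafka version where the interface has these three callbacks):

    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.common.TopicPartition;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.listener.ConsumerSeekAware;

    public class SeekingListener implements ConsumerSeekAware {

        // Each consumer thread registers its own callback, so keep it thread-local.
        private final ThreadLocal<ConsumerSeekCallback> callback = new ThreadLocal<>();

        @Override
        public void registerSeekCallback(ConsumerSeekCallback cb) {
            callback.set(cb);
        }

        @Override
        public void onPartitionsAssigned(Map<TopicPartition, Long> assignments, ConsumerSeekCallback cb) { }

        @Override
        public void onIdleContainer(Map<TopicPartition, Long> assignments, ConsumerSeekCallback cb) { }

        @KafkaListener(topics = "orders") // hypothetical topic
        public void onMessage(ConsumerRecord<String, String> record) {
            try {
                process(record); // hypothetical business logic
            } catch (Exception e) {
                // Re-position the consumer so the same record is delivered
                // again on the next poll instead of being skipped.
                callback.get().seek(record.topic(), record.partition(), record.offset());
            }
        }

        private void process(ConsumerRecord<String, String> record) { /* ... */ }
    }

Because the seek callback is registered per consumer thread, calling seek() from the listener method re-delivers the failed record on the next poll rather than advancing past it.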

Docker Kafka w/ Python consumer

Submitted by 雨燕双飞 on 2019-12-01 05:32:36
Question: I am using dockerized Kafka and have written a Kafka consumer program. It works perfectly when I run Kafka in Docker and the application on my local machine. But when I configured the local application in Docker as well, I started facing issues. The issue may be due to the topic not being created by the time the application starts. docker-compose.yml:

    version: '3'
    services:
      zookeeper:
        image: wurstmeister/zookeeper
        ports:
          - "2181:2181"
      kafka:
        image: wurstmeister/kafka
        ports:
          - "9092:9092"
        environment:
          KAFKA_ADVERTISED_HOST …
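One common workaround is to make sure the topic exists before the consumer container starts: the wurstmeister/kafka image supports a KAFKA_CREATE_TOPICS environment variable for this, or the application can create the topic itself at startup. Below is a hedged sketch using the Java AdminClient (kafka-clients 0.11+); kafka-python offers a similar KafkaAdminClient. The service name, topic name, and partition/replication counts are assumptions:

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class EnsureTopic {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // "kafka:9092" assumes the compose service name is resolvable
            // from the consumer's container (same compose network).
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
            try (AdminClient admin = AdminClient.create(props)) {
                NewTopic topic = new NewTopic("my-topic", 1, (short) 1);
                // Blocks until the broker acknowledges; fails with a wrapped
                // TopicExistsException if the topic already exists.
                admin.createTopics(Collections.singletonList(topic)).all().get();
            }
        }
    }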

Kafka Streams does not increment offset by 1 when producing to topic

Submitted by 帅比萌擦擦* on 2019-12-01 05:18:36
Question: I have implemented a simple Kafka dead-letter record processor. It works perfectly when using records produced from the console producer. However, I find that our Kafka Streams applications do not guarantee that, when producing records to the sink topics, the offsets will be incremented by 1 for each record produced. Dead-letter processor background: I have a scenario where records may be received before all of the data required to process them has been published. When records are not matched for processing by …
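One known cause: when the producing application runs with exactly-once semantics, the transactional commit/abort markers each occupy an offset in the partition, so consumer-visible offsets legitimately skip. A dead-letter processor should therefore track the last offset it actually saw per partition instead of assuming +1 increments. A hedged sketch (class and method names are made up):

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.common.TopicPartition;

    public class GapTolerantTracker {
        private final Map<TopicPartition, Long> lastSeen = new HashMap<>();

        public void track(ConsumerRecord<?, ?> record) {
            TopicPartition tp = new TopicPartition(record.topic(), record.partition());
            Long previous = lastSeen.put(tp, record.offset());
            if (previous != null && record.offset() > previous + 1) {
                // An offset gap on a topic written transactionally usually
                // marks a commit/abort marker or aborted records, not data
                // loss, so don't treat it as a missing message.
            }
        }
    }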

Kafka Consumer does not receive messages

Submitted by 折月煮酒 on 2019-12-01 00:20:15
I am a newbie with Kafka. I read many instructions on the Internet for building a Kafka producer and a Kafka consumer. I completed the former successfully and it can send messages to the Kafka cluster, but I did not complete the latter. Please kindly help me solve this problem. My problem resembles some posts on StackOverflow, but I want to describe it more clearly. I run Kafka and ZooKeeper on an Ubuntu server in VirtualBox, using the simplest configuration (almost all defaults) with 1 Kafka broker and 1 ZooKeeper instance. 1. When I use the Kafka command line for the producer and consumer, like: * Case 1: …
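For reference, a minimal working Java consumer looks like the sketch below (it assumes kafka-clients 2.0+ for poll(Duration); the VM address and topic name are placeholders). A common culprit in VirtualBox setups is the broker's advertised.listeners setting: it must advertise an address the consumer's machine can actually reach, otherwise the consumer often connects to the bootstrap server and then receives nothing:

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class SimpleConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Placeholder: must be an address reachable from the host,
            // matching the broker's advertised.listeners.
            props.put("bootstrap.servers", "192.168.56.101:9092");
            props.put("group.id", "test-group");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("test"));
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                    }
                }
            }
        }
    }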

Error reading field 'topics': java.nio.BufferUnderflowException in Kafka

Submitted by [亡魂溺海] on 2019-11-30 23:58:00
I am using a Kafka 0.9.0 client to consume messages from two brokers which are running on a remote system. My producer is working fine and is able to send messages to the broker, but my consumer is not able to consume these messages. The consumer and producer are running on my local system and the two brokers are on AWS. Whenever I try to run the consumer, the following error appears in the broker logs: ERROR Closing socket for /122.172.17.81 because of error (kafka.network.Processor) org.apache.kafka.common.protocol.types.SchemaException: Error reading field 'topics': java.nio.BufferUnderflowException at org.apache.kafka.common …

Kafka consumer to dynamically detect added topics

Submitted by 孤街浪徒 on 2019-11-30 22:30:55
I'm using KafkaConsumer to consume messages from a Kafka server (topics). It works fine for topics created before starting the consumer code. But the problem is that it does not work for topics created dynamically (I mean to say, after the consumer code has started), even though the API says it supports dynamic topic creation. Here is the link for your reference. Kafka version used: 0.9.0.1 https://kafka.apache.org/090/javadoc/index.html?org/apache/kafka/clients/consumer/KafkaConsumer.html Here is the Java code:

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props …
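The consumer only notices newly created topics on a metadata refresh, which is governed by metadata.max.age.ms (default 300000 ms), and topics appearing under new names require a pattern (regex) subscription rather than a fixed topic list. A hedged sketch against the 0.9-era API (the group id, pattern, and refresh interval are made up):

    import java.util.Collection;
    import java.util.Properties;
    import java.util.regex.Pattern;
    import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.TopicPartition;

    public class DynamicTopicsConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "dynamic-group");
            // How often metadata is refreshed and new matching topics are
            // discovered; lowered here from the 5-minute default.
            props.put("metadata.max.age.ms", "5000");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
            consumer.subscribe(Pattern.compile("my-topic-.*"), new ConsumerRebalanceListener() {
                @Override public void onPartitionsRevoked(Collection<TopicPartition> partitions) { }
                @Override public void onPartitionsAssigned(Collection<TopicPartition> partitions) { }
            });
            while (true) {
                for (ConsumerRecord<String, String> record : consumer.poll(1000)) {
                    System.out.println(record.topic() + ": " + record.value());
                }
            }
        }
    }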

Kafka consumer: commitSync vs commitAsync

Submitted by 余生颓废 on 2019-11-30 21:49:50
A quote from https://www.safaribooksonline.com/library/view/kafka-the-definitive/9781491936153/ch04.html#callout_kafka_consumers__reading_data_from_kafka_CO2-1 : "The drawback is that while commitSync() will retry the commit until it either succeeds or encounters a non-retriable failure, commitAsync() will not retry." This phrase is not clear to me. I suppose that the consumer sends a commit request to the broker, and if the broker doesn't respond within some timeout, the commit has failed. Am I wrong? Can you clarify the difference between commitSync and commitAsync in detail? Also, please …
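The distinction is about the client-side retry policy, not about how failure is detected. commitSync() blocks and internally retries on retriable errors (e.g. a coordinator change) until it succeeds or hits a non-retriable one; commitAsync() sends the request, returns immediately, and reports any failure to an optional callback without retrying, because a delayed retry could overwrite a newer commit with an older offset. A common combination, sketched below (the consumer, the running flag, and process() are assumed to already exist; the book discusses a similar pattern):

    try {
        while (running) {
            // poll(long) is the pre-2.0 signature; newer clients use poll(Duration).
            ConsumerRecords<String, String> records = consumer.poll(100);
            for (ConsumerRecord<String, String> record : records) {
                process(record); // hypothetical handler
            }
            // Fire-and-forget in the hot loop: failures reach the callback
            // but are NOT retried by the client.
            consumer.commitAsync((offsets, exception) -> {
                if (exception != null) {
                    System.err.println("Async commit failed: " + exception);
                }
            });
        }
    } finally {
        try {
            // Blocking on shutdown: retried until success or a fatal error,
            // so the final position is not lost.
            consumer.commitSync();
        } finally {
            consumer.close();
        }
    }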

How to make a Kafka consumer read from the last consumed offset, not from the beginning

Submitted by 这一生的挚爱 on 2019-11-30 18:56:47
I am new to Kafka and am trying to understand whether there is a way to read messages from the last consumed offset, but not from the beginning. I am writing up an example case so that my intention does not get misread. E.g.: 1) I produced 5 messages at 7:00 PM and the console consumer consumed them. 2) I stopped the consumer at 7:10 PM. 3) I produced 10 messages at 7:20 PM. No consumer had read those messages. 4) Now, I started the console consumer at 7:30 PM, without --from-beginning. 5) Now, it will read the messages produced after it started, not the earlier ones produced at 7:20 PM. Is there a way to …
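This is what committed offsets in a consumer group are for: the behavior described happens because a consumer without a committed offset starts from its auto.offset.reset position, which is "latest" by default. With a stable group.id, the broker remembers the last committed position per partition, and a restarted consumer resumes from there, so the 7:20 PM messages would be delivered at 7:30 PM. A hedged configuration sketch (the group name is made up):

    import java.util.Properties;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    // The stable group.id is the key: committed offsets are stored per
    // (group, topic, partition), so a restart resumes where it left off.
    props.put("group.id", "my-app");
    // Only consulted when the group has NO committed offset yet (or the
    // committed offset has been removed by retention).
    props.put("auto.offset.reset", "earliest");
    props.put("enable.auto.commit", "true");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);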

Kafka consumer list

Submitted by 孤者浪人 on 2019-11-30 12:37:13
Question: I need to find a way to ask Kafka for a list of topics. I know I can do that using the kafka-topics.sh script included in the bin/ directory. Once I have this list, I need all the consumers per topic. I could not find a script in that directory, nor a class in the kafka-consumer-api library, that allows me to do it. The reason behind this is that I need to figure out the difference between the topic's offsets and the consumers' offsets. Is there a way to achieve this? Or do I need to …
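The per-group offsets and lag can be read with the kafka-consumer-groups.sh script (--list, then --describe --group <name>, whose output includes CURRENT-OFFSET, LOG-END-OFFSET, and LAG columns), or programmatically through the AdminClient. A sketch assuming kafka-clients 2.0+ (the group name is made up):

    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;

    public class ListConsumers {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            try (AdminClient admin = AdminClient.create(props)) {
                // All topics in the cluster (what kafka-topics.sh --list prints)
                admin.listTopics().names().get().forEach(System.out::println);
                // All consumer groups known to the cluster
                admin.listConsumerGroups().all().get()
                     .forEach(g -> System.out.println(g.groupId()));
                // Committed offsets for one group; comparing these with each
                // partition's end offset gives the lag.
                admin.listConsumerGroupOffsets("my-group")
                     .partitionsToOffsetAndMetadata().get()
                     .forEach((tp, om) -> System.out.println(tp + " -> " + om.offset()));
            }
        }
    }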