kafka-consumer-api

Is the Kafka consumer 0.10 backwards compatible?

Submitted by 五迷三道 on 2019-12-22 18:04:21
Question: Is the Kafka consumer 0.10 compatible with a 0.9 broker? If I'm not mistaken, the 0.9 consumer is still considered beta, whereas 0.10 is stable, right? That's why I'm interested in using the 0.10 version, but my broker version is 0.9 and I wouldn't like to upgrade that yet. Answer 1: If you want to use 0.10 clients, you need to upgrade your cluster to 0.10. Kafka is backward compatible with regard to clients, but not forward compatible. That is, a 0.9 client can use a 0.10 cluster, but a 0.10 client cannot use a 0.9 cluster.

How to get all the messages in a topic from the Kafka server

Submitted by 邮差的信 on 2019-12-22 10:11:12
Question: I would like to get all the messages from the beginning of a topic on the server. Ex: bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic testTopic --from-beginning When using the above console command, I would like to be able to get all messages in a topic from the beginning, but I couldn't consume all the messages in a topic from the beginning using Java code. Answer 1: The easiest way would be to start a consumer and drain all the messages. Now I don't know how many partitions you have in your
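
A minimal Java sketch of one way to do this with the new consumer API, assuming the broker is reachable at localhost:9092 and the topic is testTopic (adjust both as needed): use a group.id that has no committed offsets together with auto.offset.reset=earliest, so the consumer starts at the beginning of every partition.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");                  // assumed broker address
props.put("group.id", "replay-" + System.currentTimeMillis());     // fresh group => no stored offsets
props.put("auto.offset.reset", "earliest");                        // start from the beginning
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Collections.singletonList("testTopic"));
while (true) {
    ConsumerRecords<String, String> records = consumer.poll(1000);
    for (ConsumerRecord<String, String> record : records) {
        System.out.printf("partition=%d offset=%d value=%s%n",
                record.partition(), record.offset(), record.value());
    }
}

Alternatively, an existing group can be rewound after the first assignment with consumer.seekToBeginning(...).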

Kafka pattern subscription. Rebalancing is not being triggered on new topic

Submitted by ℡╲_俬逩灬. on 2019-12-22 06:47:23
Question: According to the documentation in the Kafka javadocs, if I (1) subscribe to a pattern and (2) create a topic that matches the pattern, a rebalance should occur, which makes the consumer read from that new topic. But that's not happening. If I stop and start the consumer, it does pick up the new topic, so I know the new topic matches the pattern. There's a possible duplicate of this question at https://stackoverflow.com/questions/37120537/whitelist-filter-in-kafka-doesnt-pick-up-new-topics but that question
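
For illustration, a sketch of pattern subscription with the Java consumer (topic pattern, addresses and group name are assumptions). New topics are only discovered when the consumer refreshes cluster metadata, so lowering metadata.max.age.ms (default 5 minutes) shortens how long it takes for a newly created matching topic to trigger a rebalance.

import java.util.Collection;
import java.util.Properties;
import java.util.regex.Pattern;
import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");       // assumed
props.put("group.id", "pattern-group");                 // assumed
props.put("metadata.max.age.ms", "10000");              // refresh metadata every 10 s so new topics are noticed sooner
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Pattern.compile("events-.*"), new ConsumerRebalanceListener() {
    @Override
    public void onPartitionsRevoked(Collection<TopicPartition> partitions) { }
    @Override
    public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
        // Fires again after a matching topic is created and the group rebalances.
        System.out.println("Assigned: " + partitions);
    }
});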

How does (should) Kafka Consumer cope with Poison Messages

Submitted by 寵の児 on 2019-12-22 05:37:10
Question: When a Kafka consumer fails to deserialize a message, is it the client application's responsibility to deal with the poison message? Or does Kafka "increment" the message offset and continue consumption of valid messages? Is there a "best practice" for dealing with poison messages held on Kafka topics? Answer 1: When Kafka is unable to deserialize the record, the consumer will receive an org.apache.kafka.common.KafkaException; you should commit the offset yourself and keep consuming. Source: https:/
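
One commonly used defensive pattern (a sketch, not necessarily what the answer above intends) is to consume raw bytes with ByteArrayDeserializer and deserialize in application code, so a poison message can be logged or routed to a dead-letter topic and skipped instead of failing every poll. The topic name and the decode()/handle() helpers below are hypothetical application-specific pieces.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");   // assumed
props.put("group.id", "safe-consumer");             // assumed
props.put("key.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");

KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Collections.singletonList("orders"));    // hypothetical topic
while (true) {
    for (ConsumerRecord<byte[], byte[]> record : consumer.poll(1000)) {
        try {
            handle(decode(record.value()));                  // decode()/handle(): hypothetical app code
        } catch (Exception e) {
            // Poison message: record it and move on instead of crashing the consumer.
            System.err.printf("Skipping bad record %s-%d@%d: %s%n",
                    record.topic(), record.partition(), record.offset(), e);
        }
    }
}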

Python kafka consumer group id issue

Submitted by 喜欢而已 on 2019-12-22 01:18:07
Question: AFAIK, the concept of partitions and (consumer) groups in Kafka was introduced to implement parallelism. I am working with Kafka through Python. I have a certain topic, which has (say) 2 partitions. This means that if I start a consumer group with 2 consumers in it, they will be mapped (subscribed) to different partitions. But, using the kafka library in Python, I came across a weird issue. I started 2 consumers with essentially the same group-ids, and started the threads for them to consume messages.

multiprocessing in kafka-python

Submitted by 佐手、 on 2019-12-21 12:27:42
Question: I have been using the kafka-python module to consume from a Kafka broker. I want to consume from the same topic with 'x' number of partitions in parallel. The documentation has this: # Use multiple consumers in parallel w/ 0.9 kafka brokers # typically you would run each on a different server / process / CPU consumer1 = KafkaConsumer('my-topic', group_id='my-group', bootstrap_servers='my.server.com') consumer2 = KafkaConsumer('my-topic', group_id='my-group', bootstrap_servers='my.server.com')

Simple Kafka Consumer not receiving messages

Submitted by 假如想象 on 2019-12-21 05:35:11
Question: I am a newbie to Kafka and am running the simple Kafka consumer/producer example given in KafkaConsumer and KafkaProducer. When I run the consumer from the terminal, it receives messages, but I am not able to consume them using Java code. I have also searched StackOverflow for similar issues (Links: Link1, Link2) and tried those solutions, but nothing seems to work for me. Kafka version: kafka_2.10-0.10.2.1, and the corresponding Maven dependency is used in the pom. Java code for the producer and
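
As a baseline, a minimal consumer sketch that usually works against a 0.10.2.1 broker, assuming the broker listens on localhost:9092 and the topic is 'test'. The most common causes of "console works but Java does not" are pointing bootstrap.servers at ZooKeeper instead of the broker, and leaving auto.offset.reset at its default of latest while the test messages were produced before the consumer started.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");   // broker address, not the ZooKeeper address
props.put("group.id", "test-group");
props.put("auto.offset.reset", "earliest");         // also see messages produced before startup
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Collections.singletonList("test"));
while (true) {
    for (ConsumerRecord<String, String> record : consumer.poll(1000)) {
        System.out.println(record.offset() + ": " + record.value());
    }
}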

Kafka multiple consumers for a partition

Submitted by 北慕城南 on 2019-12-21 04:04:35
Question: I have a producer which writes messages to a topic/partition. To maintain ordering, I would like to go with a single partition, and I want 12 consumers to read all the messages from this single partition (no consumer group; all the messages should go to all consumers). Is this achievable? I read in some forums that only one consumer can read per partition. Answer 1: You may use SimpleConsumer to achieve exactly what you are asking: no consumer groups, and all consumers can read a single partition. However
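
With the newer Java consumer the same effect can be sketched without SimpleConsumer: each of the 12 consumers calls assign() on the single partition itself, bypassing group-based assignment, so every consumer independently reads every message. The topic name and instanceId below are placeholders.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");             // assumed
props.put("group.id", "reader-" + instanceId);                // distinct per consumer; instanceId is hypothetical
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
// assign() skips the group coordinator, so all 12 consumers get partition 0 of the topic.
consumer.assign(Collections.singletonList(new TopicPartition("ordered-topic", 0)));
while (true) {
    for (ConsumerRecord<String, String> record : consumer.poll(1000)) {
        process(record);                                       // hypothetical application handler
    }
}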

Why is Kafka consumer performance slow?

Submitted by 白昼怎懂夜的黑 on 2019-12-21 02:51:26
Question: I have one simple topic, and one simple Kafka consumer and producer, using the default configuration. The program is very simple: I have two threads. The producer keeps sending 16-byte messages, and the consumer keeps receiving them. I found that the producer throughput is roughly 10 MB/s, which is fine, but the consumer throughput is only 0.2 MB/s. I have disabled all the debug logging, but that does not make it any better. The test is running on a local machine. Anybody
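
Hard to say without the code, but with 16-byte messages the consumer's fetch batching is usually the first suspect: by default each fetch can return as soon as a single small message is available, which ties throughput to the request round-trip rate. A sketch of the consumer settings commonly tuned (values are illustrative, not recommendations):

import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");    // assumed
props.put("group.id", "perf-test");                  // assumed
props.put("key.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");

// Trade a little latency for much larger batches per fetch request.
props.put("fetch.min.bytes", "65536");               // wait for ~64 KB of data per fetch ...
props.put("fetch.max.wait.ms", "100");               // ... but never wait more than 100 ms
props.put("max.partition.fetch.bytes", "1048576");   // allow up to 1 MB per partition per fetch
props.put("receive.buffer.bytes", "262144");         // larger TCP receive buffer

KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props);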

Apache Kafka and Avro: org.apache.avro.generic.GenericData$Record cannot be cast to com.harmeetsingh13.java.Customer

Submitted by 主宰稳场 on 2019-12-21 02:51:17
Question: Whenever I try to read a message from the Kafka queue, I get the following exception: [error] (run-main-0) java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to com.harmeetsingh13.java.Customer java.lang.ClassCastException: org.apache.avro.generic.GenericData$Record cannot be cast to com.harmeetsingh13.java.Customer at com.harmeetsingh13.java.consumers.avrodesrializer.AvroSpecificDeserializer.infiniteConsumer(AvroSpecificDeserializer.java:79) at
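
The stack trace is consistent with Confluent's KafkaAvroDeserializer returning a GenericData.Record while the code casts it to the generated Customer class. Assuming that serializer stack (the snippet does not show the consumer config), the usual fix is either to consume GenericRecord or to set specific.avro.reader=true so the deserializer builds the specific class. A sketch under that assumption; broker, registry address, and topic name are placeholders.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import com.harmeetsingh13.java.Customer;                 // the Avro-generated specific class

Properties props = new Properties();
props.put("bootstrap.servers", "localhost:9092");                               // assumed
props.put("group.id", "avro-consumer");                                         // assumed
props.put("schema.registry.url", "http://localhost:8081");                      // assumed registry address
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
// Without this flag the value comes back as GenericData.Record, hence the ClassCastException.
props.put("specific.avro.reader", "true");

KafkaConsumer<String, Customer> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Collections.singletonList("customer-avro"));                 // hypothetical topic
while (true) {
    for (ConsumerRecord<String, Customer> record : consumer.poll(1000)) {
        Customer customer = record.value();
        System.out.println(customer);
    }
}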