kafka-consumer-api

Kafka how to read from __consumer_offsets topic

非 Y 不嫁゛ submitted on 2019-11-27 03:58:56
I'm trying to find out which offsets my current high-level consumers are working off. I use Kafka 0.8.2.1, with no "offset.storage" set in Kafka's server.properties, which, I believe, means that offsets are stored in Kafka. (I also verified that no offsets are stored in ZooKeeper by checking this path in the ZK shell: /consumers/consumer_group_name/offsets/topic_name/partition_number.) I tried listening to the __consumer_offsets topic to see which consumer saves what offset values, but it did not work. I tried the following: created a config file for the console consumer as follows: =>
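A sketch of what that console-consumer setup typically looks like on 0.8.2.x (the paths and localhost addresses are assumptions; on 0.9+ the formatter class moved to kafka.coordinator.GroupMetadataManager\$OffsetsMessageFormatter):

```shell
# consumer.properties: the internal topic is filtered out unless this is set
echo "exclude.internal.topics=false" > /tmp/consumer.properties

# Read the offsets topic through the formatter that decodes its binary values
# (formatter class name assumed for 0.8.2.x)
bin/kafka-console-consumer.sh \
  --zookeeper localhost:2181 \
  --topic __consumer_offsets \
  --consumer.config /tmp/consumer.properties \
  --formatter "kafka.server.OffsetManager\$OffsetsMessageFormatter"
```

Each decoded line shows roughly [group,topic,partition]::[OffsetMetadata[offset,metadata],timestamp], which is enough to see which group committed which offset.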

How to write a file to Kafka Producer

微笑、不失礼 submitted on 2019-11-27 00:22:12
Question: I am trying to load a simple text file instead of standard input into Kafka. After downloading Kafka, I performed the following steps: started ZooKeeper: bin/zookeeper-server-start.sh config/zookeeper.properties Started the server: bin/kafka-server-start.sh config/server.properties Created a topic named "test": bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test Ran the producer: bin/kafka-console-producer.sh --broker-list localhost:9092 --topic
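Once the console producer works from stdin, the usual way to feed it a file is plain shell redirection (the file name below is a placeholder):

```shell
# Pipe a text file into the console producer; each line becomes one message
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test < my_file.txt
```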

KafkaConsumer 0.10 Java API error message: No current assignment for partition

蓝咒 submitted on 2019-11-26 22:57:30
Question: I am using the KafkaConsumer 0.10 Java API. I want to consume from a specific partition at a specific offset. I found that there is a seek method, but it throws an exception. Has anyone had a similar use case or a solution? Code: KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(consumerProps); consumer.seek(new TopicPartition("mytopic", 1), 4); Exception java.lang.IllegalStateException: No current assignment for partition mytopic-1 at org.apache.kafka.clients.consumer
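The exception means seek() was called before the consumer owned the partition. With manual partition selection you must assign() it first; a sketch extending the question's own code (topic, partition, and offset are taken from the question):

```java
import java.util.Collections;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(consumerProps);
TopicPartition tp = new TopicPartition("mytopic", 1);
consumer.assign(Collections.singletonList(tp)); // take ownership of the partition first
consumer.seek(tp, 4);                           // now the seek is legal
```

Note that assign() bypasses the consumer group protocol entirely; with subscribe() instead, partitions are only assigned after a poll(), so seeking would have to wait until then.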

Kafka consumer not returning any events

北慕城南 submitted on 2019-11-26 22:11:55
Question: The Scala Kafka consumer below is not returning any events from the poll call. However, the topic is correct, and I can see events being sent to it using the console consumer: /opt/kafka_2.11-0.10.1.0/bin/kafka-console-consumer.sh --bootstrap-server kafka:9092 --topic my_topic --from-beginning I can also see the topic in my Scala code sample below when I step through it with a debugger and invoke kafkaConsumer.listTopics(). Also, this is called from a single unit test, so I'm only creating
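Two things commonly cause an empty poll() in this situation: a group id that has already committed offsets past the existing data, and the default auto.offset.reset of "latest", which makes a brand-new group start at the end of the log. A sketch of the relevant settings (broker address and topic name are from the question; the group id is invented):

```java
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

Properties props = new Properties();
props.put("bootstrap.servers", "kafka:9092");
props.put("group.id", "unit-test-group");     // invented group id
props.put("auto.offset.reset", "earliest");   // the default "latest" skips existing messages
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Collections.singletonList("my_topic"));

// The first poll often returns nothing while the group rebalances; poll in a loop.
for (int attempt = 0; attempt < 10; attempt++) {
    ConsumerRecords<String, String> records = consumer.poll(1000);
    if (!records.isEmpty()) {
        records.forEach(r -> System.out.println(r.value()));
        break;
    }
}
```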

How to read data using Kafka Consumer API from beginning?

荒凉一梦 submitted on 2019-11-26 19:57:08
Question: Can anyone please tell me how to read messages using the Kafka Consumer API from the beginning every time I run the consumer jar? Answer 1: This works with the 0.9.x consumer. Basically, when you create a consumer, you need to assign a consumer group id to it using the property ConsumerConfig.GROUP_ID_CONFIG. Generate the consumer group id randomly every time you start the consumer, doing something like this: properties.put(ConsumerConfig.GROUP_ID_CONFIG, UUID.randomUUID().toString()
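The answer's approach can be sketched with plain java.util classes; the string keys below are the values behind ConsumerConfig.GROUP_ID_CONFIG and AUTO_OFFSET_RESET_CONFIG. Setting "earliest" matters too, since a fresh group would otherwise start at the log end:

```java
import java.util.Properties;
import java.util.UUID;

public class FreshGroupConfig {
    // Build consumer properties that force a read from the beginning on every run
    public static Properties freshGroupProps() {
        Properties props = new Properties();
        props.put("group.id", UUID.randomUUID().toString()); // new group: no committed offsets
        props.put("auto.offset.reset", "earliest");          // so the new group starts at offset 0
        return props;
    }

    public static void main(String[] args) {
        Properties p1 = freshGroupProps();
        Properties p2 = freshGroupProps();
        // Each run gets a different group id, so committed offsets never apply
        System.out.println(p1.getProperty("group.id").equals(p2.getProperty("group.id"))); // prints false
    }
}
```

An alternative, if you want to keep a stable group id, is to call consumer.seekToBeginning(...) after partitions are assigned.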

Understanding Kafka Topics and Partitions

孤街浪徒 submitted on 2019-11-26 18:43:01
Question: I am starting to learn Kafka for enterprise solution purposes. During my reading, some questions came to mind: When a producer produces a message, it specifies the topic it wants to send the message to, is that right? Does it care about partitions? When a subscriber is running, does it specify its group id so that it can be part of a cluster of consumers of the same topic, or of the several topics that this group of consumers is interested in? Does each consumer group have a
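On the first question: the producer addresses a topic, and the partition is then chosen either explicitly, by hashing the message key, or round-robin when there is no key. A simplified sketch of the key-hashing step (Kafka's real DefaultPartitioner uses murmur2 on the serialized key; String.hashCode is a stand-in here):

```java
public class PartitionSketch {
    // Map a message key to one of numPartitions partitions, deterministically
    static int partitionFor(String key, int numPartitions) {
        // mask the sign bit so the result is non-negative, then take the remainder
        return (key.hashCode() & 0x7fffffff) % numPartitions;
    }

    public static void main(String[] args) {
        // The same key always lands on the same partition
        System.out.println(partitionFor("order-42", 3) == partitionFor("order-42", 3)); // prints true
    }
}
```

Consumers sharing a group id then divide those partitions among themselves, which is why the partition count bounds a group's parallelism.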

Delete message after consuming it in KAFKA

﹥>﹥吖頭↗ submitted on 2019-11-26 11:02:42
Question: I am using Apache Kafka to produce and consume a file 5 GB in size. I want to know whether there is a way for a message to be removed from the topic automatically after it is consumed. Do I have any way to keep track of consumed messages? I don't want to delete them manually. Answer 1: In Kafka, keeping track of what has been consumed is the responsibility of the consumer, and this is also one of the main reasons why Kafka has such great horizontal scalability. Using the high-level consumer API will
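Kafka never deletes a message because a consumer read it; the broker expires log segments by time or size. A sketch of shortening retention for one topic (the topic name and the one-hour value are placeholders; kafka-topics --alter --config was the era-appropriate command, later superseded by kafka-configs):

```shell
# Keep messages on my_topic for one hour instead of the broker-wide default
bin/kafka-topics.sh --zookeeper localhost:2181 --alter \
  --topic my_topic --config retention.ms=3600000
```

Consumed offsets, meanwhile, are tracked per consumer group via committed offsets, so "what has been read" and "what is stored" stay independent.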