kafka-consumer-api

List Kafka Topics via Spring-Kafka

Submitted by 旧时模样 on 2019-12-08 18:19:41
Question: We would like to list all Kafka topics via spring-kafka, getting results similar to the kafka command: bin/kafka-topics.sh --list --zookeeper localhost:2181. When running the getTopics() method in the service below, we get org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata. Configuration:

@EnableKafka
@Configuration
public class KafkaConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
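
A minimal sketch of one way to list topics programmatically, using the Kafka AdminClient from kafka-clients rather than a consumer; the broker address here is an assumption, not taken from the question, and an unreachable bootstrap/advertised address is a frequent cause of the "Timeout expired while fetching topic metadata" error above:

import java.util.Properties;
import java.util.Set;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;

public class TopicLister {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address
        try (AdminClient admin = AdminClient.create(props)) {
            Set<String> topics = admin.listTopics().names().get(); // blocks until metadata arrives
            topics.forEach(System.out::println);
        }
    }
}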

spring boot kafka consumer - how to properly consume kafka messages from spring boot

Submitted by 梦想与她 on 2019-12-08 17:47:30
I'm developing a Spring Boot application which is supposed to consume Kafka messages. I'm getting a strange outcome: when I send messages using kafka-console-producer.sh, my consumer only detects and prints every other message. For example, in the Kafka console producer I would type "one" -> Enter -> "two" -> Enter -> "three" -> Enter. In my Spring Boot consumer I would only see "two", "four", etc. My ConsumeConfigFactory.java:

import java.util.Properties;
import javax.annotation.PostConstruct;
import kafka.consumer.ConsumerConfig;
import org.springframework.stereotype.Component;

@Component
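
The ConsumeConfigFactory above is built on the legacy kafka.consumer API. For comparison, a minimal spring-kafka setup is sketched below; the broker address, group id, topic, and class names are illustrative assumptions. With a single @KafkaListener in one consumer group, every record should be delivered once to that group rather than every other one.

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.stereotype.Component;

@EnableKafka
@Configuration
class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");              // assumed group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}

@Component
class DemoListener {

    @KafkaListener(topics = "demo-topic", groupId = "demo-group") // assumed topic
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}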

If my producer is producing, why can't the consumer consume? It's stuck at poll()

Submitted by 大憨熊 on 2019-12-08 12:30:43
Question: I'm publishing to a remote Kafka server and trying to consume messages from that remote server (Kafka v0.9.0.1). Publishing works fine, but consuming does not. Publisher:

package org.test;

import java.io.IOException;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class Producer {

    private void generateMessgaes() throws IOException {
        String topic = "MY_TOPIC";
        Properties props = new Properties();
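
A minimal plain-consumer sketch for the same topic is shown below; the broker address and group id are assumptions. When poll() blocks forever against a remote broker, the usual suspects are a broker advertised listener that is not reachable from the consumer's network, or a new consumer group starting at the log end (the default auto.offset.reset is latest) so that already-published messages are never returned.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class Consumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "remote-host:9092"); // assumed address; must match the broker's advertised listener
        props.put("group.id", "my-group");                  // assumed group id
        props.put("auto.offset.reset", "earliest");         // read existing messages, not only new ones
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("MY_TOPIC"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(1000); // poll(long) in the 0.9.x client
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("offset=%d value=%s%n", record.offset(), record.value());
                }
            }
        }
    }
}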

Kafka + Spring Batch Listener Flush Batch

Submitted by 别来无恙 on 2019-12-08 11:33:45
Question: Using Kafka broker 1.0.1 and spring-kafka 2.1.6.RELEASE, I'm using a batched consumer with the following settings:

// Other settings are not shown..
props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "100");

I use the Spring listener in the following way:

@KafkaListener(topics = "${topics}", groupId = "${consumer.group.id}")
public void receive(final List<String> data,
        @Header(KafkaHeaders.RECEIVED_PARTITION_ID) final List<Integer> partitions,
        @Header(KafkaHeaders.RECEIVED_TOPIC) Set<String> topics,
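
For a listener with a List<String> parameter like the one above, the container factory has to be put into batch mode; a minimal sketch, assuming the consumer factory carrying the settings above is injected from elsewhere:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;

@Configuration
class BatchListenerConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        // Deliver up to max.poll.records records to the listener as one List per poll.
        factory.setBatchListener(true);
        return factory;
    }
}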

How to change SpoutConfig from the default settings?

Submitted by 吃可爱长大的小学妹 on 2019-12-08 11:29:55
Question: I'm trying to get Facebook pages data using the Graph API. The size of each post is more than 1 MB, while Kafka's default fetch.message.max.bytes is 1 MB. I have changed the Kafka properties from 1 MB to 3 MB by adding the lines below to the Kafka consumer.properties and server.properties files:

fetch.message.max.bytes=3048576   (consumer.properties)
message.max.bytes=3048576         (server.properties)
replica.fetch.max.bytes=3048576   (server.properties)

Now, after adding the above lines in Kafka, 3 MB of message data is going into
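
Since the title asks about SpoutConfig, here is a hedged sketch of raising the fetch and buffer sizes on the storm-kafka spout side as well; the ZooKeeper address, topic, zkRoot, and spout id are placeholder assumptions. The spout has its own fetch size (1 MB by default), so changing only the broker and consumer.properties files does not affect it.

// storm-kafka (SpoutConfig-based API); on pre-1.0 Storm releases the package is storm.kafka.*
import org.apache.storm.kafka.BrokerHosts;
import org.apache.storm.kafka.KafkaSpout;
import org.apache.storm.kafka.SpoutConfig;
import org.apache.storm.kafka.ZkHosts;

public class PageSpoutFactory {
    public static KafkaSpout buildSpout() {
        BrokerHosts hosts = new ZkHosts("localhost:2181"); // assumed ZooKeeper address
        SpoutConfig spoutConfig =
                new SpoutConfig(hosts, "fb_pages", "/fb_pages", "fb-page-spout"); // assumed topic, zkRoot, id
        spoutConfig.fetchSizeBytes = 3048576;  // mirrors the 3 MB broker/consumer settings above
        spoutConfig.bufferSizeBytes = 3048576; // socket receive buffer, raised to match
        return new KafkaSpout(spoutConfig);
    }
}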

Why does commitAsync fail to commit the first 2 offsets?

Submitted by 烈酒焚心 on 2019-12-08 11:29:30
Question: I faced a weird problem in which the consumer cannot commitAsync the first 2 offsets of the log, and I don't know the reason. It is very strange because the other messages from the same asynchronous send by the producer are received and committed successfully by the consumer. Can someone find the source of this problem? I quote my code below along with an output example.

package com.panos.example;

import kafka.utils.ShutdownableThread;
import org.apache.kafka.clients.consumer.*;
import org.apache.kafka
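
One way to see whether those first commits actually fail, or are simply superseded by a later commit, is to pass an OffsetCommitCallback and log every result. A small sketch of what the commit call inside the existing poll loop could look like (the surrounding consumer and loop are assumed to exist as in the question's code; the callback types come from org.apache.kafka.clients.consumer and org.apache.kafka.common):

consumer.commitAsync(new OffsetCommitCallback() {
    @Override
    public void onComplete(Map<TopicPartition, OffsetAndMetadata> offsets, Exception exception) {
        if (exception != null) {
            // The commit request itself failed, e.g. during a rebalance or before the
            // group coordinator is ready; commitAsync does not retry on its own.
            System.err.println("Async commit failed for " + offsets + ": " + exception);
        } else {
            System.out.println("Committed " + offsets);
        }
    }
});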

What is the topic count parameter for Kafka ConsumerConnector?

Submitted by 早过忘川 on 2019-12-08 10:38:17
Question: I am new to Apache Kafka and am trying the examples given. The following code snippet is used to initialize a ConsumerConnector. I am confused by the topic count parameter; it seems to cause Kafka to hand out a corresponding number of streams for that topic. However, I tried several times and only the first stream produces messages. So, two questions: 1. How can I determine the count number for a topic? 2. How are the messages split across the streams? Thanks in advance.

Map<String,
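
A hedged sketch of the old high-level consumer API this question refers to; the ZooKeeper address, group id, and topic name are placeholders. The topic count asks for that many KafkaStreams, and a stream only receives data if the topic has at least that many partitions, which is why extra streams can stay silent.

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

public class StreamsExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("zookeeper.connect", "localhost:2181"); // assumed ZooKeeper address
        props.put("group.id", "streams-demo");            // assumed group id
        ConsumerConnector connector =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

        // Ask for 2 streams; only useful if the topic has at least 2 partitions.
        Map<String, Integer> topicCountMap = new HashMap<>();
        topicCountMap.put("test", 2); // assumed topic name

        Map<String, List<KafkaStream<byte[], byte[]>>> streams =
                connector.createMessageStreams(topicCountMap);
        for (final KafkaStream<byte[], byte[]> stream : streams.get("test")) {
            new Thread(() -> {
                ConsumerIterator<byte[], byte[]> it = stream.iterator();
                while (it.hasNext()) {
                    System.out.println(new String(it.next().message()));
                }
            }).start();
        }
    }
}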

How to fetch recent messages from Kafka topic

Submitted by 回眸只為那壹抹淺笑 on 2019-12-08 10:23:39
Question: Do we have any option for fetching the most recent 10/20/etc. messages from a Kafka topic? I can see the --from-beginning option to fetch all messages from the topic, but if I want to fetch only a few messages (first, last, middle, or the latest 10), do we have some options?

Answer 1: First N messages: you can use --max-messages N in order to fetch the first N messages of a topic. For example, to get the first 10 messages, run bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning
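
For the "latest N" case, a hedged Java sketch that assigns all partitions, seeks each one to its end offset minus N, and reads from there; the broker address, topic, and N are assumptions, and a single poll may not drain everything:

import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.PartitionInfo;
import org.apache.kafka.common.TopicPartition;

public class LastNReader {
    public static void main(String[] args) {
        final String topic = "test"; // assumed topic
        final int n = 10;            // last N messages per partition

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker
        props.put("group.id", "last-n-reader");
        props.put("enable.auto.commit", "false");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            List<TopicPartition> partitions = new ArrayList<>();
            for (PartitionInfo info : consumer.partitionsFor(topic)) {
                partitions.add(new TopicPartition(topic, info.partition()));
            }
            consumer.assign(partitions);

            // Seek each partition to (end offset - n), clamped at 0.
            Map<TopicPartition, Long> endOffsets = consumer.endOffsets(partitions);
            for (TopicPartition tp : partitions) {
                consumer.seek(tp, Math.max(0, endOffsets.get(tp) - n));
            }

            ConsumerRecords<String, String> records = consumer.poll(2000);
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("%s-%d offset=%d value=%s%n",
                        record.topic(), record.partition(), record.offset(), record.value());
            }
        }
    }
}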

Cannot consume from a topic

Submitted by 时光毁灭记忆、已成空白 on 2019-12-08 08:14:57
Question: When I try to consume messages from a Kafka server hosted in EC2 with the Kafka console tool (v0.9.0.1; I think this uses the old consumer APIs), I get the following exception. How can I overcome this?

kafka-console-consumer.sh --zookeeper zookeeper.xx.com:2181 --topic my-replicated-topic

[2016-04-06 15:52:40,247] WARN [console-consumer-12572_Rathas-MacBook-Pro.local-1459921957380-6ebc238f-leader-finder-thread], Failed to add leader for partitions [my-replicated-topic,1],[my-replicated-topic
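
A hedged guess at the usual fix for this symptom on EC2: ZooKeeper hands the consumer the brokers' advertised addresses, and if those are private EC2 hostnames the leader-finder thread cannot connect to them from outside. The broker then needs to advertise an address that is resolvable from the consumer's network, roughly like the server.properties sketch below (the hostname is a placeholder, not taken from the question):

# server.properties on each EC2 broker
advertised.listeners=PLAINTEXT://ec2-xx-xx-xx-xx.compute.amazonaws.com:9092
# On 0.9.x brokers the older equivalents are:
# advertised.host.name=ec2-xx-xx-xx-xx.compute.amazonaws.com
# advertised.port=9092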

How to pass topics dynamically to a kafka listener?

Submitted by 对着背影说爱祢 on 2019-12-08 06:43:05
Question: For a couple of days I have been trying out ways to dynamically pass topics to a Kafka listener rather than using them through keys from a Java DSL. Has anyone done this before, or could anyone throw some light on the best way to achieve this?

Answer 1: You cannot "dynamically pass topics to a Kafka listener"; you have to programmatically create a listener container instead.

Answer 2: Here is a working solution:

// Start brokers without using the "@KafkaListener" annotation
Map<String, Object> consumerProps =
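
A hedged sketch of the programmatic route from Answer 1, creating a listener container for topic names that are only known at runtime; the broker address, group id, and class names are assumptions, and in later spring-kafka versions ContainerProperties lives in org.springframework.kafka.listener rather than the .config subpackage used here:

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;
import org.springframework.kafka.listener.MessageListener;
import org.springframework.kafka.listener.config.ContainerProperties;

public class DynamicListenerExample {

    public static KafkaMessageListenerContainer<String, String> startContainer(String... topics) {
        Map<String, Object> consumerProps = new HashMap<>();
        consumerProps.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        consumerProps.put(ConsumerConfig.GROUP_ID_CONFIG, "dynamic-group");           // assumed group id
        consumerProps.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        consumerProps.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

        // Topics are plain runtime values here, not compile-time annotation attributes.
        ContainerProperties containerProps = new ContainerProperties(topics);
        containerProps.setMessageListener(
                (MessageListener<String, String>) record -> System.out.println("Received: " + record.value()));

        KafkaMessageListenerContainer<String, String> container =
                new KafkaMessageListenerContainer<>(
                        new DefaultKafkaConsumerFactory<>(consumerProps), containerProps);
        container.start();
        return container;
    }
}

Alternatively, if the topics are fixed at application startup, @KafkaListener can read them from configuration with a SpEL expression such as topics = "#{'${kafka.topics}'.split(',')}", though that set cannot change after the context has started.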