spring-kafka

Deserialize kafka messages in KafkaConsumer using springboot

二次信任 submitted on 2020-04-16 04:12:01
Question: I have a Spring Boot app that listens to Kafka messages and converts them to objects:

    @KafkaListener(topics = "test", groupId = "group_id")
    public void consume(String message) throws IOException {
        ObjectMapper objectMapper = new ObjectMapper();
        Hostel hostel = objectMapper.readValue(message, Hostel.class);
    }

I wonder if it is possible to do it directly:

    @KafkaListener(topics = "test", groupId = "group_id")
    public void consume(Hostel hostel) throws IOException {
    }

Answer 1: You can do it using spring-kafka. …
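The answer is cut off above, but the technique it names is spring-kafka's JSON support. A minimal sketch of one way to wire it, assuming a Hostel POJO and Spring Boot auto-configuration (bean names and the bootstrap address are illustrative, not from the original post):

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.common.serialization.StringDeserializer;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
    import org.springframework.kafka.core.ConsumerFactory;
    import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
    import org.springframework.kafka.support.serializer.JsonDeserializer;

    @Configuration
    public class KafkaConsumerConfig {

        @Bean
        public ConsumerFactory<String, Hostel> hostelConsumerFactory() {
            Map<String, Object> props = new HashMap<>();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // illustrative
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "group_id");
            // JsonDeserializer builds Hostel instances, so the listener can take the POJO
            return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(),
                    new JsonDeserializer<>(Hostel.class));
        }

        @Bean
        public ConcurrentKafkaListenerContainerFactory<String, Hostel> kafkaListenerContainerFactory() {
            ConcurrentKafkaListenerContainerFactory<String, Hostel> factory =
                    new ConcurrentKafkaListenerContainerFactory<>();
            factory.setConsumerFactory(hostelConsumerFactory());
            return factory;
        }
    }

With this in place, public void consume(Hostel hostel) works as written in the question.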

Spring Kafka KafkaTemplate.flush() required?

泄露秘密 submitted on 2020-04-16 03:33:28
Question: I am using Spring Kafka for the first time, and I have created a producer and a consumer with it. My Kafka server is running on localhost and I have created a topic called test. I was not able to deliver messages to the consumer by simply calling KafkaTemplate.send(topicName, data); I had to call flush() on the KafkaTemplate after calling send() on the same object, and only then did the consumer receive the data. Okay, it works and it is fantastic. But could anyone explain to me what is …
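For context on why flush() appeared to be required: send() is asynchronous and only hands the record to the producer's internal buffer, which is drained in batches (governed by batch.size and linger.ms), so a short-lived sender can exit before the batch goes out. A sketch of the usual alternative, assuming the spring-kafka 2.x API where send() returns a ListenableFuture (class name and timeout are illustrative):

    import java.util.concurrent.TimeUnit;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.support.SendResult;
    import org.springframework.util.concurrent.ListenableFuture;

    public class ProducerExample {

        private final KafkaTemplate<String, String> kafkaTemplate;

        public ProducerExample(KafkaTemplate<String, String> kafkaTemplate) {
            this.kafkaTemplate = kafkaTemplate;
        }

        public void sendAndConfirm(String topicName, String data) throws Exception {
            // send() returns immediately; the record sits in the producer buffer
            ListenableFuture<SendResult<String, String>> future =
                    kafkaTemplate.send(topicName, data);
            // block until the broker acknowledges this one record, instead of
            // flushing the whole buffer with kafkaTemplate.flush()
            SendResult<String, String> result = future.get(10, TimeUnit.SECONDS);
            System.out.println("sent to " + result.getRecordMetadata());
        }
    }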

when to use RecoveryCallback vs KafkaListenerErrorHandler

好久不见. submitted on 2020-04-16 02:16:05
Question: I'm trying to understand when I should use org.springframework.retry.RecoveryCallback versus org.springframework.kafka.listener.KafkaListenerErrorHandler. As of today, I'm using a class (implementing org.springframework.retry.RecoveryCallback) to log the error message and send the message to a DLT, and it's working. For sending a message to the DLT, I'm using Spring's KafkaTemplate, and then I came across KafkaListenerErrorHandler and DeadLetterPublishingRecoverer. Now, can you please suggest how I should …
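The question breaks off here; for reference, DeadLetterPublishingRecoverer is normally wired at the container level rather than through a retry RecoveryCallback. A sketch, assuming spring-kafka 2.3+ (for the FixedBackOff constructor) and an auto-configured KafkaTemplate:

    import org.springframework.context.annotation.Bean;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
    import org.springframework.kafka.listener.SeekToCurrentErrorHandler;
    import org.springframework.util.backoff.FixedBackOff;

    public class ErrorHandlingConfig {

        @Bean
        public SeekToCurrentErrorHandler errorHandler(KafkaTemplate<Object, Object> template) {
            // publishes the failed record to <originalTopic>.DLT by default
            DeadLetterPublishingRecoverer recoverer = new DeadLetterPublishingRecoverer(template);
            // 1s between attempts, 2 retries after the first failure, then recover
            return new SeekToCurrentErrorHandler(recoverer, new FixedBackOff(1000L, 2));
        }
    }

With this approach, neither a retry template nor a listener-level error handler is needed; the container retries and then publishes to the DLT on its own.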

Unable to connect to Kafka run in container from Spring Boot app run outside container

拟墨画扇 submitted on 2020-04-11 05:25:24
Question: I'm running Kafka locally via docker-compose.yml:

    zookeeper:
      image: 'bitnami/zookeeper:latest'
      ports:
        - 2181:2181
      environment:
        - ALLOW_ANONYMOUS_LOGIN=yes
    kafka:
      image: 'bitnami/kafka:latest'
      ports:
        - 9092:9092
      environment:
        - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
        - ALLOW_PLAINTEXT_LISTENER=yes
        - KAFKA_ADVERTISED_PORT=9092
        - KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://localhost:9092

My Spring Boot application is run with application.yml:

    spring:
      application:
        name: testkafka
      kafka:
        bootstrap…
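The excerpt is cut off, but this compose file is the classic advertised-listener pitfall: an app outside the Docker network needs the broker to advertise localhost, while anything inside the network needs the kafka hostname. A sketch of the usual two-listener fix for the bitnami image (the listener names and port 29092 are illustrative, not from the original question):

    kafka:
      image: 'bitnami/kafka:latest'
      ports:
        - 9092:9092
      environment:
        - KAFKA_ZOOKEEPER_CONNECT=zookeeper:2181
        - ALLOW_PLAINTEXT_LISTENER=yes
        # CLIENT: inside the compose network; EXTERNAL: from the host machine
        - KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP=CLIENT:PLAINTEXT,EXTERNAL:PLAINTEXT
        - KAFKA_CFG_LISTENERS=CLIENT://:29092,EXTERNAL://:9092
        - KAFKA_CFG_ADVERTISED_LISTENERS=CLIENT://kafka:29092,EXTERNAL://localhost:9092
        - KAFKA_CFG_INTER_BROKER_LISTENER_NAME=CLIENT

The Spring Boot app outside the container then keeps spring.kafka.bootstrap-servers pointed at localhost:9092.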

Filter messages before deserialization based on headers

喜欢而已 submitted on 2020-03-24 00:00:35
Question: Sometimes messages can be filtered out before deserialization, based on header values. Are there any existing patterns for this scenario using Spring Kafka? I am thinking of implementing something similar to ErrorHandlingDeserializer, with a delegate that also takes a filter predicate as a property. Any suggestions? Thanks.

Answer 1: Yes, you can use the same technique used by the ErrorHandlingDeserializer2 (which replaces the ErrorHandlingDeserializer) to return a "marker" object instead of doing the …
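The answer is truncated, but the marker-object technique it describes can be sketched like this (the class, marker, and predicate names are mine, not a Spring API; the header-aware deserialize overload requires kafka-clients 2.1+):

    import java.util.function.BiPredicate;
    import org.apache.kafka.common.header.Headers;
    import org.apache.kafka.common.serialization.Deserializer;

    // Delegating deserializer that consults headers first and returns a marker
    // object instead of deserializing records that should be filtered out.
    public class HeaderFilteringDeserializer<T> implements Deserializer<Object> {

        public static final Object FILTERED_MARKER = new Object();

        private final Deserializer<T> delegate;
        private final BiPredicate<String, Headers> skip; // true => do not deserialize

        public HeaderFilteringDeserializer(Deserializer<T> delegate,
                                           BiPredicate<String, Headers> skip) {
            this.delegate = delegate;
            this.skip = skip;
        }

        @Override
        public Object deserialize(String topic, byte[] data) {
            return delegate.deserialize(topic, data); // no headers available on this path
        }

        @Override
        public Object deserialize(String topic, Headers headers, byte[] data) {
            if (skip.test(topic, headers)) {
                return FILTERED_MARKER; // a RecordFilterStrategy can drop these later
            }
            return delegate.deserialize(topic, headers, data);
        }
    }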

How to retry with spring kafka version 2.2

拥有回忆 submitted on 2020-03-05 00:26:59
Question: I am just trying to find a simple example with spring-kafka 2.2 that works with a KafkaListener and retries the last failed message. If a message fails, it should be redirected to another topic, where the retry attempts will be made. We will have 4 topics: topic, retryTopic, successTopic and errorTopic. If topic fails, the message should be redirected to retryTopic, where the 3 retry attempts will be made. If those attempts fail, it must be redirected to errorTopic. In the case of success on both topic and …
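The question is cut off, but the topology it describes can be approximated with the recoverer and destination-resolver hooks available in spring-kafka 2.2; a sketch (bean names are illustrative, and this is one possible reading of the requirements, not a canonical recipe):

    import org.apache.kafka.common.TopicPartition;
    import org.springframework.context.annotation.Bean;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
    import org.springframework.kafka.listener.SeekToCurrentErrorHandler;

    public class RetryTopologyConfig {

        @Bean
        public SeekToCurrentErrorHandler mainErrorHandler(KafkaTemplate<Object, Object> template) {
            // a failure on "topic" is forwarded straight to "retryTopic"
            return new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(template,
                    (record, ex) -> new TopicPartition("retryTopic", record.partition())), 1);
        }

        @Bean
        public SeekToCurrentErrorHandler retryErrorHandler(KafkaTemplate<Object, Object> template) {
            // after 3 failed deliveries on "retryTopic", forward to "errorTopic"
            return new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(template,
                    (record, ex) -> new TopicPartition("errorTopic", record.partition())), 3);
        }
    }

Each handler is then set on the container factory used by the listener for its topic, and the listeners publish to successTopic when processing succeeds.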

spring kafka template producer performance

一笑奈何 submitted on 2020-03-03 05:49:42
Question: I am using the Spring Kafka template for producing messages, and the rate at which it produces them is too slow: it takes around 8 minutes to produce 15,000 messages. This is how I created the Kafka template:

    @Bean
    public ProducerFactory<String, GenericRecord> highSpeedAvroProducerFactory(
            @Qualifier("highSpeedProducerProperties") KafkaProperties properties) {
        final Map<String, Object> kafkaPropertiesMap = properties.getKafkaPropertiesMap();
        System.out.println(kafkaPropertiesMap);
        …
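The factory definition is cut off above. Throughput that low usually points at blocking on every send, or at tiny batches; a sketch of the common knobs, with illustrative values layered over the question's property map (not the original poster's configuration):

    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.springframework.kafka.core.KafkaTemplate;

    public class HighSpeedProducerTuning {

        // Common throughput settings layered over the existing map; values illustrative.
        static Map<String, Object> tuned(Map<String, Object> kafkaPropertiesMap) {
            Map<String, Object> props = new HashMap<>(kafkaPropertiesMap);
            props.put(ProducerConfig.BATCH_SIZE_CONFIG, 65536);       // larger batches
            props.put(ProducerConfig.LINGER_MS_CONFIG, 20);           // wait briefly to fill them
            props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4"); // cheaper network/IO
            return props;
        }

        // Send asynchronously; blocking on each send's future serializes the producer.
        static void sendAll(KafkaTemplate<String, GenericRecord> template,
                            String topic, List<GenericRecord> records) {
            for (GenericRecord r : records) {
                template.send(topic, r); // fire-and-forget; inspect futures in bulk if needed
            }
            template.flush(); // push any lingering batch at the end
        }
    }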

Kafka transaction failed but commits offset anyway

烂漫一生 submitted on 2020-02-28 05:53:07
Question: I am trying to wrap my head around Kafka transactions and exactly-once. I have created a transactional consumer, and I want to ensure that I read and process all messages for a topic. Kafka still commits the offset if the transaction fails, and the message is therefore lost. More formally, if a stream processing application consumes message A and produces message B such that B = F(A), then exactly-once processing means that A is considered consumed if and only if B is successfully produced, and …
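A sketch of how the quoted read-process-write guarantee is usually wired in Spring Kafka, so that the offset commit rides inside the producer transaction (bean and topic names are illustrative, and this assumes downstream consumers read with isolation.level=read_committed):

    import org.springframework.context.annotation.Bean;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
    import org.springframework.kafka.core.ConsumerFactory;
    import org.springframework.kafka.core.KafkaTemplate;
    import org.springframework.kafka.transaction.KafkaTransactionManager;

    public class ExactlyOnceSketch {

        private KafkaTemplate<String, String> kafkaTemplate; // backed by a transactional producer factory

        @Bean
        public ConcurrentKafkaListenerContainerFactory<String, String> factory(
                ConsumerFactory<String, String> cf, KafkaTransactionManager<String, String> tm) {
            ConcurrentKafkaListenerContainerFactory<String, String> f =
                    new ConcurrentKafkaListenerContainerFactory<>();
            f.setConsumerFactory(cf);
            // offsets are sent to the transaction instead of being committed separately
            f.getContainerProperties().setTransactionManager(tm);
            return f;
        }

        @KafkaListener(topics = "topic-a")
        public void listen(String a) {
            String b = a.toUpperCase();        // B = F(A)
            kafkaTemplate.send("topic-b", b);  // same transaction as the offset commit
            // if this method throws, the transaction aborts: B is not visible to
            // read_committed consumers and the offset for A is not committed
        }
    }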

How to pass a dynamic topic name to @KafkaListener(topics = …) from an environment variable

烂漫一生 submitted on 2020-02-25 09:38:32
Question: I am writing a Kafka consumer, and I want to pass an environment-variable topic name to @KafkaListener(topics = topic):

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Service;

    @Service
    public class KafkaConsumer {

        @Autowired
        private EnvProperties envProperties;

        private final String topic = envProperties.getTopic();

        @KafkaListener(topics = "#{'${envProperties.getTopic()}'}", groupId =
        …
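The snippet ends mid-annotation. Two problems are already visible: the field initializer runs before @Autowired injection, and ${envProperties.getTopic()} is not a property key, so the placeholder cannot resolve. A sketch of the usual fix, assuming an illustrative kafka.topic property that can be fed by an environment variable such as KAFKA_TOPIC:

    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.stereotype.Service;

    @Service
    public class KafkaConsumer {

        // "kafka.topic" and "kafka.group-id" are illustrative property names,
        // resolvable from environment variables via Spring's relaxed binding
        @KafkaListener(topics = "${kafka.topic}", groupId = "${kafka.group-id}")
        public void consume(String message) {
            System.out.println("received: " + message);
        }
    }

A SpEL form that reads the bean directly, such as topics = "#{envProperties.topic}", should also work, since @KafkaListener evaluates both property placeholders and SpEL expressions.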