spring-kafka

How to skip corrupt (non-serializable) messages in Spring Kafka Consumer?

Posted by 烈酒焚心 on 2020-01-21 13:55:53
Question: This question is for Spring Kafka and is related to Apache Kafka with High Level Consumer: Skip corrupted messages. Is there a way to configure a Spring Kafka consumer to skip a record that cannot be read/processed (is corrupt)? I am seeing a situation where the consumer gets stuck on the same record if it cannot be deserialized. This is the error the consumer throws: Caused by: com.fasterxml.jackson.databind.JsonMappingException: Can not construct instance of java.time.LocalDate: no long/Long …
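
A minimal sketch of the usual fix: wrap the real deserializer in spring-kafka's ErrorHandlingDeserializer so a poison record surfaces as a handled error instead of blocking the partition forever. This assumes a recent spring-kafka (2.5+; older releases ship the equivalent ErrorHandlingDeserializer2 with the same constants); the bootstrap address and group id are placeholders.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;
import org.springframework.kafka.support.serializer.JsonDeserializer;

@Configuration
public class SkipCorruptRecordsConfig {

    @Bean
    public ConsumerFactory<String, Object> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");              // placeholder
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // Wrapper that catches deserialization failures instead of letting the
        // consumer loop forever on the same corrupt offset.
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
        // Delegate that does the actual JSON parsing.
        props.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);
        props.put(JsonDeserializer.TRUSTED_PACKAGES, "*");
        return new DefaultKafkaConsumerFactory<>(props);
    }
}
```

Pair this with the container's error handler (SeekToCurrentErrorHandler in the 2.x line, DefaultErrorHandler later) if failed records should also be logged or routed to a dead-letter topic.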

Reading the same message several times from Kafka

Posted by 给你一囗甜甜゛ on 2020-01-21 11:55:06
Question: I use the Spring Kafka API to implement a Kafka consumer with manual offset management: @KafkaListener(topics = "some_topic") public void onMessage(@Payload Message message, Acknowledgment acknowledgment) { if (someCondition) { acknowledgment.acknowledge(); } } Here, I want the consumer to commit the offset only if someCondition holds. Otherwise the consumer should sleep for some time and read the same message again. Kafka configuration: @Bean public ConcurrentKafkaListenerContainerFactory<String, …
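
One hedged way to get "sleep, then re-read the same record" without blocking the consumer thread yourself is Acknowledgment.nack(), available from spring-kafka 2.3 (recent 3.x versions take a Duration instead of a long; older versions would need SeekToCurrentErrorHandler or ConsumerSeekAware instead). The sketch below assumes the container factory is set to AckMode.MANUAL as in the question; the topic name and condition are placeholders.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.stereotype.Component;

@Component
public class RetryingListener {

    @KafkaListener(topics = "some_topic")
    public void onMessage(String message, Acknowledgment acknowledgment) {
        if (conditionHolds(message)) {
            acknowledgment.acknowledge(); // commit the offset and move on
        } else {
            // Pause this partition for 5 seconds, then redeliver starting
            // from this (unacknowledged) record.
            acknowledgment.nack(5000);
        }
    }

    private boolean conditionHolds(String message) {
        return message != null && !message.isEmpty(); // placeholder condition
    }
}
```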

Deserializing different JSON payload from same Kafka topic with Spring Kafka

Posted by ≡放荡痞女 on 2020-01-21 05:45:07
Question: I'm trying to deserialize different JSON payloads from the same Kafka topic. The other questions asked here guided me to a first attempt, but I was not able to get it running. As Gary mentioned (here) there is a hint (JsonSerializer.ADD_TYPE_INFO_HEADERS), but when I send and receive both messages I get an exception: org.springframework.kafka.listener.ListenerExecutionFailedException: Listener method could not be invoked with the incoming message Endpoint handler details: Method [public …
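
A sketch of one common approach: a class-level @KafkaListener with one @KafkaHandler per payload type, which relies on the producer's type-info headers (the JsonSerializer default) so the converter can route each record to the matching method. The topic name and the nested payload classes are hypothetical stand-ins.

```java
import org.springframework.kafka.annotation.KafkaHandler;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
@KafkaListener(topics = "mixed-topic") // placeholder topic name
public class MixedPayloadListener {

    // Hypothetical payload types standing in for the different JSON shapes on the topic.
    public static class OrderEvent { public String orderId; }
    public static class CustomerEvent { public String customerId; }

    @KafkaHandler
    public void onOrder(OrderEvent event) {
        // handle records deserialized as OrderEvent
    }

    @KafkaHandler
    public void onCustomer(CustomerEvent event) {
        // handle records deserialized as CustomerEvent
    }

    @KafkaHandler(isDefault = true)
    public void onUnknown(Object event) {
        // fallback for anything that did not match a typed handler
    }
}
```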

How to use “Kafka Streams Binder” with “Functional Style” and DI?

Posted by 徘徊边缘 on 2020-01-16 18:20:14
Question: https://cloud.spring.io/spring-cloud-static/spring-cloud-stream-binder-kafka/3.0.0.M3/reference/html/spring-cloud-stream-binder-kafka.html#_programming_model shows an example where the input topic can be set using the property spring.cloud.stream.bindings.process_in.destination. Now I want to use dependency injection, e.g. @Bean public java.util.function.Consumer<KStream<Object, String>> process(JavaMailSender mailSender) {...} When starting the application (based on Spring Boot), the …
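
A hedged sketch of what the functional-style bean with dependency injection might look like: the Consumer is an ordinary @Bean method, so collaborators such as JavaMailSender can simply be declared as parameters.

```java
import java.util.function.Consumer;

import org.apache.kafka.streams.kstream.KStream;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.mail.javamail.JavaMailSender;

@Configuration
public class StreamProcessingConfig {

    // The function bean is an ordinary Spring bean, so dependencies such as
    // JavaMailSender are injected by declaring them as method parameters.
    @Bean
    public Consumer<KStream<Object, String>> process(JavaMailSender mailSender) {
        return stream -> stream.foreach((key, value) -> {
            // use the injected mailSender here (mail-building details omitted)
        });
    }
}
```

Note that with the functional model the binding name is derived from the bean name, so depending on the binder version the destination property looks like spring.cloud.stream.bindings.process-in-0.destination=my-topic (my-topic is a placeholder).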

Getting "Magic v1 does not support record headers" when producing a message

Posted by 岁酱吖の on 2020-01-16 18:06:32
Question: I am getting Magic v1 does not support record headers while producing a message. Below is my code. KafkaProducerConfig: @Configuration public class KafkaProducerConfig { @Value(value = "${kafka.bootstrap-servers}") private String bootstrapAddress; @Bean public ProducerFactory<String, Event> producerFactory() { Map<String, Object> config = new HashMap<>(); config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress); config.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false); config.put …
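
This error generally means the broker, or the topic's message format version, predates Kafka 0.11, the first format that supports record headers; the options are to upgrade the broker/topic format or to make sure nothing adds headers to outgoing records. A hedged sketch of a producer factory that keeps the JsonSerializer from adding its type-info header (the bootstrap address is a placeholder):

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

@Configuration
public class HeaderlessProducerConfig {

    @Bean
    public ProducerFactory<String, Object> producerFactory() {
        Map<String, Object> config = new HashMap<>();
        config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        // Pre-0.11 brokers/topic formats reject record headers, so keep the
        // serializer from adding its type-info header.
        config.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
        return new DefaultKafkaProducerFactory<>(config);
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate(ProducerFactory<String, Object> pf) {
        return new KafkaTemplate<>(pf);
    }
}
```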

Kafka Cannot Configure Topics on Application Startup, but Later Can Communicate

Posted by 纵饮孤独 on 2020-01-16 16:29:09
Question: We have a Spring Boot application using spring-kafka (2.2.5.RELEASE) that always gets this error when starting up: Could not configure topics org.springframework.kafka.KafkaException: Timed out waiting to get existing topics; nested exception is java.util.concurrent.TimeoutException. However, the application continues to start up: org.springframework.kafka.KafkaListenerEndpointContainer#0-0-C-1] INFO o.s.k.l.KafkaMessageListenerContainer - partitions revoked: [] INFO o.s.k.l …
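
A hedged sketch of making the KafkaAdmin behaviour explicit: point it at a broker address that is reachable at startup, widen the operation timeout, and optionally make an unreachable broker fatal instead of a logged warning, so the timeout does not pass silently. The address is a placeholder.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.admin.AdminClientConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.KafkaAdmin;

@Configuration
public class AdminConfig {

    @Bean
    public KafkaAdmin kafkaAdmin() {
        Map<String, Object> config = new HashMap<>();
        // The "Timed out waiting to get existing topics" error usually means this
        // address is wrong or the broker is not reachable yet at startup.
        config.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        KafkaAdmin admin = new KafkaAdmin(config);
        admin.setOperationTimeout(60);            // seconds to wait for admin operations
        admin.setFatalIfBrokerNotAvailable(true); // fail fast instead of logging and continuing
        return admin;
    }
}
```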

Is there a way to get the last message from Kafka topic?

Posted by 回眸只為那壹抹淺笑 on 2020-01-15 10:16:36
Question: I have a Kafka topic with multiple partitions and I wonder if there is a way in Java to fetch the last message for the topic. I don't care about the partitions, I just want to get the latest message. I have tried @KafkaListener, but it fetches a message only when the topic is updated. If nothing is published after the application starts, nothing is returned. Maybe the listener is not the right approach to the problem at all? Answer 1: The following snippet worked for me; you may try this.
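
A hedged sketch of one common approach using the plain Kafka consumer API: assign all partitions manually, seek to end - 1 on each, poll once, and keep the record with the newest timestamp. The bootstrap address is a placeholder, and a single poll may not drain every partition, so production code would keep polling until each assigned partition has been read.

```java
import java.time.Duration;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
import org.apache.kafka.common.serialization.StringDeserializer;

public class LastMessageReader {

    public static String readLastMessage(String topic) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Assign (not subscribe) so we control the offsets ourselves.
            List<TopicPartition> partitions = consumer.partitionsFor(topic).stream()
                    .map(p -> new TopicPartition(topic, p.partition()))
                    .collect(Collectors.toList());
            consumer.assign(partitions);

            // Jump to the end of every partition, then step back one record.
            Map<TopicPartition, Long> endOffsets = consumer.endOffsets(partitions);
            for (TopicPartition tp : partitions) {
                long end = endOffsets.get(tp);
                if (end > 0) {
                    consumer.seek(tp, end - 1);
                }
            }

            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            ConsumerRecord<String, String> latest = null;
            for (ConsumerRecord<String, String> record : records) {
                if (latest == null || record.timestamp() > latest.timestamp()) {
                    latest = record; // keep the newest record across partitions
                }
            }
            return latest == null ? null : latest.value();
        }
    }
}
```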

Spring Boot Kafka Consumer not consuming, Kafka Listener not triggering

Posted by 妖精的绣舞 on 2020-01-15 10:07:24
Question: I am trying to build a simple Spring Boot Kafka consumer to consume messages from a Kafka topic, however no messages get consumed because the @KafkaListener method is never triggered. I saw in other answers that AUTO_OFFSET_RESET_CONFIG should be set to "earliest" and that GROUP_ID_CONFIG should be unique, which I did, but the listener method still does not trigger. The application simply starts and does nothing: Application Started . ____ _ __ _ _ /\\ / ___'_ __ _ _(_)_ __ …
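
A hedged minimal consumer configuration to compare against: the usual culprits are a bootstrap address or topic name that does not match the broker, a missing kafkaListenerContainerFactory bean (that exact bean name is the default that @KafkaListener looks up), or the listener class not being a Spring bean. Topic, group id, and address below are placeholders.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@EnableKafka // Boot normally applies this via auto-configuration; harmless to declare explicitly
@Configuration
public class MinimalConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // must match the broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group");              // placeholder group
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");       // read existing records too
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        return factory;
    }

    @KafkaListener(topics = "demo-topic") // placeholder; must match the producer's topic exactly
    public void listen(String message) {
        System.out.println("Received: " + message);
    }
}
```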

How to use Micrometer Timer to record duration of async method (returns Mono or Flux)

Posted by 折月煮酒 on 2020-01-14 10:13:28
Question: I'd like to use Micrometer to record the execution time of an async method when it eventually completes. Is there a recommended way to do this? Example: Kafka replying template. I want to record the time it takes to actually execute the sendAndReceive call (which sends a message on a request topic and receives a response on a reply topic). public Mono<String> sendRequest(Mono<String> request) { return request .map(r -> new ProducerRecord<String, String>(requestsTopic, r)) .map(pr -> { pr.headers() …
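
One hedged way to time the whole asynchronous round trip is Micrometer's Timer.Sample: start it when the Mono is subscribed and stop it in doFinally, so the recorded duration covers the actual send-and-receive rather than just assembling the pipeline. The metric name, tag, and the doSendAndReceive helper below are hypothetical stand-ins for the replying-template call.

```java
import java.time.Duration;

import io.micrometer.core.instrument.MeterRegistry;
import io.micrometer.core.instrument.Timer;
import reactor.core.publisher.Mono;

public class TimedRequestSender {

    private final MeterRegistry registry;

    public TimedRequestSender(MeterRegistry registry) {
        this.registry = registry;
    }

    public Mono<String> sendRequest(Mono<String> request) {
        // Defer so the sample starts when the Mono is subscribed, not when it is assembled.
        return Mono.defer(() -> {
            Timer.Sample sample = Timer.start(registry);
            return doSendAndReceive(request)
                    // Stop the sample on completion, error, or cancellation.
                    .doFinally(signal -> sample.stop(
                            registry.timer("kafka.send.receive", "outcome", signal.toString())));
        });
    }

    private Mono<String> doSendAndReceive(Mono<String> request) {
        // Hypothetical stand-in for adapting ReplyingKafkaTemplate.sendAndReceive(...) to a Mono.
        return request.delayElement(Duration.ofMillis(10));
    }
}
```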