reactor-kafka

Using onErrorResume to handle problematic payloads posted to Kafka using Reactor Kafka

Submitted by 我只是一个虾纸丫 on 2021-02-07 19:23:40
Question: I am using Reactor Kafka to send and receive Kafka messages and process them. While receiving the Kafka payload I do some deserialization, and if there is an exception I want to just log that payload (by saving it to Mongo) and then continue receiving other payloads. For this I am using the approach below:

    @EventListener(ApplicationStartedEvent.class)
    public void kafkaReceiving() {
        for (Flux<ReceiverRecord<String, Object>> flux : kafkaService.getFluxReceives()) {
            flux.delayUntil(//some
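A pitfall here is that onErrorResume applied to the outer Flux replaces the whole sequence once an error reaches it, which cancels the Kafka receiver. The usual fix is to apply the fallback per record, along the lines of flux.concatMap(rec -> process(rec).onErrorResume(e -> saveToMongo(rec))), so only the failing payload is diverted. Below is a plain-Java sketch of that per-record semantics (no Kafka or Reactor dependency; the names process and the dead-letter list standing in for the Mongo save are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

public class PerRecordFallback {
    // Simulates per-record processing: throws for payloads that fail deserialization.
    static String process(String payload) {
        if (payload.startsWith("bad")) {
            throw new IllegalArgumentException("cannot deserialize: " + payload);
        }
        return payload.toUpperCase();
    }

    // The error is absorbed per record, so subsequent records keep flowing --
    // the analogue of concatMap(rec -> process(rec).onErrorResume(e -> saveToMongo(rec))).
    public static List<String> consumeAll(List<String> payloads, List<String> deadLetters) {
        List<String> results = new ArrayList<>();
        for (String p : payloads) {
            try {
                results.add(process(p));
            } catch (RuntimeException e) {
                deadLetters.add(p); // stand-in for saving the bad payload to Mongo
            }
        }
        return results;
    }

    public static void main(String[] args) {
        List<String> dead = new ArrayList<>();
        List<String> ok = consumeAll(List.of("a", "bad1", "b"), dead);
        System.out.println(ok);   // [A, B]
        System.out.println(dead); // [bad1]
    }
}
```

The key design point is where the fallback sits: inside the per-record inner publisher the error never reaches the outer Flux, so the receiver's subscription stays alive.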

Continue consuming subsequent records in reactor kafka after deserialization exception

Submitted by 半腔热情 on 2021-01-28 19:50:12
Question: I am using Reactor Kafka and have a custom AvroDeserializer class for deserializing messages. Now I have a case where, for certain payloads, the deserialization class throws an exception. My Kafka listener dies as soon as it tries to read such records. I tried handling this exception with onErrorReturn and with a combination of doOnError and onErrorContinue; this logged the exception but failed to consume subsequent records.

    public class AvroDeserializer<T extends
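The reason the reactive operators cannot help is that the exception is thrown inside the Kafka consumer's poll, before the record ever enters the reactive pipeline, so onErrorReturn/onErrorContinue never see it. A common fix is to make the deserializer itself fail-safe: catch the exception inside deserialize and return a marker (or null) that the pipeline can then route to Mongo. Spring Kafka ships ErrorHandlingDeserializer for exactly this; below is a self-contained sketch of the same idea, with a minimal stand-in for the org.apache.kafka.common.serialization.Deserializer interface and a hypothetical strict deserializer:

```java
import java.nio.charset.StandardCharsets;

public class FailSafeDeserialization {
    // Minimal stand-in for org.apache.kafka.common.serialization.Deserializer<T>.
    interface Deserializer<T> {
        T deserialize(String topic, byte[] data);
    }

    // Wraps any deserializer; on failure it returns null instead of throwing,
    // so the consumer keeps polling and the pipeline can divert null records.
    static <T> Deserializer<T> failSafe(Deserializer<T> delegate) {
        return (topic, data) -> {
            try {
                return delegate.deserialize(topic, data);
            } catch (RuntimeException e) {
                // Here one could also stash the raw bytes for a dead-letter store.
                return null;
            }
        };
    }

    // Toy strict deserializer that rejects payloads not starting with '{'.
    static final Deserializer<String> strict = (topic, data) -> {
        String s = new String(data, StandardCharsets.UTF_8);
        if (!s.startsWith("{")) {
            throw new IllegalStateException("bad payload: " + s);
        }
        return s;
    };

    public static void main(String[] args) {
        Deserializer<String> safe = failSafe(strict);
        System.out.println(safe.deserialize("t", "{\"ok\":1}".getBytes(StandardCharsets.UTF_8)));
        System.out.println(safe.deserialize("t", "garbage".getBytes(StandardCharsets.UTF_8))); // null
    }
}
```

With this in place the receiver survives bad payloads, and the stream can filter or branch on the null marker to persist the failure.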