spring-kafka

Handling exceptions in Kafka streams

爱⌒轻易说出口 · Submitted 2019-11-30 07:52:45
I have gone through multiple posts, but most of them cover handling bad messages, not exception handling while processing them. I want to know how to handle a message that has been received by the streams application when an exception occurs while processing it. The exception could have multiple causes, such as a network failure or a RuntimeException. Could someone suggest the right way to do this? Should I use setUncaughtExceptionHandler, or is there a better way? How should I handle retries? Thanks in advance!

Answer excerpt: it depends what you want to do with exceptions on …
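A minimal sketch of the two hooks Kafka Streams offers, assuming a plain StreamsBuilder topology (class and method names here are illustrative, not from the question): a deserialization exception handler for bad records, and setUncaughtExceptionHandler as a last resort for exceptions that escape processing.

```java
import java.util.Properties;

import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.errors.LogAndContinueExceptionHandler;

public class StreamsErrorHandlingSketch {

    public static KafkaStreams build(StreamsBuilder builder, Properties props) {
        // Skip records that cannot be deserialized instead of crashing the app.
        props.put(StreamsConfig.DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG,
                  LogAndContinueExceptionHandler.class);

        KafkaStreams streams = new KafkaStreams(builder.build(), props);

        // Last-resort hook: by the time this fires, the stream thread is
        // already dying, so use it for logging/alerting, not for retries.
        streams.setUncaughtExceptionHandler((thread, throwable) ->
                System.err.println("Stream thread " + thread.getName()
                        + " died: " + throwable));
        return streams;
    }
}
```

Retries for transient failures (e.g. network errors when calling an external service inside a processor) are usually handled inside the processing logic itself, since Streams will otherwise fail the task.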

Simple embedded Kafka test example with spring boot

会有一股神秘感。 · Submitted 2019-11-30 01:47:51
Edit, FYI: working GitHub example. I searched the internet and couldn't find a working, simple example of an embedded Kafka test. My setup is:

- Spring Boot
- multiple @KafkaListener methods with different topics in one class
- embedded Kafka for the test, which starts fine
- a test using KafkaTemplate that sends to the topic, but the @KafkaListener methods receive nothing, even after a long sleep

No warnings or errors are shown, only info spam from Kafka in the logs. Please help me. Most examples out there are over-configured or over-engineered; I am sure it can be done simply. Thanks, guys!
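A common cause of this symptom is that the listeners connect to the default localhost:9092 instead of the embedded broker. A minimal sketch (topic name and listener wiring are assumptions, not from the question) that points spring-kafka at the embedded broker via the spring.embedded.kafka.brokers property:

```java
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.test.context.TestPropertySource;

@SpringBootTest
@EmbeddedKafka(partitions = 1, topics = "test-topic")
// Crucial line: route the auto-configured clients to the embedded broker.
@TestPropertySource(properties =
        "spring.kafka.bootstrap-servers=${spring.embedded.kafka.brokers}")
class EmbeddedKafkaListenerTest {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    @Test
    void listenerReceivesMessage() {
        kafkaTemplate.send("test-topic", "hello");
        // Rather than sleeping, have the @KafkaListener count down a
        // CountDownLatch and await it here with a timeout.
    }
}
```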

Spring-Kafka : Issue while deserialising kafka message - class not in a “trusted package”?

主宰稳场 · Submitted 2019-11-29 12:39:37
I get the below exception because I produce from one project and consume from another, so obviously the packages are not the same. How can I fix this and ensure proper JSON serialization?

The class 'com.lte.assessment.assessments.AssessmentAttemptRequest' is not in the trusted packages: [java.util, java.lang, com.lte.assessmentanalytics.model]

Consumer config:

@EnableKafka
@Configuration
public class KafkaConfig {
    static Map<String, Object> config = new HashMap<>();
    static {
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        config.put(…
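A sketch of the usual fix, assuming JsonDeserializer is the value deserializer: add the producer-side package (or "*") to the deserializer's trusted packages so spring-kafka will instantiate the incoming type.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.support.serializer.JsonDeserializer;

public class TrustedPackagesSketch {

    static Map<String, Object> consumerConfig() {
        Map<String, Object> config = new HashMap<>();
        config.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "127.0.0.1:9092");
        config.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        config.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
        // Trust the package the producer's class lives in; "*" trusts
        // everything but is best avoided outside tests.
        config.put(JsonDeserializer.TRUSTED_PACKAGES, "com.lte.assessment.assessments");
        return config;
    }
}
```

An alternative when the two projects cannot share classes is to map the producer's type to a local class via type mappings, but the trusted-packages setting is what this particular exception is asking for.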

Why does a Kafka consumer take a long time to start consuming?

喜你入骨 · Submitted 2019-11-29 07:19:02
We start a Kafka consumer listening on a topic that may not yet exist (topic auto-creation is enabled, though). Not long thereafter, a producer publishes messages to that topic. However, it takes some time for the consumer to notice this: 5 minutes, to be exact. At that point the consumer revokes its partitions and rejoins the consumer group, and Kafka re-stabilizes the group. Looking at the timestamps of the consumer vs. the Kafka logs, this process is initiated on the consumer side. I …
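The 5-minute figure matches the default of the client's metadata.max.age.ms (300000 ms), after which the consumer refreshes its cluster metadata and discovers newly created topics. Assuming that is indeed the cause here, lowering the interval in the consumer configuration makes the topic visible sooner:

```
# Consumer client config: refresh metadata every 10 s instead of the
# default 300000 ms (5 minutes), so newly created topics are seen sooner.
metadata.max.age.ms=10000
```

The trade-off is more frequent metadata requests to the brokers, which is usually negligible at values like this.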

Apache Kafka: Replay messages in a topic

微笑、不失礼 · Submitted 2019-11-28 03:17:41
I'm considering using Apache Kafka as an event store for storing events within a microservice. One thing I read across various blogs is that Kafka can be considered a single source of truth, where the Kafka log stores all the events for a given topic. I was wondering whether Kafka has the ability to replay messages from the beginning of time (in case, for example, a hard-drive or network crash occurs)? (Note that I see there are some logs stored in the /tmp folder under …
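Kafka can replay whatever the broker still retains, which for event sourcing usually means configuring unlimited retention (or compaction) on the topic. A sketch of re-reading a topic from offset 0 with the plain consumer API (the topic and group names are illustrative):

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ReplaySketch {

    public static void replayAll() {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "replay-group"); // hypothetical
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("events"));
            consumer.poll(Duration.ofMillis(100));           // join group, get assignment
            consumer.seekToBeginning(consumer.assignment()); // rewind every partition
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            records.forEach(r -> System.out.println(r.offset() + ": " + r.value()));
        }
    }
}
```

Note that replay is bounded by the topic's retention settings; data deleted by retention (or the /tmp log directory being wiped on reboot) cannot be replayed, so an event store needs retention configured deliberately.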

Kafka producer TimeoutException: Expiring 1 record(s)

痞子三分冷 · Submitted 2019-11-28 02:42:35
I am using Kafka with Spring Boot. Kafka producer class:

@Service
public class MyKafkaProducer {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    private static Logger LOGGER = LoggerFactory.getLogger(NotificationDispatcherSender.class);

    // Send message
    public void sendMessage(String topicName, String message) throws Exception {
        LOGGER.debug("========topic Name===== " + topicName + "=========message=======" + message);
        ListenableFuture<SendResult<String, String>> result = …
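"Expiring 1 record(s)" means batched records sat in the producer's buffer longer than the client allowed, usually because the broker was unreachable or slow. Before tuning timeouts, it is worth verifying that bootstrap.servers and the broker's advertised.listeners actually resolve from the client. Assuming the broker is merely slow, the relevant producer settings are:

```
# Producer client config (values are examples, not recommendations).
# In clients >= 2.1, delivery.timeout.ms bounds the total time a record may
# spend between send() and success/failure, including retries:
delivery.timeout.ms=120000
# Per-request timeout for each broker round trip:
request.timeout.ms=30000
```

In older clients, request.timeout.ms alone governed batch expiry, which is why raising it is the commonly suggested fix for this exception.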

How to test a ConsumerAwareRebalanceListener?

别等时光非礼了梦想. · Submitted 2019-11-27 16:17:27
I developed a @KafkaListener that also implements the ConsumerAwareRebalanceListener interface, using Spring Boot 2.0.6. I implemented the onPartitionsAssigned method, in which I rewind the offset by a fixed amount of time, let's say 60 seconds. So far so good. How can I test this use case using the tools that Spring Kafka gives me? I suppose I need to start a Kafka broker (i.e., an EmbeddedKafka), then stop the listener and restart it, to test that it re-reads the messages that arrived in the last 60 seconds. Can somebody help me? I googled a little, but I didn't find …
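One way to drive this scenario, sketched under the assumption that the listener has an explicit id ("myListener" here is hypothetical): stop and restart the listener container through the KafkaListenerEndpointRegistry inside an @EmbeddedKafka test, which forces a rebalance and so re-triggers onPartitionsAssigned and the 60-second rewind.

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.config.KafkaListenerEndpointRegistry;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.MessageListenerContainer;

public class RebalanceListenerTestSketch {

    @Autowired
    private KafkaListenerEndpointRegistry registry;

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate;

    void restartListenerAndExpectReplay() {
        kafkaTemplate.send("test-topic", "first");   // consumed normally

        MessageListenerContainer container = registry.getListenerContainer("myListener");
        container.stop();
        container.start();   // rebalance fires onPartitionsAssigned -> rewind 60 s

        // Count deliveries inside the listener (e.g. with a CountDownLatch
        // or a collecting list) and assert that "first" arrives a second
        // time after the restart.
    }
}
```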