spring-kafka

Exception in Spring Kafka deserialization

陌路散爱 submitted on 2019-12-24 23:15:03
Question: I am creating a custom deserializer for a Kafka consumer, but I am getting the exception below - 2019-04-05 16:36:51.064 ERROR 13256 --- [ntainer#0-0-C-1] o.s.kafka.listener.LoggingErrorHandler : Error while processing: ConsumerRecord(topic = freshTopic, partition = 0, offset = 229860, CreateTime = 1554462411064, serialized key size = -1, serialized value size = 214, headers = RecordHeaders(headers = [], isReadOnly = false), key = null, value = {"date":null,"deviceAddress":"10.95.251.8","iPAddress":"
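A minimal sketch of what such a custom value deserializer typically looks like; the DeviceEvent DTO and the Jackson ObjectMapper usage are illustrative assumptions, not the asker's actual classes, and it assumes kafka-clients 2.x where configure() and close() have default implementations:

import java.io.IOException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.common.errors.SerializationException;
import org.apache.kafka.common.serialization.Deserializer;

public class DeviceEventDeserializer implements Deserializer<DeviceEvent> {

    private final ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public DeviceEvent deserialize(String topic, byte[] data) {
        try {
            // Map the raw JSON bytes onto the target DTO; return null for tombstone records
            return data == null ? null : objectMapper.readValue(data, DeviceEvent.class);
        } catch (IOException e) {
            throw new SerializationException("Failed to deserialize payload from topic " + topic, e);
        }
    }
}

The consumer config would then point value.deserializer (ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG) at this class.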

Cannot mark DEFAULT_STREAMS_CONFIG_BEAN_NAME as Primary

好久不见. submitted on 2019-12-24 20:32:54
Question: I just upgraded to spring-boot 2.1.3.RELEASE and I cannot have more than one StreamsBuilderFactoryBean because of this new class/method (kafkaStreamsFactoryBeanConfigurer requires exactly one factory bean): @Configuration @ConditionalOnClass(StreamsBuilder.class) @ConditionalOnBean(name = KafkaStreamsDefaultConfiguration.DEFAULT_STREAMS_BUILDER_BEAN_NAME) class KafkaStreamsAnnotationDrivenConfiguration { //... @Bean public KafkaStreamsFactoryBeanConfigurer kafkaStreamsFactoryBeanConfigurer(
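For context, a second streams builder is usually declared as an additional StreamsBuilderFactoryBean alongside the auto-configured default. The sketch below only illustrates that declaration (application id, servers and bean name are made up); it is not a fix for the auto-configuration clash described above:

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.streams.StreamsConfig;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.KafkaStreamsConfiguration;
import org.springframework.kafka.config.StreamsBuilderFactoryBean;

@Configuration
public class SecondStreamsConfig {

    @Bean("secondStreamsBuilder")
    public StreamsBuilderFactoryBean secondStreamsBuilder() {
        Map<String, Object> props = new HashMap<>();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "second-streams-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // A separate factory bean with its own StreamsConfig, independent of the default one
        return new StreamsBuilderFactoryBean(new KafkaStreamsConfiguration(props));
    }
}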

How to catch errors from the Spring Integration error channel inside Spring Cloud Stream?

假装没事ソ submitted on 2019-12-24 20:29:22
Question: I'm trying to create an application-level error handler for failures during processing inside an SCS application using Kafka as the message broker. I know that SCS already provides the DLQ functionality, but in my case I want to wrap failed messages with a custom wrapper type (providing the failure context: source, cause etc.). In https://github.com/qabbasi/Spring-Cloud-Stream-DLQ-Error-Handling you can see two approaches for this scenario: one is using SCS and the other one directly Spring
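One commonly cited way to hook into the Spring Integration error channel from an SCS application is a @ServiceActivator on the global errorChannel; the sketch below is a hedged illustration, and the handler class name and the wrapping step are assumptions:

import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.ErrorMessage;
import org.springframework.stereotype.Component;

@Component
public class StreamErrorHandler {

    // Receives ErrorMessages published to the global "errorChannel" when listener processing fails
    @ServiceActivator(inputChannel = "errorChannel")
    public void handle(ErrorMessage errorMessage) {
        Message<?> failed = errorMessage.getOriginalMessage(); // the message that failed, if available
        Throwable cause = errorMessage.getPayload();           // usually a MessagingException carrying the cause
        // e.g. wrap 'failed' and 'cause' in the custom wrapper type and republish it
    }
}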

How to configure Kafka to repeat uncommitted offset messages?

不羁岁月 submitted on 2019-12-24 18:35:05
Question: Suppose I have these messages on my topic: [A, A, B, A, B]. A is processed successfully by my application, but B throws an unexpected exception. I thought setting ackOnError to false would prevent the offset from being committed, and thus the listener would repeat processing the same message until it is processed without exceptions. I have set both enable.auto.commit and ackOnError to false, but the listener still jumps to the next message, no matter whether the message is A or B. How can I accomplish
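A hedged sketch of a listener container factory that seeks back to the failed record so it is redelivered on the next poll; the SeekToCurrentErrorHandler shown here is one way to get that behaviour in spring-kafka 2.x, and the bean names and generic types are illustrative assumptions:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.listener.SeekToCurrentErrorHandler;

@Configuration
public class RetryOnErrorConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        // On an exception, seek back to the failed record so the same message is redelivered
        factory.setErrorHandler(new SeekToCurrentErrorHandler());
        return factory;
    }
}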

How to change log levels of a 3rd-party library in Java

浪子不回头ぞ submitted on 2019-12-24 09:05:31
Question: The console log is cluttered with logs from 3rd-party libraries. For example, my project uses the Kafka and ZooKeeper client libraries, and because of this there are too many logs from them: 2018-05-08 10:30:38.250 INFO 2968 --- [0:0:0:0:1:2181)] org.apache.zookeeper.ClientCnxn : Opening socket connection to server 0:0:0:0:0:0:0:1/0:0:0:0:0:0:0:1:2181. Will not attempt to authenticate using SASL (unknown error) 2018-05-08 10:30:38.309 INFO 2968 --- [ main] o.a.k.clients.producer.ProducerConfig :
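Assuming the project is a Spring Boot application (the log format above suggests it), per-package log levels can be raised in application.yaml; the packages below are the ones visible in the quoted log and the WARN level is just an example:

# application.yaml
logging:
  level:
    org.apache.zookeeper: WARN
    org.apache.kafka: WARN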

Kafka consumer stopped abruptly after getting exception

心已入冬 submitted on 2019-12-24 06:37:25
Question: Below is the code snippet. KafkaConsumerConfig class: public ConsumerFactory<String, String> consumerFactory() { Map<String, Object> props = new HashMap<>(); props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9093"); props.put(ConsumerConfig.GROUP_ID_CONFIG, "consumerGroupId"); props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false); props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, 10000); props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, 10); props.put(ConsumerConfig.MAX_POLL
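For reference, a self-contained version of such a consumer factory looks roughly like the following; since the original config is cut off above, the deserializer choices and the remaining properties are assumptions:

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9093");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "consumerGroupId");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, 10);
        // Deserializers are assumptions; the original snippet is truncated before this point
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }
}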

Spring Boot: excluding some autoconfigured beans

馋奶兔 submitted on 2019-12-24 02:28:27
Question: I have a Spring Boot project that uses spring-kafka. In this project I've built some event-driven components that wrap spring-kafka beans (namely KafkaTemplate and ConcurrentKafkaListenerContainer). I want to make this project a reusable library across a set of Spring Boot applications. But when I add a dependency on this library from a Spring Boot app, I get an error at app startup: APPLICATION FAILED TO START Description: Parameter 1 of method kafkaListenerContainerFactory in org
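One way to keep the library's wrapped Kafka beans from colliding with Spring Boot's auto-configured ones is to exclude the Kafka auto-configuration in the consuming application; this is only a hedged sketch of that option, with a made-up application class name:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration;

// Disable Boot's Kafka auto-configuration so only the library's wrapped beans are created
@SpringBootApplication(exclude = KafkaAutoConfiguration.class)
public class ConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(ConsumerApplication.class, args);
    }
}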

Spring Integration Kafka vs Spring Kafka

社会主义新天地 submitted on 2019-12-24 01:47:05
Question: We are trying to implement a message channel between a worker Spring application and a consumer Spring application (there will be replicas of the same consumer on multiple JVMs). With Java config there is limited documentation for Spring Integration, and I was able to find documentation for Spring Kafka. I am not exactly sure how the dependency works: is Spring Integration Kafka based on Spring Kafka? Please give me an idea on this. Where can I find proper documentation for
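To illustrate the layering, spring-integration-kafka's inbound adapter is built around a spring-kafka listener container; the sketch below shows that relationship with made-up topic, channel and bean names:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.kafka.inbound.KafkaMessageDrivenChannelAdapter;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.listener.ContainerProperties;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;

@Configuration
public class WorkerChannelConfig {

    // spring-kafka provides the listener container...
    @Bean
    public KafkaMessageListenerContainer<String, String> container(ConsumerFactory<String, String> cf) {
        return new KafkaMessageListenerContainer<>(cf, new ContainerProperties("worker-topic"));
    }

    // ...and spring-integration-kafka wraps it to feed a Spring Integration channel
    @Bean
    public KafkaMessageDrivenChannelAdapter<String, String> adapter(
            KafkaMessageListenerContainer<String, String> container) {
        KafkaMessageDrivenChannelAdapter<String, String> adapter =
                new KafkaMessageDrivenChannelAdapter<>(container);
        adapter.setOutputChannelName("fromKafka");
        return adapter;
    }
}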

Spring Kafka: is Acknowledgement.acknowledge thread safe?

我与影子孤独终老i submitted on 2019-12-23 20:47:17
Question: I am implementing a Kafka-based application where I would like to manually acknowledge incoming messages. The architecture forces me to do it in a separate thread. The question is: is it possible and safe to execute Acknowledgement.acknowledge() in a different thread than the consumer? Answer 1: Yes it is, as long as you use MANUAL and not MANUAL_IMMEDIATE, but I don't think you'll get what you expect. Kafka doesn't track each message, just offsets within the partition. Let's say message 1 arrives and
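A hedged sketch of what the answer describes, with MANUAL ack mode and the acknowledgment completed on a worker thread; the listener class, executor, topic name and process() method are assumptions:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.stereotype.Component;

@Component
public class DeferredAckListener {

    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    // Requires the container factory to use ContainerProperties.AckMode.MANUAL (not MANUAL_IMMEDIATE)
    @KafkaListener(topics = "work-topic")
    public void listen(String payload, Acknowledgment ack) {
        executor.submit(() -> {
            process(payload);
            // With MANUAL, the ack is queued and the offset commit happens on the consumer thread
            ack.acknowledge();
        });
    }

    private void process(String payload) {
        // placeholder for the actual business logic
    }
}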

The correct way to create a KafkaTemplate in Spring Boot

泪湿孤枕 submitted on 2019-12-23 12:21:12
Question: I am trying to configure Apache Kafka in a Spring Boot application. I read this documentation and followed the steps: 1) I added these lines to application.yaml : spring: kafka: bootstrap-servers: kafka_host:9092 producer: key-serializer: org.apache.kafka.common.serialization.StringDeserializer value-serializer: org.apache.kafka.common.serialization.ByteArraySerializer 2) I created a new Topic: @Bean public NewTopic responseTopic() { return new NewTopic("new-topic", 5, (short) 1); } And now I want to use
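When Spring Boot's Kafka auto-configuration is active, a KafkaTemplate built from the spring.kafka.* properties can simply be injected; the sketch below is one illustrative way to use it (the service class name and send call are assumptions, typed per the ByteArraySerializer quoted above). Note that the quoted key-serializer names StringDeserializer, which is a deserializer class; a producer needs a serializer class there:

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class ResponseSender {

    // Injected from Spring Boot's Kafka auto-configuration, built from the spring.kafka.* properties
    private final KafkaTemplate<String, byte[]> kafkaTemplate;

    public ResponseSender(KafkaTemplate<String, byte[]> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void send(byte[] payload) {
        kafkaTemplate.send("new-topic", payload);
    }
}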