spring-kafka

How to use “li-apache-kafka-clients” in a Spring Boot app to send large messages (above 1 MB) from a Kafka producer?

落花浮王杯 submitted on 2020-05-17 08:06:07
Question: How do I use li-apache-kafka-clients in a Spring Boot app to send large messages (above 1 MB) from a Kafka producer to a Kafka consumer? Here is the GitHub link for li-apache-kafka-clients: https://github.com/linkedin/li-apache-kafka-clients I have imported the li-apache-kafka-clients .jar file and set the following producer configuration: props.put("large.message.enabled", "true"); props.put("max.message.segment.bytes", 1000 * 1024); props.put("segment.serializer", DefaultSegmentSerializer.class
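A minimal sketch of the producer wiring this question is aiming for, following the configuration keys shown above and the usage described in the project's README; the class and package names (LiKafkaProducerImpl, DefaultSegmentSerializer) come from the li-apache-kafka-clients project and may differ between versions:

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.StringSerializer;

import com.linkedin.kafka.clients.largemessage.DefaultSegmentSerializer;
import com.linkedin.kafka.clients.producer.LiKafkaProducer;
import com.linkedin.kafka.clients.producer.LiKafkaProducerImpl;

public class LargeMessageProducerFactory {

    public LiKafkaProducer<String, String> createProducer() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092");              // assumed broker address
        props.setProperty("key.serializer", StringSerializer.class.getName());
        props.setProperty("value.serializer", StringSerializer.class.getName());
        // Large-message segmentation settings from the question: records larger than
        // max.message.segment.bytes are split into segments and reassembled by the consumer.
        props.setProperty("large.message.enabled", "true");
        props.setProperty("max.message.segment.bytes", String.valueOf(1000 * 1024));
        props.setProperty("segment.serializer", DefaultSegmentSerializer.class.getName());
        return new LiKafkaProducerImpl<>(props);
    }
}
```

Note that the consuming side needs the matching LiKafkaConsumer so the segments can be reassembled, and the broker's message.max.bytes still caps the size of each individual segment.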

ssl.keystore.location can't find the JKS file in my Kubernetes secrets mount

跟風遠走 submitted on 2020-05-17 07:45:39
Question: I have created a secret for my JKS file under the volume mount /etc/secrets/keystore. I access the JKS file path as an environment variable, so ssl.keystore.location resolves to file:///etc/secrets/keystore/ssl.jks. But I get an exception from the SSL engine builder saying the keystore's modification time could not be found, along with java.nio.file.NoSuchFileException: file:/etc/secrets/keystore/ssl.jks Answer 1: Remove the file:// . The keystore is opened by the Kafka client, not Spring. Kafka knows nothing
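A minimal sketch of the corrected properties, following the answer: pass a plain filesystem path, because the Kafka client (not Spring) opens the keystore and does not understand Spring's file: resource notation. The password value is a placeholder:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.common.config.SslConfigs;

public class KafkaSslProps {

    public Map<String, Object> sslProps() {
        Map<String, Object> props = new HashMap<>();
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SSL");
        // Plain path into the mounted secret, no file:// prefix.
        props.put(SslConfigs.SSL_KEYSTORE_LOCATION_CONFIG, "/etc/secrets/keystore/ssl.jks");
        props.put(SslConfigs.SSL_KEYSTORE_PASSWORD_CONFIG, "changeit"); // placeholder password
        return props;
    }
}
```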

Spring Kafka without Spring Boot: consumer not consuming messages

ぐ巨炮叔叔 submitted on 2020-05-17 06:22:26
Question: My consumer uses Spring's JavaConfig class as follows: @Configuration @EnableKafka public class KafkaConfig { public static final String TOPIC = "test-1"; private String bootstrapServers = "localhost:9092"; @Bean public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() { ConcurrentKafkaListenerContainerFactory<String, String> factory = new ConcurrentKafkaListenerContainerFactory<>(); factory.setConsumerFactory(consumerFactory()); return factory; } @Bean
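The excerpt stops before the consumerFactory() bean it references; a minimal sketch of what that bean typically looks like without Spring Boot is below. The group id and auto.offset.reset values are assumptions, and the @KafkaListener bean must itself be registered in the same application context for the listener annotation to be processed:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;

class KafkaConsumerFactoryConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "test-group");        // assumed group id
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest"); // read existing records on first start
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }
}
```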

How to expose Kafka metrics to /actuator/metrics with Spring Boot 2

匆匆过客 submitted on 2020-05-14 18:13:48
Question: I have been looking for a while and couldn't find the answer. I'm using Spring Boot 2 and Spring Kafka 2.1.4, and I want to see the Kafka consumer metrics in the /metrics endpoint of Spring Boot Actuator. What I don't understand is: should I implement the exposure myself, or does this come out of the box in Boot 2? If I have to implement it myself, what is the best way to do it? Answer 1: Targeted for Micrometer v1.1.0 is the KafkaConsumerMetrics implementation of MeterBinder. This should expose the kafka
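A minimal sketch of what the answer points at, assuming Micrometer 1.1.0 or later is on the classpath: register the KafkaConsumerMetrics binder as a bean and Spring Boot binds it to the actuator MeterRegistry, after which the consumer metrics (read from the client's JMX MBeans) appear under /actuator/metrics:

```java
import io.micrometer.core.instrument.binder.MeterBinder;
import io.micrometer.core.instrument.binder.kafka.KafkaConsumerMetrics;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class KafkaMetricsConfig {

    @Bean
    public MeterBinder kafkaConsumerMetrics() {
        // Spring Boot auto-binds every MeterBinder bean to the actuator's MeterRegistry.
        return new KafkaConsumerMetrics();
    }
}
```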

Spring @KafkaListener and concurrency

浪子不回头ぞ submitted on 2020-05-14 07:52:25
Question: I am working with Spring Boot + Spring's @KafkaListener. The behavior I expect is that my Kafka listener reads messages in 10 threads, so that if one thread hangs, the other threads continue reading and handling messages. I defined the following bean: @Bean public ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory( ConcurrentKafkaListenerContainerFactoryConfigurer configurer, ConsumerFactory<Object, Object> kafkaConsumerFactory) { ConcurrentKafkaListenerContainerFactory
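A minimal sketch of how the truncated bean is usually completed for this goal: setting the factory's concurrency to 10 starts ten single-threaded listener containers. Each container owns its own partitions, so the topic needs at least ten partitions, and a hung thread only stalls the partitions assigned to it:

```java
import org.springframework.boot.autoconfigure.kafka.ConcurrentKafkaListenerContainerFactoryConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;

class ListenerConcurrencyConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<?, ?> kafkaListenerContainerFactory(
            ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
            ConsumerFactory<Object, Object> kafkaConsumerFactory) {
        ConcurrentKafkaListenerContainerFactory<Object, Object> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        configurer.configure(factory, kafkaConsumerFactory);
        factory.setConcurrency(10); // ten listener containers, one consumer thread each
        return factory;
    }
}
```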

Spring Kafka - Manual acknowledgement

拟墨画扇 submitted on 2020-05-14 05:18:07
Question: I have a Spring Boot application which listens to a Kafka stream and sends each record to a service for further processing. The service might fail sometimes. The exception scenario is mentioned in the comments. For now, I mocked the service's success and exception scenarios myself. Listener code: @Autowired PlanitService service @KafkaListener( topics = "${app.topic}", groupId = "notifGrp", containerFactory = "storeKafkaListener") public void processStoreNotify(StoreNotify store) throws
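A minimal sketch of a manually acknowledged listener for this scenario, assuming the storeKafkaListener container factory is configured with AckMode.MANUAL (or MANUAL_IMMEDIATE); StoreNotify and the downstream service call are the question's own types, stubbed here only to show where the acknowledgement goes:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.Acknowledgment;

class StoreNotify { } // stub for the question's payload type

class StoreNotifyListener {

    @KafkaListener(topics = "${app.topic}", groupId = "notifGrp",
            containerFactory = "storeKafkaListener")
    public void processStoreNotify(StoreNotify store, Acknowledgment ack) {
        try {
            // service.process(store);  // hypothetical downstream call that may fail
            ack.acknowledge();          // commit the offset only after the call succeeds
        } catch (Exception e) {
            // Leave the record unacknowledged so its offset is not committed;
            // redelivery then depends on the configured error handler or a restart/rebalance.
        }
    }
}
```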

Spring Boot / Kafka Json Deserialization - Trusted Packages

蓝咒 submitted on 2020-05-13 04:47:36
Question: I am just starting to use Kafka with Spring Boot and want to send and consume JSON objects. I am getting the following error when I attempt to consume a message from the Kafka topic: org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition dev.orders-0 at offset 9903. If needed, please seek past the record to continue consumption. Caused by: java.lang.IllegalArgumentException: The class 'co.orders.feedme.feed.domain.OrderItem' is not in the trusted
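A minimal sketch of one common fix, assuming the JSON value deserializer is built in code: add the package from the exception to the deserializer's trusted packages (the same thing can be done via the spring.json.trusted.packages property). The bootstrap server and group id below are placeholders, and OrderItem is the question's own domain class:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.JsonDeserializer;

// import co.orders.feedme.feed.domain.OrderItem;  // the question's domain class

class OrdersConsumerConfig {

    public DefaultKafkaConsumerFactory<String, OrderItem> orderItemConsumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders");                  // placeholder
        JsonDeserializer<OrderItem> valueDeserializer = new JsonDeserializer<>(OrderItem.class);
        // Whitelist the package named in the IllegalArgumentException.
        valueDeserializer.addTrustedPackages("co.orders.feedme.feed.domain");
        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), valueDeserializer);
    }
}
```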

Spring Kafka JsonDeserialization MessageConversionException: failed to resolve class name, class not found

笑着哭i submitted on 2020-05-12 02:56:27
Question: I have two services that should communicate via Kafka. Let's call the first service WriteService and the second service QueryService. On the WriteService side, I have the following producer configuration: @Configuration public class KafkaProducerConfiguration { @Value("${spring.kafka.bootstrap-servers}") private String bootstrapServers; @Bean public Map<String, Object> producerConfigs() { Map<String, Object> props = new HashMap<>(); // list of host:port pairs used for establishing the
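A minimal sketch of one common resolution for this error, using the JsonSerializer/JsonDeserializer property constants available in recent spring-kafka versions: the producer writes its own class name into the type header, which the consuming service cannot load, so either drop the header on the WriteService side or have QueryService deserialize into its own DTO and ignore the header. The DTO and package names below are placeholders:

```java
import java.util.HashMap;
import java.util.Map;

import org.springframework.kafka.support.serializer.JsonDeserializer;
import org.springframework.kafka.support.serializer.JsonSerializer;

class CrossServiceJsonTypeConfig {

    // WriteService (producer) side: don't embed the producer's class name in the record headers.
    Map<String, Object> producerTypeProps() {
        Map<String, Object> props = new HashMap<>();
        props.put(JsonSerializer.ADD_TYPE_INFO_HEADERS, false);
        return props;
    }

    // QueryService (consumer) side: always map the payload onto the consumer's own DTO.
    Map<String, Object> consumerTypeProps() {
        Map<String, Object> props = new HashMap<>();
        props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.queryservice.dto.OrderDto"); // placeholder class
        props.put(JsonDeserializer.USE_TYPE_INFO_HEADERS, false);
        props.put(JsonDeserializer.TRUSTED_PACKAGES, "com.queryservice.dto");            // placeholder package
        return props;
    }
}
```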