spring-cloud-stream

Application runtime exceptions are not being sent to errorChannel, and ServiceActivator is not able to listen on errorChannel

Submitted by 萝らか妹 on 2019-12-13 02:47:36
Question: After listening on a Kafka topic using @StreamListener, upon a RuntimeException neither the global errorChannel nor the topic-specific error channel (topic.group.errors) receives any error message, and the @ServiceActivator receives nothing. POM dependencies (Greenwich.RELEASE): <dependency> <groupId>org.springframework.cloud</groupId> <artifactId>spring-cloud-stream</artifactId> </dependency> <dependency> <groupId>org.springframework.cloud</groupId> <artifactId>spring-cloud-stream-schema</artifactId> </dependency>
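A minimal sketch of how such error-channel listeners are typically wired, assuming the destination/group names "topic" and "group" from the question (binding-specific error channels follow the `<destination>.<group>.errors` naming convention). One common cause of silence on these channels is that errors only reach them after the consumer's retries are exhausted (maxAttempts defaults to 3), so setting `spring.cloud.stream.bindings.<binding>.consumer.max-attempts=1` makes failures surface immediately.

```java
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Sink;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.messaging.support.ErrorMessage;

// Hedged sketch: channel names are placeholders taken from the question.
@EnableBinding(Sink.class)
public class ErrorHandlers {

    // Binding-specific error channel: <destination>.<group>.errors
    @ServiceActivator(inputChannel = "topic.group.errors")
    public void handleTopicErrors(ErrorMessage errorMessage) {
        // The failed message and cause are carried on the ErrorMessage payload
        System.err.println("Binding error: " + errorMessage.getPayload());
    }

    // Global error channel
    @ServiceActivator(inputChannel = "errorChannel")
    public void handleGlobalErrors(ErrorMessage errorMessage) {
        System.err.println("Global error: " + errorMessage.getPayload());
    }
}
```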

Spring Cloud Stream does not create a queue

Submitted by 此生再无相见时 on 2019-12-13 00:18:55
Question: I'm trying to configure a simple Spring Cloud Stream application with RabbitMQ. The code I use is mostly taken from spring-cloud-stream-samples. I have an entry point: @SpringBootApplication public class DemoApplication { public static void main(String[] args) { SpringApplication.run(DemoApplication.class, args); } } and a simple message producer from the example: @EnableBinding(Source.class) public class SourceModuleDefinition { private String format = "yyyy-MM-dd HH:mm:ss"; @Bean
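A likely explanation: with the RabbitMQ binder, a producer declares only the exchange; a durable queue is created when a consumer binds with a group, or when the producer declares required groups up front. A hedged configuration sketch, assuming the sample's default "output" binding name and an illustrative destination:

```yaml
# Hedged sketch: binding and destination names are illustrative.
spring:
  cloud:
    stream:
      bindings:
        output:
          destination: demo-topic
          producer:
            # Pre-declares the durable queue demo-topic.demo-group at startup,
            # so messages are not lost before any consumer starts.
            required-groups: demo-group
```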

Read stream from input channel

Submitted by 人走茶凉 on 2019-12-12 23:13:09
Question: Is it possible to read the stream from an input channel without creating a new function annotated with @StreamListener? I'm using Spring Cloud Stream. Thanks! EDIT: Actually, I'm creating a microservice where I have this method: @RequestMapping(method = RequestMethod.POST, value = "/annonces") public void addAnnonce(@RequestBody AnnonceWrapper annonceWrapper) { final Message<AnnonceWrapper> message = MessageBuilder .withPayload(annonceWrapper) .setReplyChannel(messageStream
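One way to consume without @StreamListener is a polled consumer: Spring Cloud Stream (2.0+) can bind an input to a PollableMessageSource that you poll on demand. A hedged sketch, with illustrative interface, channel, and endpoint names:

```java
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.binder.PollableMessageSource;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Hedged sketch: names ("PolledSink", "annonceIn", "/poll") are illustrative.
interface PolledSink {
    @Input("annonceIn")
    PollableMessageSource input();
}

@EnableBinding(PolledSink.class)
@RestController
public class PolledReader {

    private final PollableMessageSource source;

    public PolledReader(PollableMessageSource source) {
        this.source = source;
    }

    @GetMapping("/poll")
    public String pollOne() {
        // poll() hands at most one message to the handler and returns
        // whether a message was available.
        boolean received = source.poll(m ->
                System.out.println("received: " + m.getPayload()));
        return received ? "handled one message" : "no message available";
    }
}
```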

@PostConstruct and autowiring a MessageChannel

Submitted by 北城以北 on 2019-12-12 17:12:11
Question: I'm having a problem with Spring Cloud Stream. The thing is that I have a bean that will write to Kafka as soon as it's created (a method annotated with @PostConstruct), so I autowire the appropriate MessageChannel and set the destination and binder properties in application.yml. It goes like this: @Component @RequiredArgsConstructor public class Sender { private final MessageChannel output; @PostConstruct public void start() { output.send(new GenericMessage<>("Hello world")); } } And application
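A plausible fix sketch: @PostConstruct runs while the context is still being built, before the output binding is started, so the send can race the binder. Deferring the send until the application is fully started avoids this; the class below mirrors the question's Sender but listens for ApplicationReadyEvent instead:

```java
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.messaging.MessageChannel;
import org.springframework.messaging.support.GenericMessage;
import org.springframework.stereotype.Component;

// Hedged sketch based on the question's Sender class.
@Component
public class Sender {

    private final MessageChannel output;

    public Sender(MessageChannel output) {
        this.output = output;
    }

    // Fires once the context is refreshed and bindings are started,
    // unlike @PostConstruct which runs during bean initialization.
    @EventListener(ApplicationReadyEvent.class)
    public void start() {
        output.send(new GenericMessage<>("Hello world"));
    }
}
```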

Publish null/tombstone message with raw headers

Submitted by 为君一笑 on 2019-12-12 05:12:38
Question: I am building a Spring Cloud Stream Kafka processor app that will consume raw data with a String key and sometimes a null payload from a Kafka topic. I want to produce to another topic a String key and a null payload (known as a tombstone within Kafka). In order to use raw headers on the message, I need to output a byte[], but if I encode KafkaNull.INSTANCE into a byte[] it will literally output a String of the object hashcode. If I try to send anything other than a byte[], I can't use
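A hedged sketch of one approach: enable native encoding on the producer binding so the Kafka serializers (rather than the stream message converters) handle the payload, and send KafkaNull.INSTANCE, which Spring Kafka translates to a true null record value. Binding and key names here are assumptions, not taken from the question's code:

```java
import java.nio.charset.StandardCharsets;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.kafka.support.KafkaNull;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

// Hedged sketch: assumes the producer binding is configured with
// spring.cloud.stream.kafka.bindings.output.producer.useNativeEncoding=true
// so the converters do not try to serialize KafkaNull themselves.
public class TombstoneExample {

    public static Message<KafkaNull> tombstoneFor(String key) {
        return MessageBuilder
                .withPayload(KafkaNull.INSTANCE)   // becomes a null record value
                .setHeader(KafkaHeaders.MESSAGE_KEY,
                        key.getBytes(StandardCharsets.UTF_8))
                .build();
    }
}
```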

Can I bind to multiple consumer groups with Spring Cloud Stream?

Submitted by 假如想象 on 2019-12-12 04:31:18
Question: I am writing an application that will process event messages (posted to the topic file-upload-completed). I have multiple endpoints that should consume these messages (metadata-reader and quota-checker), and for pragmatic reasons I would like to deploy these endpoints together in an aggregated package. With Spring Cloud Stream, I could use spring.cloud.stream.bindings.file-upload-completed.group=metadata-reader to set the consumer group for the first endpoint; I would also like to process
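One way to get two consumer groups on the same destination inside one app is to declare two input bindings and point both at the same topic with different groups. A hedged sketch; the binding names "metadataIn" and "quotaIn" are illustrative:

```java
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.messaging.SubscribableChannel;

/*
 * Hedged sketch. Matching configuration (names illustrative):
 *
 * spring:
 *   cloud:
 *     stream:
 *       bindings:
 *         metadataIn:
 *           destination: file-upload-completed
 *           group: metadata-reader
 *         quotaIn:
 *           destination: file-upload-completed
 *           group: quota-checker
 *
 * Each binding gets its own consumer group, so every event is delivered
 * once to each endpoint.
 */
public interface MultiGroupSink {

    @Input("metadataIn")
    SubscribableChannel metadataIn();

    @Input("quotaIn")
    SubscribableChannel quotaIn();
}
```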

Configuring Spring Cloud Stream in Camden.SR5 with Spring boot 1.5.1

Submitted by ╄→гoц情女王★ on 2019-12-12 03:58:09
Question: First off, thanks to the Spring team for all their work pushing this forward! Now that Camden.SR5 is official, I have some questions on how to correctly configure the Spring Cloud Stream Kafka binder when using Spring Boot 1.5.1. Spring Boot 1.5.1 has auto-configuration for Kafka, and those configuration options seem to be redundant with those in the Spring Cloud Stream Kafka binder. Do we use the core Spring Boot properties (spring.kafka.*) or do we use (spring.cloud.stream.kafka.binder.
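For orientation, the binder's own clients are configured through the binder namespace; a minimal hedged fragment (broker address is a placeholder):

```yaml
# Hedged sketch: the Spring Cloud Stream Kafka binder reads its connection
# settings from its own namespace, separately from Boot's spring.kafka.*
# auto-configuration (which configures KafkaTemplate/@KafkaListener beans).
spring:
  cloud:
    stream:
      kafka:
        binder:
          brokers: localhost:9092
```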

Spring Cloud Stream default custom message headers

Submitted by ε祈祈猫儿з on 2019-12-12 03:54:44
Question: Is there a way to configure the default Message<T> headers when the message is generated from the method return value: @Publisher(channel = "theChannelname") public MyObject someMethod(Object param) { ... return myObject; } or @SendTo("theChannelname") public MyObject someMethod(Object param) { ... return myObject; } In the examples above the Message<MyObject> will be automatically generated. So, how can I control the default message generation? Answer 1: You can do that via the @Header annotation for
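An alternative sketch: instead of returning the bare payload and letting the framework generate the Message, return a Message<MyObject> built explicitly, which gives full control over headers. The header name and MyObject construction below are illustrative:

```java
import org.springframework.integration.annotation.Publisher;
import org.springframework.messaging.Message;
import org.springframework.messaging.support.MessageBuilder;

// Hedged sketch: returning a Message instead of the raw payload lets the
// caller set headers explicitly; header names here are illustrative.
public class HeaderControlExample {

    @Publisher(channel = "theChannelname")
    public Message<MyObject> someMethod(Object param) {
        MyObject myObject = new MyObject();
        return MessageBuilder.withPayload(myObject)
                .setHeader("sourceParam", String.valueOf(param))
                .build();
    }

    static class MyObject { }
}
```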

Missing schema module for spring-cloud-stream

Submitted by a 夏天 on 2019-12-12 01:13:40
Question: Trying to use the following example from the Spring docs: @Bean public MessageConverter userMessageConverter() throws IOException { return new AvroSchemaMessageConverter(MimeType.valueOf("avro/bytes")); } Using Gradle as follows: buildscript { ext { springBootVersion = '1.4.2.RELEASE' } dependencies { classpath("org.springframework.boot:spring-boot-gradle-plugin:${springBootVersion}") } } apply plugin: 'org.springframework.boot' dependencies {
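As the title suggests, AvroSchemaMessageConverter lives in the separate schema module, so the build needs that dependency. A hedged Gradle fragment (version management via the Spring Cloud BOM is assumed rather than shown):

```
// Hedged sketch: assumes the Spring Cloud dependency BOM supplies the version.
dependencies {
    compile 'org.springframework.cloud:spring-cloud-stream-schema'
}
```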

Spring-Cloud-Stream-Kafka Custom Health check not working

Submitted by 一笑奈何 on 2019-12-11 21:48:36
Question: I am using spring-cloud-stream-kafka in my Spring Boot (consumer) application. The health of the app is inaccurate: it reports 'UP' even when the app can't connect to Kafka (the Kafka broker is down). I have read articles on Kafka health checks; it looks like the Kafka health check is disabled in the Spring Actuator health check. So I managed to write the following code to enable a Kafka health check for my app. I think I am missing some connection between the app config and my code, and I don't see the Kafka health
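A hedged sketch of a custom Actuator HealthIndicator that probes the broker directly with an AdminClient; the bootstrap address and timeout are placeholders. Registering it as a @Component is enough for Actuator to pick it up, provided the health endpoint is exposed:

```java
import java.util.Properties;
import java.util.concurrent.TimeUnit;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.springframework.boot.actuate.health.Health;
import org.springframework.boot.actuate.health.HealthIndicator;
import org.springframework.stereotype.Component;

// Hedged sketch: broker address and timeout are illustrative placeholders.
@Component
public class KafkaHealthIndicator implements HealthIndicator {

    @Override
    public Health health() {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        try (AdminClient client = AdminClient.create(props)) {
            // If the cluster is unreachable, this times out and we report DOWN.
            client.describeCluster().nodes().get(2, TimeUnit.SECONDS);
            return Health.up().build();
        } catch (Exception e) {
            return Health.down(e).build();
        }
    }
}
```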