spring-cloud-stream

Spring Cloud Stream - send message after application initialization

主宰稳场 · submitted 2019-12-01 21:04:16
I'm trying to send a simple message to RabbitMQ using Spring Cloud Stream. Basically the code looks like this:

```java
@EnableBinding(Source.class)
@SpringBootApplication
public class SourceApplication {

    public static void main(String[] args) {
        SpringApplication.run(SourceApplication.class, args);
    }

    @Autowired
    Source source;

    @PostConstruct
    public void init() {
        source.output().send(MessageBuilder.withPayload("payload").build());
    }
}
```

Then I get this error message:

org.springframework.messaging.MessageDeliveryException: Dispatcher has no subscribers for channel 'unknown.channel.name'.; nested exception is org
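A common cause of this error (my reading, not stated in the post) is that @PostConstruct fires before the binder has subscribed consumers to the output channel. One workaround is to defer the send until the application is fully started, e.g. with an ApplicationReadyEvent listener. A sketch assuming the standard Spring Boot and Spring Cloud Stream annotations (framework fragment, not standalone-runnable):

```java
// Sketch: send only once the context is fully started and the output
// binding has subscribers. Spring dependencies are assumed.
@EnableBinding(Source.class)
@SpringBootApplication
public class SourceApplication {

    public static void main(String[] args) {
        SpringApplication.run(SourceApplication.class, args);
    }

    @Autowired
    private Source source;

    // Runs after the application context, including the binder, is ready.
    @EventListener(ApplicationReadyEvent.class)
    public void sendAfterStartup() {
        source.output().send(MessageBuilder.withPayload("payload").build());
    }
}
```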

Spring Cloud Stream @SendTo Annotation not working

自闭症网瘾萝莉.ら · submitted 2019-12-01 12:30:55
I'm using Spring Cloud Stream with Spring Boot. My application is very simple:

ExampleService.class:

```java
@EnableBinding(Processor1.class)
@Service
public class ExampleService {

    @StreamListener(Processor1.INPUT)
    @SendTo(Processor1.OUTPUT)
    public String dequeue(String message) {
        System.out.println("New message: " + message);
        return message;
    }

    @SendTo(Processor1.OUTPUT)
    public String queue(String message) {
        return message;
    }
}
```

Processor1.class:

```java
public interface Processor1 {
    String INPUT = "input1";
    String OUTPUT = "output1";

    @Input(Processor1.INPUT)
    SubscribableChannel input1();

    @Output(Processor1
```

Tombstone messages not removing record from KTable state store?

ぃ、小莉子 · submitted 2019-12-01 08:02:33
I am creating a KTable by processing data from a KStream. But when I trigger a tombstone message with a key and a null payload, it does not remove the record from the KTable. Sample:

```java
public KStream<String, GenericRecord> processRecord(@Input(Channel.TEST) KStream<GenericRecord, GenericRecord> testStream) {
    KTable<String, GenericRecord> table = testStream
        .map((genericRecord, genericRecord2) -> KeyValue.pair(genericRecord.get("field1") + "", genericRecord2))
        .groupByKey()
        .reduce((genericRecord, v1) -> v1, Materialized.as("test-store"));

GenericRecord genericRecord = new GenericData.Record(getAvroSchema(keySchema
```
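If I recall the Kafka Streams contract correctly, groupByKey().reduce(...) silently drops records with a null value, so a tombstone flowing through a KStream aggregation never reaches the state store; the delete semantics apply only to records consumed directly as a KTable. For reference, the effect a tombstone is supposed to have on a table can be modeled in plain Java (illustration only, not Kafka Streams code):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Plain-Java model of KTable upsert/delete semantics (illustration only):
// a record with a null value is a tombstone and removes the key.
class TableModel {
    private final Map<String, String> store = new HashMap<>();

    void apply(String key, String value) {
        if (value == null) {
            store.remove(key); // tombstone: null payload deletes the record
        } else {
            store.put(key, value); // upsert
        }
    }

    Optional<String> get(String key) {
        return Optional.ofNullable(store.get(key));
    }

    public static void main(String[] args) {
        TableModel t = new TableModel();
        t.apply("k1", "v1");
        t.apply("k1", null); // tombstone
        System.out.println(t.get("k1").isPresent()); // false
    }
}
```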

Spring Cloud Stream message from/to JSON conversion configuration

泄露秘密 · submitted 2019-12-01 05:31:36
I am using Spring Cloud Stream with the RabbitMQ binder. It works great with a byte[] payload and Java native serialization, but I need to work with a JSON payload. Here's my processor class:

```java
@EnableBinding(Processor.class)
public class MessageProcessor {

    @ServiceActivator(inputChannel = Processor.INPUT, outputChannel = Processor.OUTPUT)
    public OutputDto handleIncomingMessage(InputDto inputDto) {
        // Run some job.
        return new OutputDto();
    }
}
```

InputDto and OutputDto are POJOs with Jackson annotations. How do I configure the JSON conversion strategy? What should the message headers look like to be accepted and
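For what it's worth, JSON conversion in Spring Cloud Stream is usually driven by the contentType binding property. A minimal configuration sketch (the binding names below assume the default Processor channels):

```yaml
spring:
  cloud:
    stream:
      bindings:
        input:
          contentType: application/json
        output:
          contentType: application/json
```

With this in place the framework's message converters handle the Jackson (de)serialization of the DTOs, so no custom converter should be needed for the simple case.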

What are the Benefits of Spring Cloud Dataflow?

杀马特。学长 韩版系。学妹 · submitted 2019-12-01 05:20:50
Based on what I've seen, creating a stream in Spring Cloud Dataflow (SCDF) will deploy the underlying applications, bind the communication service (like RabbitMQ), set the Spring Cloud Stream environment variables, and start the applications. All of this could easily be done manually with a cf push command. Meanwhile, I've been running into some drawbacks with Spring Cloud Dataflow:

- The SCDF server is a memory hog on PCF (I have a stream with only 6 applications, and yet I need about 10 GB for the server)
- No flexibility in application naming, memory, instances, etc. (all the things that you


How to implement a microservice Event Driven architecture with Spring Cloud Stream Kafka and Database per service

蓝咒 · submitted 2019-11-30 13:26:40
I am trying to implement an event-driven architecture to handle distributed transactions. Each service has its own database and uses Kafka to send messages to inform the other microservices about its operations. An example:

```
Order Service -------> | Kafka | -------> Payment Service
      |                                         |
Orders MariaDB DB                   Payment MariaDB Database
```

The Order Service receives an order request. It has to store the new Order in its DB and publish a message so that the Payment Service realizes it has to charge for the item:

private
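One way to keep the database write and the Kafka publish consistent is the transactional outbox pattern (my suggestion, not from the post): the order row and the event it produces are stored in the same local transaction, and a separate relay publishes the stored events to the broker. A minimal in-memory sketch in plain Java, with made-up names and collections standing in for the database tables and Kafka:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.List;
import java.util.Queue;

// In-memory sketch of the transactional outbox pattern (illustrative only):
// the order and its event are written in one atomic step, then a relay
// forwards outbox events to the broker.
class OutboxDemo {
    static final List<String> ordersTable = new ArrayList<>();
    static final Queue<String> outboxTable = new ArrayDeque<>();
    static final List<String> broker = new ArrayList<>(); // stands in for Kafka

    // In a real service this would be a single DB transaction.
    static synchronized void placeOrder(String orderId) {
        ordersTable.add(orderId);
        outboxTable.add("OrderCreated:" + orderId);
    }

    // A background relay would poll the outbox and publish to Kafka.
    static synchronized void relay() {
        String event;
        while ((event = outboxTable.poll()) != null) {
            broker.add(event);
        }
    }

    public static void main(String[] args) {
        placeOrder("order-1");
        relay();
        System.out.println(broker); // [OrderCreated:order-1]
    }
}
```

Because the event is committed together with the order, either both survive a crash or neither does; the relay can retry publishing without losing events.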

Spring cloud stream to support routing messages dynamically

我们两清 · submitted 2019-11-29 13:15:21
I want to create a common project (using Spring Cloud Stream) to route messages dynamically to different (consumer) projects according to the message content, with RabbitMQ as the message broker. Does Spring Cloud Stream support this? If not, is there a proposed way to accomplish it? Thanks.

You can achieve that by setting the spring.cloud.stream.dynamicDestinations property to a list of destination names (if you know the names beforehand) or keeping it empty. The BinderAwareChannelResolver takes care of dynamically creating/binding the outbound channel for these dynamic destinations. There is an out of the box
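As a configuration sketch of the property mentioned in the answer (the destination names here are made up for illustration; leaving the list empty allows any destination to be created on demand):

```yaml
spring:
  cloud:
    stream:
      dynamicDestinations:
        - orderEvents
        - paymentEvents
```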