spring-cloud-stream

Spring Boot test fails saying "Unable to start ServletWebServerApplicationContext due to missing ServletWebServerFactory bean"

Submitted by 天涯浪子 on 2019-12-28 12:31:25
Question: Test class:

```java
@RunWith(SpringRunner.class)
@SpringBootTest(classes = { WebsocketSourceConfiguration.class, WebSocketSourceIntegrationTests.class },
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT,
        properties = { "websocket.path=/some_websocket_path", "websocket.allowedOrigins=*",
                "spring.cloud.stream.default-binder=kafka" })
public class WebSocketSourceIntegrationTests {

    private String port = "8080";

    @Test
    public void testWebSocketStreamSource() throws IOException, …
```
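A likely cause is that the classes passed to @SpringBootTest do not auto-configure an embedded servlet container, while webEnvironment = RANDOM_PORT requires a ServletWebServerFactory bean in the context. A minimal sketch of one possible fix, assuming the test itself does not need a live server (class and property names taken from the question):

```java
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
// WebEnvironment.NONE builds a plain ApplicationContext instead of a
// ServletWebServerApplicationContext, so no ServletWebServerFactory is needed.
@SpringBootTest(classes = WebsocketSourceConfiguration.class,
        webEnvironment = SpringBootTest.WebEnvironment.NONE,
        properties = { "websocket.path=/some_websocket_path",
                "websocket.allowedOrigins=*",
                "spring.cloud.stream.default-binder=kafka" })
public class WebSocketSourceIntegrationTests {
    // ... test methods unchanged
}
```

If the test genuinely needs a running server, the alternative is to include a configuration that provides one, e.g. adding ServletWebServerFactoryAutoConfiguration (or a @SpringBootApplication-annotated class) to the classes list.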

Is it possible to get exactly-once processing with Spring Cloud Stream?

Submitted by 自作多情 on 2019-12-25 01:50:26
Question: Currently I'm using SCS with an almost default configuration for sending and receiving messages between microservices. I've read https://www.confluent.io/blog/enabling-exactly-kafka-streams and wonder whether it would work if we just set the property "processing.guarantee" to the value "exactly-once" through the properties of the Spring Boot application. Answer 1: In the context of your question you should look at Spring Cloud Stream as just a delegate between the target system (e…
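For what it's worth, with the Kafka Streams binder arbitrary Kafka Streams properties can be passed through the binder configuration; a minimal sketch (assuming the Kafka Streams binder is in use, and noting that exactly-once also requires a broker cluster that supports transactions):

```properties
# Forwarded verbatim to the underlying KafkaStreams instance.
# Note the value is exactly_once (underscore), not exactly-once.
spring.cloud.stream.kafka.streams.binder.configuration.processing.guarantee=exactly_once
```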

Spring Integration: Binder configuration (for Rabbit)

Submitted by 别来无恙 on 2019-12-24 22:02:23
Question: I am trying to configure RabbitMQ as middleware for a Spring Integration application. Here is the application.yml file:

```yaml
spring:
  application:
    name: ${vcap.application.name:}
  cloud:
    stream:
      bindings:
        myOutput:
          destination: myInput
      default-binder: rabbit
```

The error I get is this: nested exception is java.lang.IllegalStateException: Unknown binder configuration: rabbit. Any ideas how to correctly configure Rabbit for Spring Integration? The project is divided into multiple modules; I was given this code from a…
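"Unknown binder configuration: rabbit" usually means no binder named rabbit is available on the classpath. A sketch of the Maven dependency that registers the Rabbit binder (assuming a Maven build with versions managed by the Spring Cloud BOM):

```xml
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream-binder-rabbit</artifactId>
</dependency>
```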

How to catch errors from the Spring Integration error channel inside Spring Cloud Stream?

Submitted by 假装没事ソ on 2019-12-24 20:29:22
Question: I'm trying to create an application-level error handler for failures during processing inside an SCS application using Kafka as the message broker. I know that SCS already provides DLQ functionality, but in my case I want to wrap failed messages with a custom wrapper type (providing the failure context: source, cause, etc.). In https://github.com/qabbasi/Spring-Cloud-Stream-DLQ-Error-Handling you can see two approaches for this scenario: one is using SCS and the other one directly Spring…
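One approach is a handler on the binding-specific error channel; a sketch, assuming the binder's default error-channel naming of destination.group.errors (the channel name and the FailureContext wrapper below are illustrative):

```java
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.messaging.support.ErrorMessage;
import org.springframework.stereotype.Component;

@Component
public class StreamErrorHandler {

    // Binding-specific error channel: <destination>.<group>.errors
    @ServiceActivator(inputChannel = "uinput.myGroup.errors")
    public void handle(ErrorMessage error) {
        // The ErrorMessage payload is the failure cause; for messaging
        // exceptions the original failed message is also available.
        Throwable cause = error.getPayload();
        // FailureContext is a hypothetical wrapper type for illustration:
        // FailureContext ctx = new FailureContext(error.getOriginalMessage(), cause);
        // ...publish ctx to a custom DLQ destination here
    }
}
```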

Spring @StreamListener process(KStream<?,?> stream) Partition

Submitted by 拟墨画扇 on 2019-12-24 20:14:23
Question: I have a topic with multiple partitions. In my stream processor I just want to stream from one partition, and could not figure out how to configure this:

```properties
spring.cloud.stream.kafka.streams.bindings.input.consumer.application-id=s-processor
spring.cloud.stream.bindings.input.destination=uinput
spring.cloud.stream.bindings.input.group=r-processor
spring.cloud.stream.bindings.input.contentType=application/java-serialized-object
spring.cloud.stream.bindings.input.consumer.header-mode=raw
```
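Kafka Streams always subscribes to every partition of the input topic, so "one partition only" has to be emulated inside the topology. A sketch of one way to do that (the partition number and the String type parameters are illustrative):

```java
import org.apache.kafka.streams.kstream.ValueTransformerWithKey;
import org.apache.kafka.streams.processor.ProcessorContext;

// Passes through records from one source partition and nulls out the rest,
// so a downstream filter can drop them.
public class SinglePartitionFilter implements ValueTransformerWithKey<String, String, String> {

    private static final int PARTITION_OF_INTEREST = 0; // illustrative
    private ProcessorContext context;

    @Override
    public void init(ProcessorContext context) {
        this.context = context;
    }

    @Override
    public String transform(String key, String value) {
        return context.partition() == PARTITION_OF_INTEREST ? value : null;
    }

    @Override
    public void close() { }
}
```

It would be wired in with something like stream.transformValues(SinglePartitionFilter::new).filter((k, v) -> v != null); a cleaner long-term fix is to route the records of interest to their own topic.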

Change content type for RabbitMQ Spring Cloud Stream Starter App

Submitted by 青春壹個敷衍的年華 on 2019-12-24 12:09:52
Question: The documentation for the Spring Cloud Stream Starter Apps for the RabbitMQ Source app lists several possible content types, each with a different resulting type for the output payload. However, it doesn't say how to choose which one you want to use. I'm deploying a Spring Cloud Data Flow stream connecting the Rabbit source to a Log sink, and all I get is the byte array. Even when I explicitly set the content type to "text/plain" in the Rabbit message's header, it shows up in the log sink as a byte…
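One thing worth trying (a sketch; the app name rabbit and binding name output follow the starter-app conventions) is to set the content type on the source's output binding as a deployment property, rather than on the incoming message:

```properties
# SCDF deployment property: ask the binder to publish the payload
# from the rabbit source as text rather than a raw byte array.
app.rabbit.spring.cloud.stream.bindings.output.contentType=text/plain
```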

Failed to create consumer binding; retrying in 30 seconds

Submitted by 廉价感情. on 2019-12-24 08:00:08
Question: I am using Spring Cloud Stream. I have two channels: one uses Kafka cluster1, the other uses cluster2. The config is like this:

```properties
spring.cloud.stream.default-binder=kafka
spring.cloud.stream.binders.kafka.type=kafka
spring.cloud.stream.binders.kafka.environment.spring.cloud.stream.kafka.binder.brokers=xxxx
spring.cloud.stream.kafka.binder.auto-add-partitions=false
spring.cloud.stream.kafka.binder.auto-create-topics=false
spring.cloud.stream.bindings.channel1-input.destination=top1
spring.cloud.stream…
```
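For two clusters, each channel typically needs its own named binder environment; a minimal sketch (binder names, destinations, and broker hosts are illustrative):

```properties
# Two named Kafka binders, one per cluster.
spring.cloud.stream.binders.kafka1.type=kafka
spring.cloud.stream.binders.kafka1.environment.spring.cloud.stream.kafka.binder.brokers=cluster1-host:9092
spring.cloud.stream.binders.kafka2.type=kafka
spring.cloud.stream.binders.kafka2.environment.spring.cloud.stream.kafka.binder.brokers=cluster2-host:9092

# Point each binding at the binder for its cluster.
spring.cloud.stream.bindings.channel1-input.destination=top1
spring.cloud.stream.bindings.channel1-input.binder=kafka1
spring.cloud.stream.bindings.channel2-input.destination=top2
spring.cloud.stream.bindings.channel2-input.binder=kafka2
```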

Stop Spring Cloud Stream @StreamListener from listening when target system is down

Submitted by 拥有回忆 on 2019-12-23 17:27:40
Question: I have an application that gets messages from Kafka and calls a target system to update a legacy Oracle DB. I want to enable a scenario where, if the target system is down, the messages are left on the Kafka bus and not processed for a given period of time. I was thinking of some circuit-breaker, Hystrix-based solution, but I can't find any mechanism to tell Spring Cloud Stream to "stop" the event listening. The only other alternative I can think of is, if the circuit breaker is open, to transfer…
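Newer Spring Cloud Stream versions (2.x) expose a BindingsEndpoint that can stop and restart an individual binding at runtime; a sketch (the binding name "input" is illustrative, and the endpoint requires the Spring Boot actuator on the classpath):

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.stream.endpoint.BindingsEndpoint;
import org.springframework.stereotype.Component;

@Component
public class ConsumerLifecycle {

    @Autowired
    private BindingsEndpoint bindings;

    // Called when the circuit breaker opens: stop consuming from Kafka,
    // so unprocessed messages simply stay on the topic.
    public void pause() {
        bindings.changeState("input", BindingsEndpoint.State.STOPPED);
    }

    // Called when the target system recovers.
    public void resume() {
        bindings.changeState("input", BindingsEndpoint.State.STARTED);
    }
}
```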

App properties in Spring Cloud Data Flow application

Submitted by 与世无争的帅哥 on 2019-12-23 04:24:06
Question: Based on the documentation for Spring Cloud Data Flow (SCDF), only properties prefixed by either "deployer." or "app." are considered when deploying an application (be it a source, processor, or sink) as part of a stream. However, I've noticed that besides the prefix, all the properties must be provided as strings, no matter what their original type is; otherwise, they are simply discarded by SCDF, as per this line of code: propertiesToUse = DeploymentPropertiesUtils.convert(props);…
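A sketch of what this looks like from the SCDF shell (stream name and properties are illustrative): every value, including the numeric port, is passed as a string and converted by the target app's own property binding.

```text
dataflow:> stream create mystream --definition "rabbit | log"
dataflow:> stream deploy mystream --properties "app.log.log.level=WARN,app.rabbit.server.port=8081"
```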