spring-cloud-stream

What is the property to accept binary JSON messages in the spring-cloud-stream Kafka binder?

守給你的承諾、 Submitted on 2019-12-23 02:17:33
Question: I am using the spring-cloud-stream Kafka binder to consume messages from a Kafka topic. The source system is sending the JSON message as ASCII. When my consumer listens to the topic it throws: o.s.c.s.b.k.KafkaMessageChannelBinder : Could not convert message: 7B22736.. Is there any property that I can set in my .yml file to deserialize it? Or is there an example that I can look into?

Answer 1: I am not sure what you mean by "json in hexadecimal-binary data"; if you mean it's ASCII data in a byte[], try
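A minimal sketch of one common way to handle this, assuming the byte[] really contains UTF-8/ASCII JSON: set the binding's content type to application/json and let the binder's JSON converter map the payload onto a POJO. The binding name input, the listener class, and the Order payload class are illustrative, not taken from the original question.

    // application.yml (illustrative): spring.cloud.stream.bindings.input.content-type: application/json
    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.StreamListener;
    import org.springframework.cloud.stream.messaging.Sink;

    @EnableBinding(Sink.class)
    public class JsonPayloadListener {

        // With the content type declared, the message converter turns the raw
        // byte[] payload into an Order instance before this method is invoked.
        @StreamListener(Sink.INPUT)
        public void handle(Order order) {
            System.out.println("Received order " + order.getId());
        }

        // Illustrative payload class; the field names are assumptions.
        public static class Order {
            private String id;
            public String getId() { return id; }
            public void setId(String id) { this.id = id; }
        }
    }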

Spring Cloud Stream and @Publisher annotation compatibility

我是研究僧i Submitted on 2019-12-20 02:40:09
Question: Since Spring Cloud Stream does not have an annotation for sending a new message to a stream (@SendTo only works when @StreamListener is declared), I tried to use the Spring Integration annotation for that purpose, @Publisher. Because @Publisher takes a channel, and the @EnableBinding annotation of Spring Cloud Stream can bind an output channel using the @Output annotation, I tried to mix them in the following way: @EnableBinding(MessageSource.class) @Service public class ExampleService { @Publisher
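A sketch of the combination the question describes, assuming the standard Source binding interface (the original snippet references MessageSource.class). @Publisher is only processed when publisher support is enabled (for example via @EnablePublisher), and the method's return value becomes the payload sent to the configured channel; class and method names here are illustrative.

    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.messaging.Source;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.integration.annotation.Publisher;
    import org.springframework.integration.config.EnablePublisher;
    import org.springframework.stereotype.Service;

    @Configuration
    @EnableBinding(Source.class)   // binds the "output" channel to the broker
    @EnablePublisher               // turns on @Publisher post-processing
    class MessagingConfiguration {
    }

    @Service
    public class ExampleService {

        // After the method completes, its return value is published to the
        // Source.OUTPUT ("output") channel, so callers never touch messaging code.
        @Publisher(channel = Source.OUTPUT)
        public String process(String payload) {
            return payload.toUpperCase();
        }
    }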

How to create a unit test with embedded Kafka in Spring Cloud Stream

て烟熏妆下的殇ゞ Submitted on 2019-12-19 03:14:08
Question: Sorry for the question being too generic, but does someone have a tutorial or guide on how to perform producer and consumer testing with embedded Kafka? I've tried several, but there are several versions of the dependencies and none actually works =/ I'm using Spring Cloud Stream Kafka.

Answer 1: We generally recommend using the Test Binder in tests, but if you want to use an embedded Kafka server, it can be done... Add this to your POM... <dependency> <groupId>org.springframework.kafka</groupId>
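A sketch of the embedded-broker approach the answer begins to describe, assuming spring-kafka-test is on the test classpath. @EmbeddedKafka starts an in-memory broker and exposes its address through the spring.embedded.kafka.brokers property, which the Kafka binder can be pointed at; the topic name and test class are illustrative.

    import org.junit.jupiter.api.Test;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.kafka.test.context.EmbeddedKafka;

    // Point the Spring Cloud Stream Kafka binder at the embedded broker.
    @SpringBootTest(properties =
            "spring.cloud.stream.kafka.binder.brokers=${spring.embedded.kafka.brokers}")
    @EmbeddedKafka(partitions = 1, topics = "test-topic")
    class EmbeddedKafkaStreamTest {

        @Test
        void contextStartsAgainstEmbeddedBroker() {
            // Produce to the bound output and consume from the bound input here,
            // e.g. with an injected channel or a KafkaTemplate built from the
            // embedded broker's connection properties.
        }
    }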

How can @MessagingGateway be configured with Spring Cloud Stream MessageChannels?

烂漫一生 Submitted on 2019-12-18 06:23:10
Question: I have developed asynchronous Spring Cloud Stream services, and I am trying to develop an edge service that uses @MessagingGateway to provide synchronous access to services that are async by nature. I am currently getting the following stack trace: Caused by: org.springframework.messaging.core.DestinationResolutionException: no output-channel or replyChannel header available at org.springframework.integration.handler.AbstractMessageProducingHandler.sendOutput(AbstractMessageProducingHandler
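The exception means that by the time the downstream flow tried to produce a reply, no reply channel was available. A sketch of the shape such a gateway usually takes, assuming a request binding named "requests" and a reply binding named "replies" (both names, and the timeout, are illustrative); the reply still has to be routed onto that declared reply channel by the rest of the flow, which is exactly what the missing-header error says is not happening yet.

    import org.springframework.integration.annotation.Gateway;
    import org.springframework.integration.annotation.MessagingGateway;

    // Sends on the bound output channel and waits (up to the timeout) for a
    // message to arrive on the explicitly declared reply channel.
    @MessagingGateway
    public interface EdgeGateway {

        @Gateway(requestChannel = "requests",  // bound @Output channel (assumed name)
                 replyChannel = "replies",     // bound @Input channel carrying replies (assumed name)
                 replyTimeout = 5000)
        String call(String request);
    }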

Unable to consume messages in batch mode with the Kinesis Binder

我是研究僧i Submitted on 2019-12-13 18:47:03
Question: I am trying to consume messages from a Kinesis stream in batch mode. I am using compile('org.springframework.cloud:spring-cloud-starter-stream-kinesis:1.0.0.BUILD-SNAPSHOT').

application.yml:

    spring:
      cloud:
        stream:
          bindings:
            input:
              group: groupName
              destination: stream-name
              content-type: application/json
              consumer:
                listenerMode: batch
                idleBetweenPolls: 10000

Code: As per the documentation, when listenerMode is batch, the payload is expected to be a list: @StreamListener(Sink.INPUT) public void
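A sketch of what the batch listener signature typically looks like, matching the truncated code above; whether the list elements arrive already converted or as raw byte[]/String depends on the binder version and content-type handling, so the element type here is an assumption.

    import java.util.List;

    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.StreamListener;
    import org.springframework.cloud.stream.messaging.Sink;

    @EnableBinding(Sink.class)
    public class BatchConsumer {

        // With listenerMode: batch, each poll is delivered as one message whose
        // payload is a List of records rather than a single record.
        @StreamListener(Sink.INPUT)
        public void handleBatch(List<String> records) {
            records.forEach(record -> System.out.println("record: " + record));
        }
    }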

Spring Data Flow w/ 2 sources feeding one processor/sink

[亡魂溺海] Submitted on 2019-12-13 07:34:51
Question: I'm looking for some advice on setting up a Spring Cloud Data Flow stream for a specific use case. My use case: I have two RDBMSs and I need to compare the results of queries run against each. The queries should be run roughly simultaneously. Based on the result of the comparison, I should be able to send an email through a custom email sink app which I have created. I envision the stream diagram to look something like this (sorry for the paint): The problem is that SDF does not, to my knowledge,
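One way to wire this kind of fan-in in Spring Cloud Data Flow is with named destinations, sketched below: each source publishes to the same destination, and a single downstream stream consumes from it. The app names, the destination name, and the custom comparison-processor and email-sink apps are illustrative, and the processor would still have to correlate the two result sets itself before deciding whether to emit an email.

    stream create --name rdbms-a --definition "jdbc > :query-results"
    stream create --name rdbms-b --definition "jdbc > :query-results"
    stream create --name compare --definition ":query-results > comparison-processor | email-sink" --deploy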

How to use an interactive query within a Kafka processor topology in spring-cloud-stream?

大憨熊 Submitted on 2019-12-13 04:43:53
Question: Is it possible to use an interactive query (InteractiveQueryService) within a Spring Cloud Stream class annotated with @EnableBinding, or within a method annotated with @StreamListener? I tried instantiating ReadOnlyKeyValueStore within the provided KStreamMusicSampleApplication class and its process method, but it is always null. My @StreamListener method is listening to a bunch of KTables and KStreams, and during the process topology (e.g. filtering) I have to check whether the key from a KStream already exists
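A sketch of the lazy-lookup pattern, assuming the Kafka Streams binder's InteractiveQueryService is available and the state store name matches one materialized by the topology. The store only becomes queryable once the Kafka Streams application is running, which is why a store looked up while the topology is still being built stays null; the store, topic, and binding names below are illustrative.

    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.state.QueryableStoreTypes;
    import org.apache.kafka.streams.state.ReadOnlyKeyValueStore;
    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.Input;
    import org.springframework.cloud.stream.annotation.StreamListener;
    import org.springframework.cloud.stream.binder.kafka.streams.InteractiveQueryService;

    @EnableBinding(FilteringProcessor.KStreamBindings.class)
    public class FilteringProcessor {

        interface KStreamBindings {
            @Input("input")
            KStream<String, String> input();
        }

        private final InteractiveQueryService interactiveQueryService;

        public FilteringProcessor(InteractiveQueryService interactiveQueryService) {
            this.interactiveQueryService = interactiveQueryService;
        }

        @StreamListener("input")
        public void process(KStream<String, String> input) {
            input.filter((key, value) -> {
                // Resolve the store per record, once the streams app is up.
                ReadOnlyKeyValueStore<String, String> store =
                        interactiveQueryService.getQueryableStore(
                                "my-ktable-store", QueryableStoreTypes.keyValueStore());
                return store.get(key) != null; // keep keys already present in the store
            }).to("filtered-topic");
        }
    }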

Unexpected error code 13 while fetching data: Spring Cloud Stream Kafka binder with Azure Event Hub

雨燕双飞 Submitted on 2019-12-13 04:27:37
Question: This was working fine until the issue suddenly popped up on 18th Dec; I could not find any clue as to why this exception is now coming continuously after six months of working fine. I am not able to reproduce it in a local standalone setup, but it appears in the Docker images with microservices, one of which communicates with Azure Event Hub. I have searched and found this post. That post reports the same issue I am facing, but I could not find any resolution or clue, so I am posting this question. Please

Jar not found error while trying to deploy an SCDF stream

╄→尐↘猪︶ㄣ Submitted on 2019-12-13 04:13:15
Question: I registered the sink first as follows:

    app register --name mysink --type sink --uri file:///Users/swatikaushik/Downloads/kafkaStreamDemo/target/kafkaStreamDemo-0.0.1-SNAPSHOT.jar

Then I created a stream:

    stream create --definition ":myKafkaTopic > mysink" --name myStreamName --deploy

I got the error: Command failed org.springframework.cloud.dataflow.rest.client.DataFlowClientException: File /Users/swatikaushik/Downloads/kafkaStreamDemo/target/kafkaStreamDemo-0.0.1-SNAPSHOT.jar must exist While
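The file:// URI is resolved on the host where the Data Flow server runs, not on the machine running the shell, so the jar has to exist at that path on the server's side. A sketch of the two registration variants (the path and Maven coordinates are placeholders, not from the original question):

    app register --name mysink --type sink --uri file:///path/visible/to/the/dataflow/server/kafkaStreamDemo-0.0.1-SNAPSHOT.jar
    app register --name mysink --type sink --uri maven://com.example:kafkaStreamDemo:0.0.1-SNAPSHOT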

One SCDF source, two processors, but only one processes each item

依然范特西╮ Submitted on 2019-12-13 03:56:07
Question: My use case is a variation on this: Create Stream with one source, two parallel processors and one sink in Spring Cloud Data Flow. In that example, one source emits an item to RabbitMQ and both processors get it. I want the opposite: I want the source to emit items to RabbitMQ, but only one processor should handle each item. Let's pretend I have one source named source and two processors named processor1 and processor2. So source emits A, B, C to RabbitMQ. RabbitMQ will emit A; whichever processor gets A first will
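What the question describes is competing-consumer semantics, which Spring Cloud Stream provides through consumer groups: if processor1 and processor2 are bound to the same destination with the same group, each message is delivered to only one of them. A minimal sketch of the consumer-side configuration, with illustrative destination and group names:

    # application.yml for BOTH processor1 and processor2
    spring:
      cloud:
        stream:
          bindings:
            input:
              destination: items          # same destination for both processors
              group: item-processors      # same group, so each item is handled by exactly one consumer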