spring-cloud-stream

Why can't Spring autowire my cloud stream Processor?

孤者浪人 submitted on 2019-12-11 19:31:02
Question: I'm trying to implement a basic Processor from spring-cloud-stream. I've done this before on other projects, so I thought I was familiar with it. But this time Spring is having a problem creating my Processor reference via @Autowired inside a @Service component. I thought the important piece was @EnableBinding(my.class) on the Application, but I have that. The error is: No qualifying bean of type 'com.mycompany.config.BizSyncProcessor' available. I also tried adding an @Component to the
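
For reference, a minimal sketch of the wiring the question describes. The BizSyncProcessor interface below is an assumption inferred from the error message, not the poster's actual code (each type would live in its own file):

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;
    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.Input;
    import org.springframework.cloud.stream.annotation.Output;
    import org.springframework.messaging.MessageChannel;
    import org.springframework.messaging.SubscribableChannel;
    import org.springframework.stereotype.Service;

    // Hypothetical binding interface, inferred from the error message.
    public interface BizSyncProcessor {
        @Input("bizSyncInput")
        SubscribableChannel input();

        @Output("bizSyncOutput")
        MessageChannel output();
    }

    // @EnableBinding(BizSyncProcessor.class) is what makes Spring create the
    // proxy bean that can then be @Autowired; it must sit on a configuration
    // class that component scanning actually picks up.
    @SpringBootApplication
    @EnableBinding(BizSyncProcessor.class)
    public class Application {
        public static void main(String[] args) {
            SpringApplication.run(Application.class, args);
        }
    }

    @Service
    public class BizSyncService {
        @Autowired
        private BizSyncProcessor processor; // fails if the binding proxy was never created
    }

Note that @Component on the interface does not help here: the bean is a proxy generated by @EnableBinding, not a scanned component.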

idleBetweenPolls not pulling messages as specified

故事扮演 submitted on 2019-12-11 18:33:12
Question: I am consuming messages in batch mode. I wanted to pull 8 messages from the stream every 250 ms:

    spring:
      cloud:
        stream:
          kinesis:
            bindings:
              input:
                consumer:
                  listenerMode: batch
                  idleBetweenPolls: 250
                  recordsLimit: 8
          bindings:
            input:
              group: my-group
              destination: stream
              content-type: application/json

I pushed around 100 messages onto the stream and started the consumer. Per the configuration, I am supposed to pull messages every 250 ms, but the poller is not pulling messages every 250 ms.
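
For context, in listenerMode: batch the Kinesis binder hands each poll over as a single List payload; a minimal consumer sketch under that assumption (the element type depends on the configured deserialization, so List<byte[]> here is an assumption):

    import java.util.List;

    import org.springframework.cloud.stream.annotation.EnableBinding;
    import org.springframework.cloud.stream.annotation.StreamListener;
    import org.springframework.cloud.stream.messaging.Sink;

    @EnableBinding(Sink.class)
    public class BatchConsumer {

        // Each poll delivers up to recordsLimit (8) records as one batch.
        @StreamListener(Sink.INPUT)
        public void handle(List<byte[]> records) {
            System.out.println("Polled a batch of " + records.size() + " records");
        }
    }

Note also that idleBetweenPolls is typically the idle time between polls rather than a fixed schedule, so observed intervals can be longer than 250 ms.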

Delayed requeuing through Rabbit DLQ

半腔热情 submitted on 2019-12-11 17:42:36
Question: I am trying to set up a queue with delayed requeuing of failed messages, following the pattern described here. I tried copying the config example from the docs as closely as possible, but the dead letter queue that was created was not itself bound back to the DLX, and I am unclear why not. I saw another potential solution, though: instead of relying on the default behavior, I tried explicitly setting the dlqDeadLetterExchange and dlqDeadLetterRoutingKey properties to see if I could make
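
For reference, the delayed-requeue pattern in the Rabbit binder docs pairs a DLQ TTL with a dead-letter route back to the original queue; a hedged YAML sketch (the binding name input and the 5-second TTL are assumptions):

    spring:
      cloud:
        stream:
          rabbit:
            bindings:
              input:
                consumer:
                  autoBindDlq: true
                  # Messages sit in the DLQ until the TTL expires...
                  dlqTtl: 5000
                  # ...then dead-letter to the default ('') exchange, which
                  # routes by queue name, that is, back to the original queue.
                  dlqDeadLetterExchange: ''

One common gotcha: queue arguments are immutable in RabbitMQ, so if the DLQ already existed before these properties were added, it keeps its old arguments and must be deleted and re-declared for the new dead-letter settings to take effect.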

How to properly implement a binder?

一曲冷凌霜 submitted on 2019-12-11 17:36:11
Question: (The question was heavily edited during a discussion with Oleg.) I'm trying to implement a binder for BigQuery in Spring Cloud Stream. The full code of the application is available on GitHub. So far, I've written a BigQueryBinderConfiguration class which returns a BigQueryBinder in the following way:

    @Configuration
    @EnableConfigurationProperties({ BigQueryConfiguration.class })
    public class BigQueryBinderConfiguration {

        @Autowired
        BigQueryConfiguration configuration;

        @Bean
        BigQueryBinder
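
For orientation, a custom binder ultimately implements the Binder SPI; a rough sketch of the shape such a class takes (the BigQuery-specific internals are placeholders, not the poster's actual code):

    import org.springframework.cloud.stream.binder.Binder;
    import org.springframework.cloud.stream.binder.Binding;
    import org.springframework.cloud.stream.binder.ConsumerProperties;
    import org.springframework.cloud.stream.binder.ProducerProperties;
    import org.springframework.messaging.MessageChannel;

    public class BigQueryBinder
            implements Binder<MessageChannel, ConsumerProperties, ProducerProperties> {

        @Override
        public Binding<MessageChannel> bindConsumer(String name, String group,
                MessageChannel inboundTarget, ConsumerProperties properties) {
            // Wire data arriving from BigQuery into inboundTarget (sketch only).
            throw new UnsupportedOperationException("not implemented in this sketch");
        }

        @Override
        public Binding<MessageChannel> bindProducer(String name,
                MessageChannel outboundTarget, ProducerProperties properties) {
            // Subscribe to outboundTarget and write messages to BigQuery (sketch only).
            throw new UnsupportedOperationException("not implemented in this sketch");
        }
    }

The binder configuration class is then registered through a META-INF/spring.binders entry so that Spring Cloud Stream can discover it.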

Spring Cloud Stream for Kafka with consumer/producer API exactly once semantics with transaction-id-prefix is not working as expected

烈酒焚心 submitted on 2019-12-11 17:22:32
Question: I have a scenario where I am seeing different behavior, with a total of 3 different services. The first service listens on a Solace queue and produces to Kafka topic-1 (where transactions are enabled). The second service listens on Kafka topic-1 and writes to another Kafka topic-2 (no manual commits, transactions enabled for producing to the other topic, auto-commit of offsets set to false, and isolation.level set to read_committed). The third service listens on Kafka topic-2 and
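
For reference, a hedged sketch of the binder-level settings involved in this kind of setup (values are placeholders):

    spring:
      cloud:
        stream:
          kafka:
            binder:
              transaction:
                # Turns on transactional producers in the binder.
                transaction-id-prefix: tx-
              configuration:
                # Downstream consumers should see only committed records.
                isolation.level: read_committed

Exactly-once across the chain also depends on each service committing its consumer offsets inside the same transaction as its produces, i.e. the consume-process-produce pattern.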

Need help on Registering App on PCF with Spring Cloud Data Flow which is also on PCF

本小妞迷上赌 submitted on 2019-12-11 17:05:13
Question: 1) I have registered a sink app on PCF using cf push -p abcdef.jar sinkapp, and it went fine. 2) Now I also have my SCDF server on PCF. How can I register sinkapp on the SCDF server using Data Flow, which is on the same PCF, same org, same space? I have no clue what to reference when registering it. I am looking for the command that I can give to SCDF from the Data Flow shell. Thank you. Answer 1: I'd highly recommend going through the getting-started experience for Cloud Foundry. You should
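
For context, registration from the Data Flow shell is done with app register and a URI the SCDF server can resolve (SCDF cannot reference a jar that was only cf push-ed); a hypothetical example with placeholder coordinates:

    dataflow:> app register --name sinkapp --type sink --uri maven://com.example:sinkapp:1.0.0

or, if the jar is hosted at an HTTP-accessible location:

    dataflow:> app register --name sinkapp --type sink --uri https://example.com/apps/abcdef.jar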

Spring Cloud Stream Conditional Forwarding of Data to Kafka Topics

喜欢而已 submitted on 2019-12-11 16:59:35
Question: I am trying to send data to different topics based on some evaluation. I am using Spring Cloud Stream and Kafka. How can I conditionally forward to Kafka topics? I need to insert SCS/Kafka-related code in the places where I commented specifically. Thank you.

    @EnableBinding(Sink.class)
    public class SampleSink {

        private final Logger logger = LoggerFactory.getLogger(this.getClass());

        @Autowired
        private SomeService someService;

        @ServiceActivator(inputChannel = Sink.INPUT)
        public void processor
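
One common approach in this generation of Spring Cloud Stream is dynamic destination resolution with BinderAwareChannelResolver; a hedged sketch (the topic names and the evaluation are placeholders for the commented-out spots in the question):

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.cloud.stream.binding.BinderAwareChannelResolver;
    import org.springframework.messaging.support.MessageBuilder;
    import org.springframework.stereotype.Component;

    @Component
    public class ConditionalForwarder {

        @Autowired
        private BinderAwareChannelResolver resolver;

        public void forward(Object payload) {
            // Choose the Kafka topic based on some evaluation (placeholder logic).
            String topic = matchesCondition(payload) ? "topic-a" : "topic-b";
            // resolveDestination binds an output channel for the topic on demand.
            resolver.resolveDestination(topic)
                    .send(MessageBuilder.withPayload(payload).build());
        }

        private boolean matchesCondition(Object payload) {
            return payload != null; // placeholder evaluation
        }
    }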

Spring Cloud Stream Topic Partitions KStream Read/Write

拈花ヽ惹草 submitted on 2019-12-11 15:17:49
Question: I have multiple microservices fronted with an API. I'd like to use the same topic for events, with each domain event on a separate partition. I was able to configure the Spring Kafka binder to send to different partitions using spring.cloud.stream.bindings.<channel>.producer.partition-key-extractor-name= and implementing PartitionKeyExtractorStrategy. My question here is: can I configure the KStream binder to be able to use partitions only for @Input and @Output? My understanding so far is spring.cloud.stream.kafka
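
For reference, the producer-side strategy mentioned in the question looks roughly like this; a minimal sketch (the bean name and the header used as the key are assumptions):

    import org.springframework.cloud.stream.binder.PartitionKeyExtractorStrategy;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.messaging.Message;

    @Configuration
    public class PartitioningConfig {

        // Referenced via spring.cloud.stream.bindings.<channel>.producer
        //   .partition-key-extractor-name=myKeyExtractor
        @Bean
        public PartitionKeyExtractorStrategy myKeyExtractor() {
            // Derive the partition key from a message header (assumption).
            return message -> message.getHeaders().get("eventType");
        }
    }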

Producer issues with Spring Cloud Stream 3.0

情到浓时终转凉″ submitted on 2019-12-11 14:59:29
Question: I read the Spring Cloud Stream 3.0 documentation to understand the new java.util.function.[Supplier/Function/Consumer] model for representing producers, processors, and consumers, and that part seems clear. But I don't understand Supplier. The documentation states that a Supplier is polled to generate data continuously, with no program involvement required. But many times we need to generate data at a specific moment, such as on a web request, and I
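
For on-demand production (such as from a web request), later 3.x releases provide StreamBridge, which sends to a binding without a polled Supplier; a hedged sketch assuming a release that includes it (the binding name output-out-0 is a placeholder):

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.cloud.stream.function.StreamBridge;
    import org.springframework.web.bind.annotation.PostMapping;
    import org.springframework.web.bind.annotation.RequestBody;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    public class EventController {

        @Autowired
        private StreamBridge streamBridge;

        // One message per web request, instead of the binder's default
        // polling of a Supplier bean.
        @PostMapping("/events")
        public void publish(@RequestBody String payload) {
            streamBridge.send("output-out-0", payload);
        }
    }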

Multiple bindingRoutingKeys for a consumer with Spring Cloud Stream using RabbitMQ

假如想象 submitted on 2019-12-11 12:04:57
Question: I'd like to configure an input channel in Spring Cloud Stream to be bound to the same exchange (destination) with multiple routing keys. I've managed to get this working with a single routing key like this:

    spring:
      cloud:
        stream:
          rabbit:
            bindings:
              input1:
                consumer:
                  bindingRoutingKey: key1.#
          bindings:
            input1:
              binder: rabbit
              group: group1
              destination: dest-group1

But I cannot seem to get it working for multiple keys. I've tried this:

    spring:
      cloud:
        stream:
          rabbit:
            bindings:
              input1:
                consumer:
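
For what it's worth, newer Rabbit binder versions accept a delimited list of routing keys once bindingRoutingKeyDelimiter is set; a hedged sketch, assuming a binder version that includes that property:

    spring:
      cloud:
        stream:
          rabbit:
            bindings:
              input1:
                consumer:
                  # With the delimiter set, each key gets its own queue binding.
                  bindingRoutingKey: key1.#,key2.#
                  bindingRoutingKeyDelimiter: ','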