spring-cloud-stream

Manual Acknowledgement of Messages: Spring Cloud Stream Kafka

[亡魂溺海] Submitted on 2019-12-04 17:13:04
The scenario I want to implement is: consume a message from Kafka, process it, and if some condition fails, do not acknowledge the message. For this I found autoCommitOffset in the Spring Cloud Stream reference documentation: "Whether to autocommit offsets when a message has been processed. If set to false, an Acknowledgment header will be available in the message headers for late acknowledgment. Default: true." My question is: after setting autoCommitOffset to false, how can I acknowledge a message? A code example would be hugely appreciated. I've provided an answer to the question here
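
For reference, a minimal sketch of late acknowledgment with the Kafka binder, assuming autoCommitOffset is set to false on the consumer binding; the binding name "input", the handler shape, and the process() check are illustrative:

    import org.springframework.cloud.stream.annotation.StreamListener;
    import org.springframework.kafka.support.Acknowledgment;
    import org.springframework.kafka.support.KafkaHeaders;
    import org.springframework.messaging.Message;

    @StreamListener("input")
    public void handle(Message<String> message) {
        // With autoCommitOffset=false the binder adds an Acknowledgment header
        Acknowledgment ack = message.getHeaders()
                .get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class);
        boolean ok = process(message.getPayload()); // hypothetical business check
        if (ok && ack != null) {
            ack.acknowledge(); // commit the offset only on success
        }
        // An unacknowledged offset is not committed, so the record is
        // redelivered after a rebalance or an application restart.
    }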

Spring Cloud Stream dynamic channels

为君一笑 Submitted on 2019-12-04 07:01:10
I am using Spring Cloud Stream and want to programmatically create and bind channels. My use case is that during application startup I receive a dynamic list of Kafka topics to subscribe to. How can I then create a channel for each topic? I ran into a similar scenario recently, and below is my sample of creating SubscribableChannels dynamically.

    ConsumerProperties consumerProperties = new ConsumerProperties();
    consumerProperties.setMaxAttempts(1);
    BindingProperties bindingProperties = new BindingProperties();
    bindingProperties.setConsumer(consumerProperties);
    bindingProperties.setDestination
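
The quoted sample is cut off above. A minimal sketch of how that approach is commonly completed, assuming the topic names arrive in a topics collection; the injected beans, handler, and group name are assumptions, and the exact pattern may vary by Spring Cloud Stream version:

    import org.springframework.cloud.stream.binder.ConsumerProperties;
    import org.springframework.cloud.stream.binding.BindingService;
    import org.springframework.cloud.stream.binding.SubscribableChannelBindingTargetFactory;
    import org.springframework.cloud.stream.config.BindingProperties;
    import org.springframework.cloud.stream.config.BindingServiceProperties;
    import org.springframework.messaging.SubscribableChannel;

    // bindingServiceProperties, bindingTargetFactory, and bindingService
    // are assumed to be @Autowired from the application context
    for (String topic : topics) {
        ConsumerProperties consumerProperties = new ConsumerProperties();
        consumerProperties.setMaxAttempts(1);
        BindingProperties bindingProperties = new BindingProperties();
        bindingProperties.setConsumer(consumerProperties);
        bindingProperties.setDestination(topic);
        bindingProperties.setGroup("my-group"); // hypothetical consumer group
        // register the binding's properties before binding the channel
        bindingServiceProperties.getBindings().put(topic, bindingProperties);

        SubscribableChannel channel = bindingTargetFactory.createInput(topic);
        channel.subscribe(message -> handle(message)); // hypothetical handler
        bindingService.bindConsumer(channel, topic);
    }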

Spring Cloud Kafka Stream Unable to create Producer Config Error

情到浓时终转凉″ Submitted on 2019-12-03 07:18:26
I have two Spring Boot projects with Kafka-stream dependencies. They have exactly the same dependencies in Gradle and exactly the same configuration, yet one of the projects, when started, logs the error below:

    11:35:37.974 [restartedMain] INFO o.a.k.c.admin.AdminClientConfig - AdminClientConfig values:
        bootstrap.servers = [192.169.0.109:6667]
        client.id = client
        connections.max.idle.ms = 300000
        metadata.max.age.ms = 300000
        metric.reporters = []
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        receive.buffer.bytes = 65536
        reconnect.backoff.max.ms = 1000
        reconnect

Spring Aws Kinesis Binder ProvisionedThroughputExceededException while consuming messages in Batch Mode

﹥>﹥吖頭↗ Submitted on 2019-12-02 08:00:18
I am using batch mode to pull records from a Kinesis stream. We are using the Spring AWS Kinesis binder. Most of the time we are not able to pull messages from the stream; only sometimes are we able to pull messages. My config looks like below:

    spring:
      cloud:
        stream:
          kinesis:
            binder:
              locks:
                leaseDuration: 30
                readCapacity: 1
                writeCapacity: 1
              checkpoint:
                readCapacity: 1
                writeCapacity: 1
            bindings:
              InStreamGroupOne:
                consumer:
                  listenerMode: batch
                  idleBetweenPolls: 30000
                  recordsLimit: 5000
                  consumer-backoff: 1000
          bindings:
            InStreamGroupOne:
              group: in-stream-group
              destination:

Spring Cloud DataFlow for HTTP request/response exchange

♀尐吖头ヾ Submitted on 2019-12-02 03:41:11
I would like to use streams to handle an HTTP request/response exchange. I didn't see any Spring Cloud Stream App Starters with HTTP sink functionality. Will I need to build a custom sink to handle the response? If so, do I pass the request through my processing pipeline and then use the request in my sink to form the response? I don't think I've misunderstood the use case of Spring Cloud DataFlow and Spring Cloud Stream; perhaps there are app starters available for this pattern.

Answer: Spring Cloud Stream/Dataflow is for unidirectional (stream) processing; it is not intended for request/reply

Health for Kafka Binder is always UNKNOWN

末鹿安然 Submitted on 2019-12-02 03:25:17
When I try to activate the health indicator for the Kafka binder as explained in the Spring Cloud Stream Reference Documentation, the health endpoint returns:

    "binders":{"status":"UNKNOWN","kafka":{"status":"UNKNOWN"}}

My configuration contains, as documented:

    management.health.binders.enabled=true

I already debugged BindersHealthIndicatorAutoConfiguration and noticed that no HealthIndicator is registered in the binderContext. Do I have to register a custom HealthIndicator as a bean, or what other steps are necessary? It looks like a bug in the documentation. By default, the binders health indicators are
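
If a custom indicator is wanted as a stopgap, a minimal sketch of registering one as a bean follows; the class name, bean name, and the unconditional UP status are purely illustrative, and per the (truncated) answer above this should not be necessary once the configuration issue is resolved:

    import org.springframework.boot.actuate.health.Health;
    import org.springframework.boot.actuate.health.HealthIndicator;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class BinderHealthConfig {

        // Illustrative only: reports the Kafka binder as UP unconditionally.
        // A real check would probe the broker, e.g. with an AdminClient call.
        @Bean
        public HealthIndicator kafkaBinderHealthIndicator() {
            return () -> Health.up().withDetail("binder", "kafka").build();
        }
    }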

Spring cloud stream kafka pause/resume binders

ⅰ亾dé卋堺 Submitted on 2019-12-02 00:36:41
We are using Spring Cloud Stream 2.0 with Kafka as the message broker. We've implemented a circuit breaker which stops the application context for cases where the target system (a DB or a 3rd-party API) is unavailable, as suggested here: Stop Spring Cloud Stream @StreamListener from listening when target system is down. Now in Spring Cloud Stream 2.0 there is a way to manage the lifecycle of a binding using the actuator: Binding visualization and control. Is it possible to control the binding lifecycle from the code, meaning in case the target server is down, to pause the binding, and when it's up again, to resume? Sorry,
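
A minimal sketch of driving that same actuator mechanism from code via the BindingsEndpoint; the binding name "input" and the circuit-breaker hook methods are assumptions, and availability of programmatic pause/resume may depend on the exact 2.x version and on the binder (Kafka consumers support pause; otherwise stop/start can be used):

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.cloud.stream.endpoint.BindingsEndpoint;
    import org.springframework.cloud.stream.endpoint.BindingsEndpoint.State;
    import org.springframework.stereotype.Component;

    @Component
    public class BindingLifecycleGuard {

        @Autowired
        private BindingsEndpoint bindingsEndpoint;

        // called by the circuit breaker when the target system goes down
        public void onTargetDown() {
            bindingsEndpoint.changeState("input", State.PAUSED);
        }

        // called when the target system is reachable again
        public void onTargetUp() {
            bindingsEndpoint.changeState("input", State.RESUMED);
        }
    }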