spring-cloud-aws

Unable to consume messages in batch mode in Kinesis Binder

Posted by 我是研究僧i on 2019-12-13 18:47:03
Question: I am trying to consume messages from a Kinesis stream as a batch. I am using compile('org.springframework.cloud:spring-cloud-starter-stream-kinesis:1.0.0.BUILD-SNAPSHOT').

Application.yml:

spring:
  cloud:
    stream:
      bindings:
        input:
          group: groupName
          destination: stream-name
          content-type: application/json
          consumer:
            listenerMode: batch
            idleBetweenPolls: 10000

Code: As per the documentation, when listenerMode is batch, the payload is expected to be a list:

@StreamListener(Sink.INPUT)
public void
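A point worth checking (this is an assumption about the binder's property layout, not something confirmed in the question): binder-specific consumer properties such as listenerMode and idleBetweenPolls generally have to be nested under the kinesis section rather than under the generic bindings section. A sketch of that layout:

```yaml
# Assumed layout for Kinesis-binder-specific consumer properties;
# verify against the Spring Cloud Stream Kinesis binder documentation.
spring:
  cloud:
    stream:
      kinesis:
        bindings:
          input:
            consumer:
              listenerMode: batch
              idleBetweenPolls: 10000
      bindings:
        input:
          group: groupName
          destination: stream-name
          content-type: application/json
```

With listenerMode: batch in effect, the @StreamListener method then has to declare a List parameter (for example List<byte[]> or List<String>), since the binder delivers the whole polled batch as a single payload.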

idleBetweenPolls not pulling messages as specified

Posted by 故事扮演 on 2019-12-11 18:33:12
Question: I am consuming messages in batch mode. I want to pull 8 messages from the stream every 250 ms.

spring:
  cloud:
    stream:
      kinesis:
        bindings:
          input:
            consumer:
              listenerMode: batch
              idleBetweenPolls: 250
              recordsLimit: 8
      bindings:
        input:
          group: my-group
          destination: stream
          content-type: application/json

I pushed around 100 messages onto the stream and started the consumer. As per the configuration, I am supposed to pull messages every 250 ms, but the poller is not pulling messages every 250 ms.
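One plausible explanation (an assumption based on the property's name, not confirmed from the binder's source here) is that idleBetweenPolls is a fixed delay between the end of one poll and the start of the next, so a slow poll stretches the effective period beyond 250 ms. A plain-Java sketch of that arithmetic, with hypothetical names:

```java
// Hypothetical model of fixed-delay polling: the idle gap starts only
// after the previous poll finishes, so the effective period is
// pollDuration + idleBetweenPolls, not idleBetweenPolls alone.
public class FixedDelayModel {
    static long nextPollStart(long lastPollStart, long pollDurationMs, long idleBetweenPollsMs) {
        return lastPollStart + pollDurationMs + idleBetweenPollsMs;
    }

    public static void main(String[] args) {
        // a poll that itself takes 400 ms, with idleBetweenPolls = 250
        long next = nextPollStart(0, 400, 250);
        System.out.println(next); // 650 -- well above the configured 250 ms
    }
}
```

Under this model, the observed interval between polls would only approach 250 ms when each poll itself returns almost instantly.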

Spring Cloud AWS SQS SendTo annotation with property placeholder

Posted by 老子叫甜甜 on 2019-12-11 08:37:15
Question: This issue suggests that the @SendTo annotation supports property placeholders, but I can't get it to work. Here are some simplified code snippets of what I'm trying to do (easier than trying to explain in words). I'm on spring-cloud-aws version 1.2.1. This works:

@Component
public class InputQueueListener {

    @Value("${replyQueueProperty}")
    private String replyQueue;

    @Autowired
    private QueueMessagingTemplate messagingTemplate;

    @SqsListener(value = "${inputQueueProperty}", deletionPolicy =
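To make concrete what the placeholder is expected to expand to, here is a hypothetical stand-in for Spring's ${...} resolution (PlaceholderDemo and its methods are inventions of this example; the real work is done by Spring's Environment, not by code like this):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical stand-in for Spring's ${...} property resolution,
// only to show the expected expansion of "${replyQueueProperty}".
public class PlaceholderDemo {
    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}]+)\\}");

    static String resolve(String template, Map<String, String> props) {
        Matcher m = PLACEHOLDER.matcher(template);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            // unknown keys are left untouched in this sketch
            m.appendReplacement(out, Matcher.quoteReplacement(
                    props.getOrDefault(m.group(1), m.group(0))));
        }
        m.appendTail(out);
        return out.toString();
    }

    static String demo() {
        return resolve("${replyQueueProperty}", Map.of("replyQueueProperty", "reply-queue"));
    }

    public static void main(String[] args) {
        System.out.println(demo()); // reply-queue
    }
}
```

If @SendTo("${replyQueueProperty}") is not resolved on 1.2.1, one workaround consistent with the snippet above is to drop @SendTo and send the reply explicitly with messagingTemplate.convertAndSend(replyQueue, reply) inside the listener body, since replyQueue is already injected via @Value.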

How to process more than 10 concurrent messages from an AWS SQS FiFo queue using Spring Integration

Posted by 拥有回忆 on 2019-12-10 19:16:49
Question: I want to be able to process more than 10 SQS messages at a time using a Spring Integration workflow. Following this question ("How execute Spring integration flow in multiple threads to consume more Amazon SQS queue messages in parallel?"), the recommendation was to use an ExecutorChannel. I updated my code but still have the same symptoms. After making this update, my application requests 10 messages, processes those, and only after I make the call to amazonSQSClient.deleteMessage near the end of
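The 10-message ceiling is a real SQS constraint: a single ReceiveMessage call returns at most 10 messages. Going past 10 in-flight messages therefore means running several pollers concurrently and handing each message off to a worker pool, which is what routing through an ExecutorChannel accomplishes in Spring Integration. A plain-JDK sketch of the fan-out pattern (no Spring, no AWS; receiveBatch() is a local stand-in for the real SQS call):

```java
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

// Several pollers each fetch a batch of up to 10 "messages" and hand every
// message to a shared worker pool instead of processing on the poller thread.
public class ParallelPollers {

    static List<String> receiveBatch(int poller) { // stand-in for ReceiveMessage: max 10 per call
        return IntStream.range(0, 10)
                .mapToObj(i -> "poller" + poller + "-msg" + i)
                .collect(Collectors.toList());
    }

    static int run(int pollerCount) throws InterruptedException {
        ExecutorService pollers = Executors.newFixedThreadPool(pollerCount);
        ExecutorService workers = Executors.newFixedThreadPool(pollerCount * 10);
        AtomicInteger processed = new AtomicInteger();
        CountDownLatch done = new CountDownLatch(pollerCount * 10);
        for (int p = 0; p < pollerCount; p++) {
            final int poller = p;
            pollers.submit(() -> {
                for (String msg : receiveBatch(poller)) {
                    workers.submit(() -> {   // hand off, freeing the poller to fetch again
                        processed.incrementAndGet();
                        done.countDown();
                    });
                }
            });
        }
        done.await();
        pollers.shutdown();
        workers.shutdown();
        return processed.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run(3)); // 3 pollers x 10 messages = 30
    }
}
```

Given the symptom that nothing new is fetched until deleteMessage is called, it may also be worth checking the channel adapter's deletion policy and visibility-timeout settings, since those govern when SQS considers a message done.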

How to delete a file from S3 using Spring Cloud AWS?

Posted by 元气小坏坏 on 2019-12-10 18:55:52
Question: I could not find any API or documentation in Spring Cloud AWS for deleting an object from an S3 bucket. Can someone please let me know how to do it? The documentation only talks about reading content using ResourceLoader. The only option I see right now is to explicitly inject AmazonS3 and call deleteObject. 回答1: Spring's Resource API does not support the full lifecycle of operations. The two main interfaces are Resource and WritableResource; there is no API for deletion. As an alternative you
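The AWS SDK call itself is amazonS3.deleteObject(bucketName, key) on an injected com.amazonaws.services.s3.AmazonS3 client. Since the SDK is not needed to show the shape of the code, the sketch below uses a local stand-in (S3Operations and FakeS3 are inventions of this example, mirroring only the one method needed):

```java
import java.util.HashMap;
import java.util.Map;

// Spring's Resource/WritableResource API has no delete operation, so the
// usual approach is to call deleteObject(bucket, key) on the SDK client
// directly. FakeS3 is a Map-backed stand-in so the call shape can be shown
// without the AWS SDK on the classpath.
public class S3DeleteSketch {
    interface S3Operations {
        void deleteObject(String bucket, String key);
    }

    static class FakeS3 implements S3Operations {
        final Map<String, String> objects = new HashMap<>();
        void put(String bucket, String key, String body) { objects.put(bucket + "/" + key, body); }
        public void deleteObject(String bucket, String key) { objects.remove(bucket + "/" + key); }
    }

    public static void main(String[] args) {
        FakeS3 s3 = new FakeS3();
        s3.put("my-bucket", "reports/2019/q4.json", "{}");
        s3.deleteObject("my-bucket", "reports/2019/q4.json"); // same call shape as the real client
        System.out.println(s3.objects.isEmpty()); // true
    }
}
```

In a real application the AmazonS3 bean would be @Autowired (or built with AmazonS3ClientBuilder) and the deleteObject call made on it directly.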

Spring Boot with Spring Cloud AWS and CloudWatch metrics

Posted by 不羁的心 on 2019-12-07 04:11:22
Question: I would like to start using metrics in my Spring Boot app, and I would also like to publish them to Amazon CloudWatch. I know that with Spring Boot we can activate spring-actuator, which provides in-memory metrics and publishes them to the /metrics endpoint. I stumbled across Spring Cloud, which seems to have a library for periodically publishing these metrics to CloudWatch, but I have no clue how to set it up, and there are absolutely zero examples of how to use it. Could anyone explain the steps to enable metrics to be sent to CloudWatch?

Spring Boot with Spring Cloud AWS and CloudWatch metrics

Posted by 孤街浪徒 on 2019-12-05 08:03:17
Answer 1: You can check my article here: https://dkublik.github.io
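For the record, a setup commonly described for Spring Boot 1.x with spring-cloud-aws (treat the artifact and property names below as assumptions to verify against the spring-cloud-aws reference guide for your version): add org.springframework.cloud:spring-cloud-aws-actuator to the classpath, then configure a CloudWatch namespace, which activates the CloudWatch metric writer:

```yaml
# Assumed property names -- check the spring-cloud-aws reference guide.
cloud:
  aws:
    region:
      static: eu-west-1      # any valid AWS region
    cloudwatch:
      namespace: my-app      # presence of a namespace enables CloudWatch metric export
```

With that in place, the actuator metrics that appear under /metrics would also be pushed periodically to CloudWatch under the configured namespace.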

Spring Aws Kinesis Binder ProvisionedThroughputExceededException while consuming messages in Batch Mode

Posted by 让人想犯罪 __ on 2019-12-02 20:21:23
Question: I am using batch mode to pull records from a Kinesis stream. We are using the Spring AWS Kinesis binder. Most of the time we are not able to pull messages from the stream; only sometimes can we pull messages. My config looks like this:

spring:
  cloud:
    stream:
      kinesis:
        binder:
          locks:
            leaseDuration: 30
            readCapacity: 1
            writeCapacity: 1
          checkpoint:
            readCapacity: 1
            writeCapacity: 1
        bindings:
          InStreamGroupOne:
            consumer:
              listenerMode: batch
              idleBetweenPolls: 30000

Spring Aws Kinesis Binder ProvisionedThroughputExceededException while consuming messages in Batch Mode

Posted by ﹥>﹥吖頭↗ on 2019-12-02 08:00:18
I am using batch mode to pull records from a Kinesis stream. We are using the Spring AWS Kinesis binder. Most of the time we are not able to pull messages from the stream; only sometimes can we pull messages. My config looks like this:

spring:
  cloud:
    stream:
      kinesis:
        binder:
          locks:
            leaseDuration: 30
            readCapacity: 1
            writeCapacity: 1
          checkpoint:
            readCapacity: 1
            writeCapacity: 1
        bindings:
          InStreamGroupOne:
            consumer:
              listenerMode: batch
              idleBetweenPolls: 30000
              recordsLimit: 5000
              consumer-backoff: 1000
      bindings:
        InStreamGroupOne:
          group: in-stream-group
          destination:
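One plausible reading of the config (a guess, not a confirmed diagnosis): with readCapacity: 1 and writeCapacity: 1, the DynamoDB tables the binder uses for locks and checkpoints are very easy to throttle, and DynamoDB throttling surfaces as ProvisionedThroughputExceededException. Raising those capacities would look like this (the values are illustrative):

```yaml
spring:
  cloud:
    stream:
      kinesis:
        binder:
          locks:
            leaseDuration: 30
            readCapacity: 10     # raised from 1; illustrative values
            writeCapacity: 10
          checkpoint:
            readCapacity: 10
            writeCapacity: 10
```

The same exception can also come from the stream itself, since each Kinesis shard allows at most 5 read transactions and 2 MB per second; with recordsLimit: 5000 per poll, checking against the shard limits is worthwhile too.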