spring-integration-aws

Passing renamed file as input to Outbound adapter/gateway

a 夏天 submitted on 2019-12-11 05:45:29
Question: In a spring-boot-integration app, I wrote a custom locker to rename the original file before locking it (fileToLock.getAbsolutePath() + ".lock"), expecting the lock to prevent any other instance from processing the same file. However, the rename copies the contents instead of moving the file: an additional file named filename.lock is created with the contents, while the original file still exists at 0 KB with no content. The outbound gateway then takes the original, empty file as input.
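
A minimal sketch of the rename-based locking idea, assuming the standard Spring Integration FileLocker contract (lock/isLockable/unlock); the class below is illustrative, not the poster's code. The key point is to move the file atomically rather than copy it, so no zero-byte original is left behind:

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

import org.springframework.integration.file.FileLocker;

public class RenamingFileLocker implements FileLocker {

    @Override
    public boolean lock(File fileToLock) {
        Path source = fileToLock.toPath();
        Path locked = source.resolveSibling(fileToLock.getName() + ".lock");
        try {
            // ATOMIC_MOVE renames on the same file system instead of copying,
            // so the original file disappears rather than lingering at 0 KB
            Files.move(source, locked, StandardCopyOption.ATOMIC_MOVE);
            return true;
        }
        catch (IOException e) {
            // another instance has already renamed (locked) this file
            return false;
        }
    }

    @Override
    public boolean isLockable(File file) {
        return !file.getName().endsWith(".lock");
    }

    @Override
    public void unlock(File fileToUnlock) {
        // rename back or delete the .lock file once processing has finished
    }
}

Note that the message payload handed to the outbound gateway is still the original File object; the flow would also need, for example, a transformer that maps it to the renamed .lock path before the gateway consumes it.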

S3: Outbound adapter to place file in multiple target buckets

泄露秘密 submitted on 2019-12-02 10:11:12
I have a spring boot application where I am trying to place a file into multiple S3 buckets using a single S3 outbound adapter. I would like to know whether it is possible to place the file in multiple buckets with a single outbound adapter using spring-integration-aws itself (without using the AWS SDK directly). Any suggestion will be helpful.

S3 outbound adapter:

<int-aws:s3-outbound-channel-adapter id="filesS3Mover"
        channel="filesS3MoverChannel"
        transfer-manager="transferManager"
        bucket="${aws.s3.target.bucket}"
        key-expression="headers.targetsystem-folder/headers.file_name"
        command="UPLOAD">
</int-aws:s3-outbound-channel-adapter>
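
One possible direction, sketched under assumptions rather than taken from the thread: keep a single handler but resolve the bucket per message from a header. This assumes spring-integration-aws's S3MessageHandler accepts a TransferManager plus a SpEL bucket expression; the "target-bucket" header name is made up for illustration:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.expression.spel.standard.SpelExpressionParser;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.aws.outbound.S3MessageHandler;
import org.springframework.messaging.MessageHandler;

import com.amazonaws.services.s3.transfer.TransferManager;

@Configuration
public class S3MoverConfig {

    @Bean
    @ServiceActivator(inputChannel = "filesS3MoverChannel")
    public MessageHandler filesS3Mover(TransferManager transferManager) {
        SpelExpressionParser parser = new SpelExpressionParser();
        // the bucket is resolved per message from a (hypothetical) "target-bucket" header
        S3MessageHandler handler = new S3MessageHandler(
                transferManager,
                parser.parseExpression("headers['target-bucket']"));
        handler.setKeyExpression(
                parser.parseExpression("headers['targetsystem-folder'] + '/' + headers['file_name']"));
        return handler;
    }
}

Upstream, a recipient-list router or a splitter that re-emits the same file once per bucket (setting the target-bucket header each time) would let this one adapter serve all target buckets; the simpler alternative is one outbound adapter per bucket behind a publish-subscribe channel.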

Spring Aws Kinesis Binder ProvisionedThroughputExceededException while consuming messages in Batch Mode

﹥>﹥吖頭↗ submitted on 2019-12-02 08:00:18
I am using batch mode to pull records from a Kinesis stream with the Spring AWS Kinesis binder. Most of the time we are not able to pull messages from the stream; only occasionally do we manage to pull them. My config looks like this:

spring:
  cloud:
    stream:
      kinesis:
        binder:
          locks:
            leaseDuration: 30
            readCapacity: 1
            writeCapacity: 1
          checkpoint:
            readCapacity: 1
            writeCapacity: 1
        bindings:
          InStreamGroupOne:
            consumer:
              listenerMode: batch
              idleBetweenPolls: 30000
              recordsLimit: 5000
              consumer-backoff: 1000
      bindings:
        InStreamGroupOne:
          group: in-stream-group
          destination:
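
For context only (not code from the post), this is roughly what a batch-mode consumer for the binding above looks like: with listenerMode: batch the binder hands over a list of record payloads per poll instead of a single record. The binding interface name is assumed to match the "InStreamGroupOne" binding in the YAML, and the element type of the list depends on the content-type conversion in use:

import java.util.List;

import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.annotation.Input;
import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.messaging.SubscribableChannel;

@EnableBinding(BatchListener.KinesisStreams.class)
public class BatchListener {

    // binding interface assumed to match the "InStreamGroupOne" binding in the YAML above
    public interface KinesisStreams {
        @Input("InStreamGroupOne")
        SubscribableChannel inStreamGroupOne();
    }

    @StreamListener("InStreamGroupOne")
    public void onBatch(List<?> records) {
        // one invocation per poll; up to recordsLimit records arrive together
        records.forEach(this::process);
    }

    private void process(Object record) {
        // application logic; each element is one Kinesis record's payload
    }
}

With the configuration shown, the binder waits 30 seconds between polls (idleBetweenPolls) and requests up to 5000 records per poll (recordsLimit).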