Problem synchronizing 'bucket' to local directory with Spring Cloud DataFlow Streams

Submitted by 不打扰是莪最后的温柔 on 2021-01-29 09:28:41

Question


I'm following this case study, which is similar to mine: I want to receive thousands of files in an S3 bucket and launch a batch task to consume them.

But I'm getting:

Problem occurred while synchronizing 'bucket' to local directory; nested exception is org.springframework.messaging.MessagingException: Failed to execute on session; nested exception is com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied;

I already consume this bucket using the spring-cloud-starter-aws dependency in some other apps.

I know the message is pretty clear, but do I need specific permissions on the bucket to sync it like this with Spring Cloud Data Flow?
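For context on the permissions question: the S3 source's sync step first lists the bucket's contents and then downloads each object, so the credentials need both `s3:ListBucket` (on the bucket ARN) and `s3:GetObject` (on the object ARNs). An app that only ever calls GetObject on known keys can work against the same bucket while the sync fails with a 403 on the list call. A minimal IAM policy sketch along those lines, assuming the bucket name `mybucket` (this is an illustration of the permissions involved, not a verified fix for this exact setup):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowListForSync",
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::mybucket"
    },
    {
      "Sid": "AllowReadObjects",
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::mybucket/*"
    }
  ]
}
```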

My current Stream config is:

s3 
--spring.cloud.function.definition=s3Supplier,taskLaunchRequestFunction 
--file.consumer.mode=ref 
--s3.common.path-style-access=true 
--s3.supplier.remote-dir=mybucket 
--s3.supplier.local-dir=/scdf/infile 
--cloud.aws.credentials.accessKey=**** 
--cloud.aws.credentials.secretKey=**** 
--cloud.aws.region.static=**** 
--cloud.aws.stack.auto=false 
--task.launch.request.taskName=bill-composed-task 
| 
task-launcher-dataflow 
--spring.cloud.dataflow.client.server-uri=http://localhost:9393

Thanks in advance

来源:https://stackoverflow.com/questions/65646098/problem-synchronizing-bucket-to-local-directory-with-spring-cloud-dataflow-str
