AWS Batch job getting Access Denied on S3 despite user role

Submitted by 陌路散爱 on 2020-01-25 08:44:10

Question


I am deploying my first batch job on AWS. When I run my docker image in an EC2 instance, the script called by the job runs fine. I have assigned an IAM role to this instance to allow S3 access.

But when I run the same script as a job on AWS Batch, it fails due to Access Denied errors on S3 access. This is despite the fact that in the Job Definition, I assign an IAM role (created for Elastic Container Service Task) that has full S3 access.

If I launch my batch job with a command that does not access S3, it runs fine.

Since using an IAM role for the job definition seems not to be sufficient, how then do I grant S3 permissions within a Batch Job on AWS?
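For reference, this is roughly where that role gets attached; the sketch below uses the paws R SDK with placeholder names and ARNs (it is not my actual setup, just an illustration of where jobRoleArn lives in the job definition):

 # Illustrative sketch (paws R SDK, placeholder names/ARNs): the job's role is
 # supplied via containerProperties$jobRoleArn when registering the definition.
 library(paws)

 batch <- paws::batch()
 batch$register_job_definition(
   jobDefinitionName = "my-r-job",                     # placeholder
   type = "container",
   containerProperties = list(
     image      = "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-image:latest",  # placeholder
     vcpus      = 1,
     memory     = 2048,
     command    = list("Rscript", "my_script.R"),      # placeholder
     jobRoleArn = "arn:aws:iam::123456789012:role/my-batch-job-role"  # ECS-task role with S3 access
   )
 )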

EDIT

If I just run aws s3 ls interlinked as the job's command, that also works fine. What does not work is running this R script:

 library(aws.s3)               # relies on role-based credentials inside the container
 get_bucket("mybucket")[[1]]   # first object in the bucket listing

This fails with Access Denied.

So it seems the issue is either with the aws.s3 package or, more likely, my use of it.
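To narrow it down, I could inspect which credentials the package actually resolves inside the container. A debugging sketch (untested; it assumes aws.signature's locate_credentials() and the standard ECS credentials environment variable):

 # Untested debugging sketch: see which credentials aws.s3 would resolve.
 # aws.s3 delegates credential lookup to aws.signature; aws.ec2metadata
 # provides the instance/container metadata fallbacks.
 library(aws.signature)
 library(aws.ec2metadata)

 # Set by ECS/Batch when a job-role credentials endpoint is available;
 # empty if only the EC2 instance (compute environment) role is visible.
 Sys.getenv("AWS_CONTAINER_CREDENTIALS_RELATIVE_URI")

 creds <- locate_credentials()
 str(creds)   # the key, session token, and region actually being used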


Answer 1:


The problem turned out to be that I had IAM Roles specified for both my compute environment (more restrictive) and my jobs (less restrictive).

In this scenario (where role-based credentials are desired), the aws.s3 R package uses aws.signature and aws.ec2metadata to pull temporary credentials from a role. It picks up the compute environment role (an EC2 instance role), but not the job role.
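In principle, one could work around that by fetching the job-role credentials manually from the ECS container credentials endpoint and exporting them before calling aws.s3. This is only an untested sketch on my part, not something the packages document:

 # Untested workaround sketch: pull the Batch job-role credentials from the
 # ECS container credentials endpoint and export them, so aws.s3 uses the
 # job role instead of falling back to the instance (compute environment) role.
 library(httr)
 library(aws.s3)

 rel_uri <- Sys.getenv("AWS_CONTAINER_CREDENTIALS_RELATIVE_URI")
 creds   <- content(GET(paste0("http://169.254.170.2", rel_uri)), as = "parsed")

 Sys.setenv(
   AWS_ACCESS_KEY_ID     = creds$AccessKeyId,
   AWS_SECRET_ACCESS_KEY = creds$SecretAccessKey,
   AWS_SESSION_TOKEN     = creds$Token
 )

 get_bucket("mybucket")[[1]]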

My solution was just to grant the required S3 permissions to my compute environment's role.
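Scripted, that grant could look roughly like this (a sketch with the paws SDK; the role name and managed policy below are placeholders, not my actual values):

 # Hypothetical sketch: attach an S3 policy to the compute environment's
 # EC2 instance role so job containers inherit S3 access. Names are placeholders.
 library(paws)

 iam <- paws::iam()
 iam$attach_role_policy(
   RoleName  = "my-batch-instance-role",                        # compute environment's instance role
   PolicyArn = "arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess" # or a narrower custom policy
 )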



Source: https://stackoverflow.com/questions/53094271/aws-batch-job-getting-access-denied-on-s3-despite-user-role
