gsutil cannot copy to s3 due to authentication

Submitted on 2019-12-04 19:17:55

Question


I need to copy many (1000+) files from GCS to S3 to leverage an AWS Lambda function. I have edited ~/.boto.cfg and commented out the two AWS authentication parameters, but a simple gsutil ls s3://mybucket fails from either a GCE or an EC2 VM.

The error is: The authorization mechanism you have provided is not supported. Please use AWS4-HMAC-SHA256.

I use gsutil version 4.28, and the GCS and S3 buckets are located in US-CENTRAL1 and US East (Ohio) respectively, in case this is relevant.

I am clueless, as the AWS key is valid and I have enabled http/https. Downloading from GCS and uploading to S3 with Cyberduck on my laptop is impracticable (>230 GB).


Answer 1:


As per https://issuetracker.google.com/issues/62161892, gsutil v4.28 does support AWS v4 signatures; you enable them by adding a new [s3] section to ~/.boto, like:

[s3]
# Note that we specify region as part of the host, as mentioned in the AWS docs:
# http://docs.aws.amazon.com/general/latest/gr/rande.html#s3_region
host = s3.us-east-2.amazonaws.com
use-sigv4 = True

That section follows the boto config format, but it is currently not created by gsutil config, so it needs to be added explicitly for the target endpoint.
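With that section in place, the verification and copy steps might look like the following sketch (the bucket names and prefix are placeholders, not from the original question):

```shell
# Verify that SigV4 authentication now works against the S3 endpoint.
gsutil ls s3://my-s3-bucket

# Copy many files in parallel (-m) directly from GCS to S3;
# note the data still streams through the machine running gsutil.
gsutil -m cp -r "gs://my-gcs-bucket/prefix/*" s3://my-s3-bucket/prefix/
```

The -m flag matters for 1000+ files, since it parallelizes the per-object copies.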

For S3-to-GCS transfers, I would consider the more serverless Storage Transfer Service API instead.
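As a rough sketch of that alternative, newer gcloud releases expose the Storage Transfer Service on the command line; the bucket names and credentials file below are assumptions for illustration:

```shell
# One-off transfer job pulling objects from S3 into GCS.
# aws-creds.json holds the AWS access key ID and secret access key;
# the transfer runs inside Google's infrastructure, not on your VM.
gcloud transfer jobs create \
  s3://my-s3-bucket gs://my-gcs-bucket \
  --source-creds-file=aws-creds.json
```

This avoids streaming 230 GB through a VM at all, which is the main appeal over gsutil for large one-time moves.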




Answer 2:


I had a similar problem. Here is what I ended up doing on a GCE machine:

Step 1: Using gsutil, I copied the files from GCS to the GCE machine's local disk.
Step 2: Using the AWS CLI (aws s3 cp ...), I copied the files from the local disk to the S3 bucket.
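The two-step workaround can be sketched as follows (the staging directory and bucket names are placeholders):

```shell
# Step 1: pull everything from GCS onto the VM's local disk, in parallel.
mkdir -p /tmp/staging
gsutil -m cp -r "gs://my-gcs-bucket/*" /tmp/staging/

# Step 2: push the staged files to S3 with the AWS CLI.
aws s3 cp /tmp/staging/ s3://my-s3-bucket/ --recursive
```

Make sure the VM's disk is large enough for the full data set (>230 GB in the question's case) before staging it locally.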

The above approach has worked reliably for me. I also tried gsutil rsync, but it failed unexpectedly.

Hope this helps.



Source: https://stackoverflow.com/questions/47929964/gsutil-cannot-copy-to-s3-due-to-authentication
