Google Cloud Functions - how do I authenticate to AWS S3 bucket?

Submitted by 帅比萌擦擦* on 2020-05-31 09:15:45

Question


I am trying to get a Google Cloud Function in Python 3.7 to take a file from Google Cloud Storage and upload it into AWS S3. On the command line, I would authenticate with awscli and then use the gsutil cp command to copy the file across. I have translated this process into Python as:

import subprocess

def GCS_to_s3(arg1, arg2):
    subprocess.call("aws configure set aws_access_key_id AKIA********", shell=True)
    subprocess.call("aws configure set aws_secret_access_key EgkjntEFFDVej", shell=True)
    subprocess.call("gsutil cp gs://test_bucket/gcs_test.csv s3://mybucket", shell=True)

The requirements.txt is:

awscli
google-cloud-storage

The function deploys and runs successfully, but the file never appears in the S3 bucket.

What would be the best way of writing such a function?


Answer 1:


You'll probably want to use the boto3 Python package instead, since the command-line AWS tools aren't available or installable for Cloud Functions. There are a number of ways to configure the credentials as well.
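As a minimal sketch, this is roughly what a boto3-based function could look like. The bucket and object names are taken from the question's example; reading the AWS keys from environment variables set on the function is an assumption, not something the original code does:

import os

import boto3
from google.cloud import storage

def gcs_to_s3(event, context):
    # Download the object from Google Cloud Storage into memory.
    gcs_client = storage.Client()
    blob = gcs_client.bucket("test_bucket").blob("gcs_test.csv")
    data = blob.download_as_bytes()

    # Upload it to S3; the credentials are assumed to be supplied as
    # environment variables rather than hard-coded in the source.
    s3 = boto3.client(
        "s3",
        aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    )
    s3.put_object(Bucket="mybucket", Key="gcs_test.csv", Body=data)

With this approach, requirements.txt would list boto3 and google-cloud-storage instead of awscli.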



Source: https://stackoverflow.com/questions/52693824/google-cloud-functions-how-do-i-authenticate-to-aws-s3-bucket
