How to download all files from an S3 bucket to a local Linux server, passing the bucket and local folder values at runtime, using Python


Question


I am writing a script to download files from an S3 bucket to a local Linux folder. To achieve that, I have to use dynamic values for the bucket and the folder the files should be downloaded to.

I know how to do this with

aws s3 cp s3://bucket /linux/local/folder --recursive --p alusta

But how do I accept the bucket value at runtime?

dwn_cmd = "aws s3 cp s3://bucket/name/" + str(year_name) + '/' + str(month_name)

folder_path = "/local/linux/folder/" + folder_name

#subprocess.call(['aws','s3','cp',dwn_cmd,folder_path,'--recursive','--p', 'alusta'])

This shows an error saying that subprocess needs the S3 bucket path and the local folder path; I think it is not picking up the paths. If I hard-code the paths it works, but not like this. How can I achieve my result?


Answer 1:


With

dwn_cmd = "aws s3 cp s3://bucket/name/" + "2019" + '/' + "June"
folder_path = "/local/linux/folder/" + "test"

You will be calling

subprocess.call(['aws', 's3', 'cp',
                 "aws s3 cp s3://bucket/name/2019/June",
                 "/local/linux/folder/test",
                 '--recursive', '--p', 'alusta'])

Delete the aws s3 cp prefix from dwn_cmd:

dwn_cmd = "s3://bucket/name/" + "2019" + '/' + "June"

Note: do not use subprocess.call([dwn_cmd, folder_path, '--recursive', '--p', 'alusta']) with dwn_cmd still containing the prefix; that is wrong too. The spaces between aws, s3, and cp would be treated as part of the command name, so the system would look for the command in a subdirectory of a directory literally named aws s3 cp s3: (with the three spaces included).
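Putting the answer together, here is a minimal runnable sketch. Reading the year, month, and folder from sys.argv is an assumption for illustration (the question does not say how the runtime values arrive), and '--p alusta' is carried over verbatim from the question:

import subprocess
import sys

# Hypothetical invocation: python download_s3.py 2019 June test
year_name, month_name, folder_name = sys.argv[1], sys.argv[2], sys.argv[3]

# Only the S3 URI goes into dwn_cmd; 'aws', 's3', and 'cp' are passed as
# separate list elements so the OS can find the 'aws' executable.
dwn_cmd = "s3://bucket/name/" + str(year_name) + '/' + str(month_name)
folder_path = "/local/linux/folder/" + folder_name

# '--p alusta' is copied from the question as-is (presumably a profile option).
subprocess.call(['aws', 's3', 'cp', dwn_cmd, folder_path,
                 '--recursive', '--p', 'alusta'])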



Source: https://stackoverflow.com/questions/56428313/how-to-download-all-files-from-s3-bucket-to-local-linux-server-while-passing-buc
