Copy multiple files from S3 bucket

Submitted by 爷,独闯天下 on 2020-07-16 22:07:00

Question


I am having trouble downloading multiple files from AWS S3 buckets to my local machine.

I have a list of all the filenames I want to download, and I do not want any others. How can I do that? Is there any kind of loop in the AWS CLI that would let me iterate over them?

There are a couple hundred files to download, so it does not seem possible to use a single command that takes all the filenames as arguments.


Answer 1:


Here is a bash script that reads all the filenames from a file filename.txt and downloads each object:

#!/bin/bash
set -e
while read -r line
do
  # quote the key in case it contains spaces or special characters
  aws s3 cp "s3://bucket-name/$line" dest-path/
done < filename.txt
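
This assumes filename.txt contains one object key per line (the keys below are placeholders), for example:

file1.csv
reports/2020/file2.csv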



Answer 2:


You can also use the --recursive option, as described in the documentation for the cp command. It copies all objects under the specified prefix recursively.

Example:

aws s3 cp s3://folder1/folder2/folder3 . --recursive

will grab all files under folder1/folder2/folder3 and copy them to the local directory.




Answer 3:


You might want to use sync instead of cp. The following will download/sync only the files with the .txt extension into your local folder; unlike cp, sync skips files that are already present locally and unchanged, so it is safe to re-run:

aws s3 sync --exclude="*" --include="*.txt" s3://mybucket/mysubbucket .



Answer 4:


As per the docs, you can use --include and --exclude filters with s3 cp as well. So you can do something like this:

aws s3 cp s3://bucket/folder/ . --recursive --exclude="*" --include="2017-12-20*"

Make sure you get the order of the --exclude and --include filters right, as that can change the whole meaning.
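
To illustrate (same placeholder bucket and prefix as above): the filters are applied in the order given and later filters take precedence, so reversing them downloads nothing:

# downloads only the keys starting with 2017-12-20
aws s3 cp s3://bucket/folder/ . --recursive --exclude="*" --include="2017-12-20*"

# downloads nothing: the trailing --exclude="*" overrides the earlier --include
aws s3 cp s3://bucket/folder/ . --recursive --include="2017-12-20*" --exclude="*"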




Answer 5:


I tried all of the above. Not much joy. Finally, I adapted @rajan's reply into a one-liner:

for file in whatever*.txt; do aws s3 cp "$file" s3://somewhere/in/my/bucket/; done
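
Note that this one-liner copies matching local files up to S3. For the download direction asked about in the question, a rough sketch along the same lines (the bucket path is a placeholder, and this assumes the keys contain no spaces):

for key in $(aws s3 ls s3://somewhere/in/my/bucket/ | awk '{print $4}' | grep '\.txt$'); do
  aws s3 cp "s3://somewhere/in/my/bucket/$key" .
done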



Answer 6:


I got the problem solved; it may be a little clumsy, but it works.

Using Python, I write the AWS download commands, one per line, into a single .sh file, then execute it in the terminal.
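
A minimal sketch of what such a generated script might look like (the bucket name, keys, and destination path are placeholders):

#!/bin/bash
# each aws s3 cp line below would be emitted by the generating script, one per file
aws s3 cp s3://bucket-name/file1.csv dest-path/
aws s3 cp s3://bucket-name/reports/file2.csv dest-path/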



Source: https://stackoverflow.com/questions/38021661/copy-multiple-files-from-s3-bucket
