AWS S3: download all files inside a specific folder - Using PHP SDK

Anonymous (unverified), submitted on 2019-12-03 00:59:01

Question:

Can anyone help me with this?

I want to download all files from a folder inside my bucket to a directory on my computer with the same name.

Let's say there is a bucket named "ABC" with a folder "DEF" inside it, and that folder contains multiple files.

I want to download them into my project folder "/opt/lampp/htdocs/porject/files/download/", where a "DEF" folder also exists.

Can anyone help me with the code for this?

Thanks in advance.

=============

ERROR:

Fatal error: Uncaught exception 'UnexpectedValueException' with message 'RecursiveDirectoryIterator::__construct() [recursivedirectoryiterator.__construct]: Unable to find the wrapper "s3" - did you forget to enable it when you configured PHP?' in /opt/lampp/htdocs/demo/amazon-s3/test.php:21 Stack trace: #0 /opt/lampp/htdocs/demo/amazon-s3/test.php(21): RecursiveDirectoryIterator->__construct('s3://bucketname/folder...') #1 {main} thrown in /opt/lampp/htdocs/demo/amazon-s3/test.php on line 21

Answer 1:

Pretty straightforward using the Amazon S3 stream wrapper:

include dirname(__FILE__) . '/aws.phar';

$baseDirectory = dirname(__FILE__) . '/' . $myDirectoryName;

$client = \Aws\S3\S3Client::factory(array(
    'key'    => "<my key>",
    'secret' => "<my secret>"
));

// Makes s3:// paths usable with PHP's filesystem functions
$client->registerStreamWrapper();

$bucket = 's3://mys3bucket/' . $myDirectoryName;

$iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($bucket),
    RecursiveIteratorIterator::SELF_FIRST
);

foreach ($iterator as $name => $object) {
    if ($object->getFilename() !== '.' && $object->getFilename() !== '..') {
        $relative = substr($name, strlen($bucket) + 1);
        $target = $baseDirectory . '/' . $relative;
        if (!file_exists($target)) {
            if ($object->isDir()) {
                mkdir($target, 0777, true);
            } else {
                file_put_contents($target, file_get_contents($name));
            }
        }
    }
}


Answer 2:

Mark's answer is totally valid, but there is also an even easier way to do this with the AWS SDK for PHP using the downloadBucket() method. Here's an example (assuming $client is an instance of the S3 client):

$bucket = 'YOUR_BUCKET_NAME';
$directory = 'YOUR_FOLDER_OR_KEY_PREFIX_IN_S3';
$basePath = 'YOUR_LOCAL_PATH/';

$client->downloadBucket($basePath . $directory, $bucket, $directory);

The cool thing about this method is that it queues up only the files that don't already exist (or have been modified) in the local directory, and attempts to download them in parallel, in order to speed up the overall download time. There is a 4th argument to the method (see the link) that accepts other options, such as how many parallel downloads you want to happen at a time.
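As a sketch, passing that options array might look like the following. Note this assumes the AWS SDK for PHP v2 `downloadBucket()` signature; the `concurrency` and `debug` option names are taken from that version and should be checked against the documentation for the SDK version you actually use:

```php
<?php
// Hedged sketch, not runnable without valid AWS credentials and an
// existing $client (an \Aws\S3\S3Client instance as in the answer above).
// Option names assume the SDK v2 downloadBucket() API; verify them
// against your SDK version's documentation.
$client->downloadBucket(
    $basePath . $directory,    // local target directory
    $bucket,                   // source bucket
    $directory,                // key prefix within the bucket
    array(
        'concurrency' => 20,   // max number of parallel downloads
        'debug'       => true, // print information about each transfer
    )
);
```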


