Amazon S3: copy a directory to another directory

Submitted by 假如想象 on 2019-12-04 22:57:31

S3 is not a filesystem, it's an object store. Folders don't actually exist in any tangible sense; a folder is just something you can call a shared prefix. Put another way, if you create path/to/one and path/to/two, it doesn't also cause path and path/to to exist. If you see them, that's because some component took a list of objects, split their keys on /, and decided to display that list as a hierarchy.
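To make this concrete, here is a small, self-contained Scala sketch (the keys are made up for illustration) showing that a "folder view" is just a grouping computed from flat keys:

```scala
// Hypothetical flat keys: this is all S3 actually stores.
val keys = List("path/to/one", "path/to/two", "other/file")

// A "folder" listing is derived by splitting keys on "/" and grouping;
// "path" and "path/to" are never stored as objects themselves.
val topLevelFolders = keys.groupBy(_.split("/").head)
```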

You want to "duplicate a folder into another folder". Rephrasing this into S3 terms, you want to "duplicate all objects with the same prefix into objects with a different prefix". Saying it that way makes the method clear: get a list of objects with the one prefix, then copy each of them.
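The per-object key rewrite can be sketched as a pure function (the name `targetKey` is illustrative, not part of any SDK): strip the source prefix from each listed key and prepend the target prefix, then issue a copy for each old/new key pair.

```scala
// Illustrative helper (not an SDK call): map a key under srcPrefix
// to the corresponding key under dstPrefix.
def targetKey(key: String, srcPrefix: String, dstPrefix: String): String =
  dstPrefix + "/" + key.stripPrefix(srcPrefix + "/")
```

Each listed object would then be copied from its old key to `targetKey(key, srcPrefix, dstPrefix)`.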

One way to do it is to list the objects and copy each one individually. Another way is to use s3fs (s3fs-fuse), which mounts your S3 bucket as a local directory; you can then apply ordinary commands like 'mv' to move the files.

Here is some PHP code taken straight from Amazon's documentation. It copies a single source object three times to a target bucket; to copy a whole prefix, change it so that it loops over each key returned by a list operation and adds a CopyObject command to the batch.

<?php

// Include the AWS SDK using the Composer autoloader.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$sourceBucket = '*** Your Source Bucket Name ***';
$sourceKeyname = '*** Your Source Object Key ***';
$targetBucket = '*** Your Target Bucket Name ***';

// Instantiate the client.
$s3 = S3Client::factory();

// Copy an object.
$s3->copyObject(array(
    'Bucket'     => $targetBucket,
    'Key'        => "{$sourceKeyname}-copy",
    'CopySource' => "{$sourceBucket}/{$sourceKeyname}",
));

// Perform a batch of CopyObject operations.
$batch = array();
for ($i = 1; $i <= 3; $i++) {
    $batch[] = $s3->getCommand('CopyObject', array(
        'Bucket'     => $targetBucket,
        'Key'        => "{$sourceKeyname}-copy-{$i}",
        'CopySource' => "{$sourceBucket}/{$sourceKeyname}",
    ));
}
try {
    $successful = $s3->execute($batch);
    $failed = array();
} catch (\Guzzle\Service\Exception\CommandTransferException $e) {
    $successful = $e->getSuccessfulCommands();
    $failed = $e->getFailedCommands();
}

Scala code (copying between folders within one bucket, AWS SDK for Java v1):

import com.amazonaws.{AmazonClientException, AmazonServiceException}
import com.amazonaws.services.s3.transfer.{Copy, TransferManager, TransferManagerBuilder}

// Assumes an AmazonS3 client `s3` and a logger `log` are already in scope.
def copyFolders(bucketName: String, srcFolder: String, targetFolder: String): Unit = {
  import scala.collection.JavaConversions._
  val transferManager: TransferManager = TransferManagerBuilder.standard.build
  try {
    // Note: listObjects returns at most 1000 keys per call; paginate for larger folders.
    for (file <- s3.listObjects(bucketName, s"$srcFolder/").getObjectSummaries) {
      val fileName = file.getKey.replace(s"$srcFolder/", "")
      if (!fileName.isEmpty) {
        val transferProcess: Copy = transferManager.copy(bucketName, file.getKey,
          bucketName, s"$targetFolder/$fileName")
        log.info(s"Old key = ${file.getKey}")
        log.info(s"New key = $targetFolder/$fileName")
        transferProcess.waitForCompletion()
      }
    }
  } catch {
    case e: AmazonServiceException =>
      log.error(e.getErrorMessage, e)
      System.exit(1)
    case e: AmazonClientException =>
      log.error("Amazon client error: " + e.getMessage, e)
      System.exit(1)
    case e: InterruptedException =>
      log.error("Transfer interrupted: " + e.getMessage, e)
      System.exit(1)
  }
}

Usage:

copyFolders("mybucket", "somefolder/srcfolder", "somefolder/targetfolder")