Laravel 5: How do you copy a local file to Amazon S3?


Question


I'm writing code in Laravel 5 to periodically backup a MySQL database. My code thus far looks like this:

    $filename = 'database_backup_'.date('G_a_m_d_y').'.sql';
    $destination = storage_path() . '/backups/';

    $database = \Config::get('database.connections.mysql.database');
    $username = \Config::get('database.connections.mysql.username');
    $password = \Config::get('database.connections.mysql.password');

    $sql = "mysqldump $database --password=$password --user=$username --single-transaction >$destination" . $filename;

    $result = exec($sql, $output); // TODO: check $result

    // Copy database dump to S3

    $disk = \Storage::disk('s3');

    // ????????????????????????????????
    //  What goes here?
    // ????????????????????????????????

I've seen solutions online that would suggest I do something like:

$disk->put('my/bucket/' . $filename, file_get_contents($destination . $filename));

However, for large files, isn't it wasteful to use file_get_contents()? Are there any better solutions?


Answer 1:


Looking at the documentation, the only way is the put method, which needs the file contents. There is no method to copy a file between two filesystems, so the solution you gave is probably the only one at the moment.

If you think about it, when copying a file from the local filesystem to S3 you ultimately need the file contents in order to put them in S3, so in my opinion it isn't really wasteful.
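
A minimal sketch of that, continuing from the question's code (the backups/ key prefix is illustrative):

    $disk->put('backups/' . $filename, file_get_contents($destination . $filename));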




Answer 2:


There is a way to copy files without needing to load the file contents into memory.

You will need to import the following:

use League\Flysystem\MountManager;

Now you can copy the file like so:

$mountManager = new MountManager([
    's3' => \Storage::disk('s3')->getDriver(),
    'local' => \Storage::disk('local')->getDriver(),
]);
$mountManager->copy('local://path/to/file.txt', 's3://path/to/output/file.txt');
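
Worth noting: in Flysystem 1.x (the version Laravel 5 ships with), MountManager::copy() reads the source via readStream() and writes the destination via putStream(), so the file is streamed rather than loaded into memory in full. Applied to the question's dump, assuming it was written inside the local disk's root (the backups/ prefix is illustrative):

    $mountManager->copy('local://backups/' . $filename, 's3://backups/' . $filename);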



Answer 3:


You can always use a file resource to stream the file (advisable for large files) by doing something like this:

Storage::disk('s3')->put('my/bucket/' . $filename, fopen('path/to/local/file', 'r'));

An alternative suggestion uses Laravel's Storage facade to read the file as a stream and write it back out as a stream. The basic idea is something like this:

    $inputStream = Storage::disk('local')->getDriver()->readStream('path/to/file');
    // The destination must be a path relative to the s3 disk's root;
    // putStream() applies the disk's path prefix itself.
    Storage::disk('s3')->getDriver()->putStream('my/bucket/' . $filename, $inputStream);
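
Applied to the question's backup, a minimal end-to-end sketch (the backups/ key prefix and the explicit fclose() are assumptions of mine; Laravel's put() accepts a stream resource):

    // Open the local dump read-only and stream it to S3.
    $stream = fopen($destination . $filename, 'r');

    if ($stream === false) {
        throw new \RuntimeException('Could not open the dump file for reading.');
    }

    \Storage::disk('s3')->put('backups/' . $filename, $stream);

    // Flysystem may close the stream itself; close it only if still open.
    if (is_resource($stream)) {
        fclose($stream);
    }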



Answer 4:


You can try this code:

$contents = Storage::get($file);
Storage::disk('s3')->put($newfile, $contents);

According to the Laravel documentation, this is the easiest way I found to copy data between two disks.




Answer 5:


I solved it in the following way:

$contents = \File::get($destination);
\Storage::disk('s3')->put($s3Destination, $contents);

Sometimes $contents = Storage::get($file); returns nothing because the Storage facade resolves paths relative to the disk's root. In that case, read the file by its absolute path with Laravel's \File facade instead of going through Storage.
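
For example, a minimal sketch reading the dump by its absolute path (the backups/ paths are illustrative):

    $contents = \File::get(storage_path('backups/' . $filename));
    \Storage::disk('s3')->put('backups/' . $filename, $contents);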



Source: https://stackoverflow.com/questions/29527611/laravel-5-how-do-you-copy-a-local-file-to-amazon-s3
