How to transfer files between AWS S3 and AWS EC2


I am using an AWS EC2 instance. On this instance I'm generating some files; these operations are done by user data.

Now I want to store those files on S3 by writing code.

9 Answers
  • 2020-12-23 09:43

    On the AWS CLI I used the following command to copy a zip file from an EC2 instance to S3:

    aws s3 cp file-name.zip s3://bucket-name/
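
    The same command works in the other direction as well; a minimal sketch for pulling the object back down to the instance, reusing the bucket and file names from the example above:

    # download the object from the bucket into the current directory
    aws s3 cp s3://bucket-name/file-name.zip ./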
    
  • 2020-12-23 09:48

    Use s3cmd for that:

    s3cmd get s3://AWS_S3_Bucket/dir/file
    

    See http://s3tools.org/s3cmd for installation instructions.

    This works for me...
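
    Since the question goes the other way (from EC2 up to S3), the matching upload command is s3cmd put; a minimal sketch reusing the bucket path above with a hypothetical local file:

    # upload a locally generated file from the instance into the bucket
    s3cmd put /tmp/results.zip s3://AWS_S3_Bucket/dir/results.zip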

  • 2020-12-23 09:51

    All attempts to mount S3 as a pseudo filesystem are problematic. It's an object store, not a block device. If you must mount it because you have legacy code that needs local file paths, try goofys. It's about 50x faster than s3fs. https://github.com/kahing/goofys

    s3cmd is a bit long in the tooth these days. The AWS CLI is a better option: the syntax is a bit less convenient, but it's one less tool you need to keep around.

    If you can stick to HTTP access, it'll make your life easier in the long run.
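
    If you do go the goofys route, a minimal mount sketch might look like this (assuming the goofys binary is installed, a hypothetical bucket name, and credentials supplied via an instance role or ~/.aws/credentials):

    # create a mount point and mount the bucket as a local directory
    mkdir -p /mnt/my-bucket
    goofys my-bucket /mnt/my-bucket

    # the bucket's objects now appear as files under the mount point
    ls /mnt/my-bucket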

  • 2020-12-23 09:51

    I'm assuming you need to copy from a new instance to S3. First, create an IAM role so you don't need to run aws configure; that way everything works at launch time. Second, install the CLI and define your copy job with the AWS CLI in user data. The example below is for Ubuntu 18. Assign the IAM role to your instance.

    Userdata:

    #!/bin/bash
    apt-get update -y
    apt-get install awscli -y
    # copy the generated files to the bucket (replace the path and bucket with your own)
    aws s3 cp /path/of/data s3://destination-bucket/ --recursive  # plus any other options
    

    To create an IAM role:

    1. Go to the IAM console at https://console.aws.amazon.com/iam/
    2. In the left pane, select Roles, then click Create role.
    3. For Select type of trusted entity, choose AWS service, select EC2, then click Next: Permissions.
    4. For Attach permissions policies, choose the AWS managed policies that contain the required permissions, or create a custom policy.
    5. To create a custom policy, click Create policy, choose a service, type S3 in the Find a service box, click S3, and select actions (all, or read + write plus any others you need).
    6. Click Resources and select the resources (you can allow all resources or limit access to a specific bucket with its ARN).
    7. Click Next: Review policy, enter a name and description, and click Create policy.
    8. Return to the Create role page, click refresh, filter policies by the name you assigned, and select the policy.
    9. Click Next: Tags and add any required tags.
    10. On the Review page, enter a name and description for the role and click Create role.
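
    If you prefer to script this, the same setup can be done from the AWS CLI. A rough sketch, assuming a hypothetical role name ec2-s3-role, a trust policy JSON you have written for EC2 (ec2-trust.json), the AWS managed AmazonS3FullAccess policy (swap in a narrower custom policy if you prefer), and a placeholder instance ID:

    # create the role with a trust policy that lets EC2 assume it
    aws iam create-role --role-name ec2-s3-role \
        --assume-role-policy-document file://ec2-trust.json

    # attach an S3 permissions policy (a scoped custom policy is safer than full access)
    aws iam attach-role-policy --role-name ec2-s3-role \
        --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess

    # EC2 attaches roles through instance profiles, so wrap the role in one
    aws iam create-instance-profile --instance-profile-name ec2-s3-profile
    aws iam add-role-to-instance-profile --instance-profile-name ec2-s3-profile \
        --role-name ec2-s3-role

    # attach the profile to the (placeholder) instance
    aws ec2 associate-iam-instance-profile --instance-id i-0123456789abcdef0 \
        --iam-instance-profile Name=ec2-s3-profile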

    References

    • https://aws.amazon.com/premiumsupport/knowledge-center/ec2-instance-access-s3-bucket/
    • https://aws.amazon.com/blogs/infrastructure-and-automation/amazon-s3-authenticated-bootstrapping-in-aws-cloudformation/
    • https://optimalbi.com/aws-tips-and-tricks-moving-files-from-s3-to-ec2-instance/
  • 2020-12-23 09:53

    Install the s3cmd package:

    yum install s3cmd
    

    or

    sudo apt-get install s3cmd
    

    depending on your OS. Then copy data with this:

    s3cmd get s3://tecadmin/file.txt
    

    s3cmd ls can also list the files in a bucket; see the sketch below.
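
    A short sketch of the listing commands, assuming s3cmd --configure has been run once to store credentials and reusing the bucket name from the example above:

    # one-time interactive setup of access key, secret key, and defaults
    s3cmd --configure

    # list all buckets, then the contents of one bucket
    s3cmd ls
    s3cmd ls s3://tecadmin/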

    For more details, see the s3cmd documentation.

  • 2020-12-23 09:57

    There are a number of ways to send files to S3. I've listed them below along with installation and documentation where relevant.

    • S3CMD: (http://s3tools.org/s3cmd) You can install this on Debian/Ubuntu easily via apt-get install s3cmd, then run it from the command line. You could incorporate this into a bash script or your program.

    • S3FS: (http://www.pophams.com/blog/howto-setups3fsonubuntu1104x64 and https://code.google.com/p/s3fs/wiki/InstallationNotes) ... This mounts an S3 bucket so that it looks just like a local disk. It takes a little more effort to set up, but once the disk is mounted, you don't need to do anything special to get the files into your bucket.

    • If you use a CMS (let's use Drupal as an example) you may have the option of using a module to handle access to your bucket, e.g. http://drupal.org/project/storage_api

    • Finally, you can use a programming language implementation to handle all the logic yourself; for PHP you can start with http://undesigned.org.za/2007/10/22/amazon-s3-php-class and see the documentation at http://undesigned.org.za/2007/10/22/amazon-s3-php-class/documentation

    An example of the PHP implementation:

    <?php
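        // Assumes the S3.php class from the project linked above has been
        // included and your AWS credentials configured per its documentation.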
    
        // Simple PUT:
        if (S3::putObject(S3::inputFile($file), $bucket, $uri, S3::ACL_PRIVATE)) {
            echo "File uploaded.";
        } else {
            echo "Failed to upload file.";
        }
    
    ?>
    

    An example of s3cmd:

    s3cmd put my.file s3://bucket-url/my.file
    

    Edit

    Another option worth mentioning is the AWS CLI (http://aws.amazon.com/cli/). It is widely available; for example, it's already included on Amazon Linux and can be installed via Python's pip on many other systems, including Linux and Windows.
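
    A minimal install-and-verify sketch, assuming Python and pip are already present:

    # install the AWS CLI with pip and confirm it is on the PATH
    pip install awscli
    aws --version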

    http://docs.aws.amazon.com/cli/latest/reference/s3/index.html

    Available commands: cp, ls, mb, mv, rb, rm, sync, website

    http://docs.aws.amazon.com/cli/latest/reference/s3api/index.html for interacting with S3
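
    For the original question (pushing generated files from the instance to a bucket), sync is often the most convenient of those commands; a sketch with hypothetical paths:

    # upload a local directory to S3, copying only new or changed files
    aws s3 sync /var/output s3://my-bucket/output/

    # and the reverse direction, from S3 back to the instance
    aws s3 sync s3://my-bucket/output/ /var/output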
