How to scp to Amazon S3?

馋奶兔 submitted on 2019-12-12 07:31:20

Question


I need to send backup files of ~2TB to S3. I guess the most hassle-free option would be the Linux scp command (I have difficulty with s3cmd and don't want an overkill Java/RoR solution).

However, I am not sure whether that is possible: how would I use S3's private and public keys with scp, and what would my destination IP/URL/path be?

I appreciate your hints.


Answer 1:


You can't use scp: S3 doesn't expose an SSH endpoint, only an HTTP API.

The quickest way, if you don't mind spending money, is probably just to send it to them on a disk and they'll put it up there for you. See their Import/Export service.




Answer 2:


As of 2015, SCP/SSH is not supported (and probably never will be, for the reasons mentioned in the other answers).

Official AWS tools for copying files to/from S3

  1. Command-line tool (pip3 install awscli). Note that credentials need to be specified; I prefer to set them via environment variables rather than a file: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY (see the sketch after this list).

    aws s3 cp /tmp/foo/ s3://bucket/ --recursive --exclude "*" --include "*.jpg"
    
    • http://docs.aws.amazon.com/cli/latest/reference/s3/index.html

    and an rsync-like command:

    aws s3 sync . s3://mybucket
    
    • http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
  2. Web interface:

    • https://console.aws.amazon.com/s3/home?region=us-east-1
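
For reference, a minimal sketch of option 1 with environment-variable credentials (the key values, bucket name, and paths below are placeholders, not real values):

    # Placeholder credentials; substitute your own (and avoid committing them to scripts).
    export AWS_ACCESS_KEY_ID=AKIA...
    export AWS_SECRET_ACCESS_KEY=...
    export AWS_DEFAULT_REGION=us-east-1

    # One-off copy of a single backup file:
    aws s3 cp /backups/dump.tar.gz s3://my-backup-bucket/dump.tar.gz

    # For a ~2TB backup set, sync is handy: re-running it skips files
    # that already match, and the CLI does multipart uploads automatically.
    aws s3 sync /backups/ s3://my-backup-bucket/backups/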

Non-AWS methods

Any other solution depends on third-party executables (e.g. botosync, jungledisk...), which can be great as long as they are supported. But third-party tools come and go as the years go by, and your scripts will have a shorter shelf life. For example:

  • https://github.com/ncw/rclone
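
A hedged sketch with rclone, assuming a remote named s3remote has already been set up via rclone config (the bucket and paths are placeholders):

    # One-time interactive setup; choose "s3" as the storage type:
    rclone config

    # Copy a local tree into the bucket; rclone checksums and retries by default:
    rclone copy /local/backups s3remote:my-backup-bucket/backups --progress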

EDIT: Actually, AWS CLI is based on botocore:

https://github.com/boto/botocore

So botosync deserves a bit more respect as an elder statesman than I perhaps gave it.




Answer 3:


Here's just the thing for this: boto-rsync. From any Linux box, install boto-rsync and then use it to transfer /local/path/ to your_bucket/remote/path/:

boto-rsync -a your_access_key -s your_secret_key /local/path/ s3://your_bucket/remote/path/

The paths can also be files.

For an S3-compatible provider other than AWS, use --endpoint:

boto-rsync -a your_access_key -s your_secret_key --endpoint some.provider.com /local/path/ s3://your_bucket/remote/path/



Answer 4:


Why don't you scp it to an EC2 instance with an EBS volume attached, and then run s3cmd from there? As long as your EBS volume and S3 bucket are in the same region, you'll only pay the inbound data-transfer charge once (from your network to the EBS volume). A sketch of the two hops follows below.

I've found that once you're inside AWS's network, s3cmd is much more reliable and the data-transfer rate is far higher than uploading directly to S3.
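
A minimal sketch of the two hops, assuming an instance reachable at ec2-host with the volume mounted at /mnt/ebs (the host, key file, and bucket name are placeholders):

    # Hop 1: copy the backup onto the EC2 instance.
    scp -i key.pem /backups/dump.tar.gz ec2-user@ec2-host:/mnt/ebs/

    # Hop 2: from the instance, push it to S3 (run `s3cmd --configure` once first).
    s3cmd put /mnt/ebs/dump.tar.gz s3://my-backup-bucket/dump.tar.gz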




Answer 5:


There is an amazing tool called DragonDisk. It goes beyond plain copying: it also works as a sync tool.

http://www.s3-client.com/

The guide to setting up Amazon S3 is provided there; once configured, you can either copy-paste files from your local machine to S3 or set up an automatic sync. The user interface is very similar to WinSCP or FileZilla.




Answer 6:


Here you go: this streams the file straight from the remote host into S3, without staging it on the local disk:

scp USER@REMOTE_IP:/FILE_PATH >(aws s3 cp - s3://BUCKET/SAVE_FILE_AS_THIS_NAME)
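
This relies on bash process substitution, and aws s3 cp - reads the object from stdin. A roughly equivalent sketch that pipes over ssh instead (same placeholder names as above):

    # For multi-GB streams, consider adding --expected-size <bytes>
    # so the CLI can size its multipart chunks appropriately.
    ssh USER@REMOTE_IP 'cat /FILE_PATH' | aws s3 cp - s3://BUCKET/SAVE_FILE_AS_THIS_NAME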



Answer 7:


For our AWS backups we use a combination of duplicity and trickle: duplicity for rsync-style incremental backups with encryption, and trickle to limit the upload speed.
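
A hedged sketch of how the two can be combined (the bandwidth cap and bucket URL are placeholders, and the exact S3 URL form depends on your duplicity version and backend):

    # trickle -s runs the command standalone; -u caps upload at ~1000 KB/s.
    trickle -s -u 1000 duplicity /backups s3://s3.amazonaws.com/my-backup-bucket/backups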



Source: https://stackoverflow.com/questions/7328849/how-to-scp-to-amazon-s3
