I am using an AWS EC2 instance. On this instance I'm generating some files; these operations are done by the user data script. Now I want to store those files on S3 by writing code.
I'm using s3cmd to store nightly exported database backup files from my EC2 instance. After configuring s3cmd, which you can read about at their site, you can run a command like:
s3cmd put ./myfile s3://mybucket
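To make that concrete, a minimal sketch of a nightly backup script along those lines might look like this. The database name, dump command, and bucket are placeholders; the actual dump and upload lines are commented out so you can substitute your own:

```shell
#!/bin/sh
# Nightly backup sketch; "mydb" and "mybucket" are placeholder names.
BACKUP_FILE="db-backup-$(date +%Y-%m-%d).sql.gz"

# Replace with your database's export command, e.g.:
# mysqldump mydb | gzip > "$BACKUP_FILE"

# Upload the dated file with s3cmd:
# s3cmd put "$BACKUP_FILE" "s3://mybucket/backups/$BACKUP_FILE"
echo "$BACKUP_FILE"
```

Run it from cron (e.g. `0 2 * * * /path/to/backup.sh`) and each night's file gets a unique, dated key in the bucket.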
I think the best answer in general is in fact to use the aws command described below, but for the cases where you don't want to bother installing anything else, it's also worth mentioning that you can just download the file over HTTPS, e.g. open a browser and navigate to:
https://s3.amazonaws.com/(bucketName)/(relativePath)/(fileName)
That also means you could just use wget or curl to do the transfer from a shell prompt.
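As a sketch of that approach, the following builds the object URL from the three parts above and shows the equivalent download commands. The bucket, path, and file names are placeholders, and this only works if the object is publicly readable:

```shell
#!/bin/sh
# Placeholder values; substitute your own bucket, path, and file name.
BUCKET="mybucket"
REL_PATH="exports"
FILE_NAME="myfile.txt"

# Assemble the URL in the (bucketName)/(relativePath)/(fileName) form:
URL="https://s3.amazonaws.com/$BUCKET/$REL_PATH/$FILE_NAME"
echo "$URL"

# Then fetch it with either tool:
# wget "$URL"
# curl -O "$URL"
```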
Using the most recent AWS CLI (http://aws.amazon.com/cli/), you can use the following command to copy files from your EC2 instance, or even your local machine, to S3 storage:
aws s3 cp myfolder s3://mybucket/myfolder --recursive
You'll then get something like:
upload: myfolder/file1.txt to s3://mybucket/myfolder/file1.txt
upload: myfolder/subfolder/file1.txt to s3://mybucket/myfolder/subfolder/file1.txt
If this is your first use of the aws CLI tool, then you'll need to run:
aws configure
This will prompt you for your access key and secret, along with a default region.
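For reference, `aws configure` stores those values on disk; the credentials end up in `~/.aws/credentials` and the region in `~/.aws/config`, in roughly this shape (values here are placeholders):

```
# ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

# ~/.aws/config
[default]
region = us-east-1
```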