s3cmd

Can I move an object into a 'folder' inside an S3 bucket using the s3cmd mv command?

余生长醉 submitted on 2019-12-22 04:38:16
Question: I have the s3cmd command-line tool installed on Linux. It works fine for putting files into a bucket. However, I want to move a file into a 'folder'. I know that folders aren't natively supported by S3, but my Cyberduck GUI tool displays them nicely when I browse my backups. For instance, I have a file at the root of the bucket, called 'test.mov', that I want to move into the 'idea' folder. I am trying this:

    s3cmd mv s3://mybucket/test.mov s3://mybucket/idea/test.mov

but I get strange errors like:
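
A hedged note on the likely fix (assuming a current s3cmd release): s3cmd mv accepts a destination prefix, and a trailing slash keeps the original filename under the new 'folder'. A minimal sketch:

    # Move the object under the 'idea/' prefix, keeping its filename:
    s3cmd mv s3://mybucket/test.mov s3://mybucket/idea/
    # Equivalent explicit form, naming the full destination key:
    s3cmd mv s3://mybucket/test.mov s3://mybucket/idea/test.mov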

Error uploading small files to S3 using s3cmd?

爷,独闯天下 submitted on 2019-12-20 04:50:55
Question: I am running into an unusual error: my files appear to be too small to be uploaded to S3! I have a small log file which is not uploading:

    s3cmd put log.txt s3://MY-BUCKET/MY-SUB-BUCKET/
    ERROR: S3 error: Access Denied

But when I do this, it works for some reason:

    yes | head -n 10000000 >> log.txt
    s3cmd put log.txt s3://MY-BUCKET/MY-SUB-BUCKET/

The magic number appears to be 15 MB, the point at which s3cmd starts doing multipart uploads.

Answer 1: I was running into this same issue and apparently the
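
A hedged way to narrow this down: since files above the 15 MB multipart threshold succeed, the single-request PUT is probably what the policy denies, not the file size. s3cmd's debug flag shows the exact request S3 rejects:

    # --debug prints the raw request/response; look for which operation
    # returns the 403 (a plain PUT vs. the multipart-upload calls).
    s3cmd put --debug log.txt s3://MY-BUCKET/MY-SUB-BUCKET/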

Issues with s4cmd

时光怂恿深爱的人放手 submitted on 2019-12-12 05:05:11
Question: I have about 50 GB of data to upload to an S3 bucket, but s3cmd is unreliable and very slow; sync doesn't seem to work because of timeout errors. I switched to s4cmd, which works great: multi-threaded and fast.

    s4cmd dsync -r -t 1000 --ignore-empty-source forms/ s3://bucket/J/M/

The above uploads a set of files and then throws an error:

    [Thread Failure] Unable to read data from source: /home/ubuntu/path to file

The source file is an image file, so there is nothing wrong there. s4cmd has options
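
A hedged workaround, assuming -t in the command above sets the thread count and that your s4cmd build has a --retry option: a thousand threads can exhaust file descriptors and surface as spurious source-read failures, so fewer threads plus explicit retries may get past them:

    # Fewer threads, explicit retries for transient read errors:
    s4cmd dsync -r -t 32 --retry 5 --ignore-empty-source forms/ s3://bucket/J/M/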

How do you pipe the result of s3cmd get to a var?

左心房为你撑大大i submitted on 2019-12-08 06:45:18
Question: I need to get an image from S3, process it on an EC2 instance, and save it to a different folder on S3. I would like not to save the file to the EC2 instance during the process. Is it possible to pipe the result of "s3cmd get s3://bucket/image" to a variable? It does not seem to print to standard output.

Answer 1: It would be better if you could show what output s3cmd produces in your case. But if you really want to save the output of the command to a variable, you do it this way:

    $ a=$(s3cmd get s3://bucket/image)

When you
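
A hedged sketch, assuming an s3cmd version that accepts '-' as the local destination (meaning stdout): the object can then be piped straight into the processing step, which avoids both the temp file and the fragility of holding binary data in a shell variable:

    # Stream the object to stdout and pipe it onward:
    s3cmd get s3://bucket/image - | md5sum
    # Capturing into a variable works for text, but shell variables
    # drop NUL bytes, so prefer piping for binary image data:
    a=$(s3cmd get s3://bucket/image -)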

Automatically sync two Amazon S3 buckets, besides s3cmd?

こ雲淡風輕ζ submitted on 2019-12-04 12:14:06
Question: Is there another automated way of syncing two Amazon S3 buckets besides using s3cmd? Maybe Amazon offers this as an option? The environment is Linux, and every day I would like to sync new and deleted files to another bucket. I hate the thought of keeping all my eggs in one basket.

Answer 1: You could use the standard Amazon CLI to do the sync. You just have to do something like:

    aws s3 sync s3://bucket1/folder1 s3://bucket2/folder2

http://aws.amazon.com/cli/

Answer 2: I'm looking for something similar and
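
One detail worth adding, since the question asks about deletions too: aws s3 sync only copies new and changed objects by default, and needs --delete to remove objects that are gone from the source. A hedged daily cron sketch (schedule and paths are assumptions):

    # crontab entry: sync at 03:00 daily, propagating deletions as well
    0 3 * * * aws s3 sync s3://bucket1/folder1 s3://bucket2/folder2 --delete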

Exclude multiple folders using AWS S3 sync

谁说我不能喝 submitted on 2019-12-03 14:25:51
Question: How do I exclude multiple folders while using aws s3 sync? I tried:

    # aws s3 sync s3://inksedge-app-file-storage-bucket-prod-env s3://inksedge-app-file-storage-bucket-test-env --exclude 'reportTemplate/* orders/* customers/*'

But it still syncs the "customers" folder. Output:

    copy: s3://inksedge-app-file-storage-bucket-prod-env/customers/116/miniimages/IMG_4800.jpg to s3://inksedge-app-file-storage-bucket-test-env/customers/116/miniimages/IMG_4800.jpg
    copy: s3://inksedge-app-file-storage
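
The likely culprit: --exclude takes one pattern per flag, so a single quoted string with spaces is treated as one pattern that matches nothing. A sketch with the flag repeated per folder:

    aws s3 sync s3://inksedge-app-file-storage-bucket-prod-env \
                s3://inksedge-app-file-storage-bucket-test-env \
                --exclude 'reportTemplate/*' \
                --exclude 'orders/*' \
                --exclude 'customers/*'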

Granting read access to the Authenticated Users group for a file

老子叫甜甜 submitted on 2019-12-03 12:19:15
Question: How do I grant read access to the Authenticated Users group for a file? I'm using s3cmd and would like to do it while uploading, but for now I'm focusing directly on changing the ACL. What should I put in for http://acs.amazonaws.com/groups/global/AuthenticatedUsers? I have tried every combination of AuthenticatedUsers possible.

    ./s3cmd setacl --acl-grant=read:http://acs.amazonaws.com/groups/global/AuthenticatedUsers s3://BUCKET/FILE
    ./s3cmd setacl --acl-grant=read:AuthenticatedUsers s3://BUCKET/FILE

Answer: This doesn't seem to be possible with s3cmd. Instead I had to switch to the AWS CLI tools. Here are the
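
The answer is cut off above; a hedged reconstruction of the AWS CLI equivalent (BUCKET and FILE are placeholders) grants the AuthenticatedUsers group read access via put-object-acl:

    aws s3api put-object-acl \
        --bucket BUCKET \
        --key FILE \
        --grant-read uri=http://acs.amazonaws.com/groups/global/AuthenticatedUsers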

Uploading files to s3 using s3cmd in parallel

*爱你&永不变心* submitted on 2019-12-03 09:32:03
Question: I've got a whole heap of files on a server, and I want to upload them to S3. The files are stored with a .data extension, but really they're just a bunch of JPEGs, PNGs, ZIPs, or PDFs. I've already written a short script which finds the MIME type and uploads them to S3, and it works, but it's slow. Is there any way to make the script below run using GNU parallel?

    #!/bin/bash
    for n in $(find -name "*.data")
    do
      data=".data"
      extension=`file $n | cut -d ' ' -f2 | awk '{print tolower($0)}'`
      mimetype=
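
A hedged sketch of the parallelized version (the bucket name, job count, and the switch to file --mime-type instead of parsing the extension are all assumptions; exported functions also require bash as the shell): wrap the per-file work in a function, export it, and let GNU parallel fan it out:

    #!/bin/bash
    upload() {
        local f="$1"
        # Detect the real MIME type directly, e.g. image/jpeg
        local mime
        mime=$(file --mime-type -b "$f")
        s3cmd put --mime-type="$mime" "$f" "s3://mybucket/${f#./}"
    }
    export -f upload
    # -0 matches find's -print0; -j8 runs eight uploads at a time
    find . -name '*.data' -print0 | parallel -0 -j8 upload {}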