amazon-s3

Retrieve/list objects using metadata in S3 - AWS SDK

Submitted by 耗尽温柔 on 2020-01-23 10:03:42
Question: I have used user-defined metadata to store a file in an S3 bucket. Let's say my metadata looks like this:

metaData = { "title": "some random user title", "description": "some random user description" }

I understand that I can download the file using the object key and the bucket name. I am looking for any way/option to get/retrieve/list the file by passing only the bucket name and the user-defined metadata that was used when uploading the object to S3. And also to know the actual usage of user…
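S3 itself cannot answer this query server-side: user-defined metadata is not indexed, so the usual approach is to list the bucket and issue a HEAD request per key, filtering client-side (or to maintain a separate index, e.g. in DynamoDB). A minimal boto3 sketch of the list-and-filter approach; the bucket name and metadata values below are placeholders, not taken from the question:

```python
def matches(metadata, wanted):
    """True if every wanted key/value pair appears in the object's metadata."""
    return all(metadata.get(k) == v for k, v in wanted.items())

def find_by_metadata(bucket, wanted):
    """List keys in `bucket` whose user-defined metadata contains `wanted`.

    Note: this costs one HEAD request per object - S3 offers no
    server-side query over user-defined metadata.
    """
    import boto3  # imported here so the pure helper above needs no AWS setup
    s3 = boto3.client("s3")
    keys = []
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            head = s3.head_object(Bucket=bucket, Key=obj["Key"])
            # boto3 surfaces the x-amz-meta-* headers, lower-cased,
            # under the "Metadata" key of the HEAD response
            if matches(head["Metadata"], wanted):
                keys.append(obj["Key"])
    return keys

# Hypothetical usage:
# find_by_metadata("my-bucket", {"title": "some random user title"})
```

For anything beyond small buckets, keeping a queryable index (key → metadata) alongside S3 avoids the per-object HEAD cost.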

Amazon AWS S3 SDK for iOS drops connection (Error -1005)

Submitted by 爱⌒轻易说出口 on 2020-01-23 09:29:08
Question: When running the AWSiOSDemoTVM project, the async S3 demo code will start uploading data, but the connection is dropped after a couple of seconds. AWSiOSDemoTVM: didFailWithError : Error Domain=NSURLErrorDomain Code=-1005 "The network connection was lost." UserInfo=0xb54e850 {NSErrorFailingURLStringKey=https://BUCKETNAME.s3.amazonaws.com/asyncDemoKey, NSErrorFailingURLKey=https://BUCKETNAME.s3.amazonaws.com/asyncDemoKey, NSLocalizedDescription=The network connection was lost.,…

Sync two buckets through boto3

Submitted by 家住魔仙堡 on 2020-01-23 03:27:06
Question: Is there any way to use boto3 to loop over the contents of two different buckets (source and target) and, if it finds any key in the source that does not exist in the target, upload it to the target bucket? Please note that I do not want to use aws s3 sync. I am currently using the following code for this job:

import boto3
s3 = boto3.resource('s3')
src = s3.Bucket('sourcenabcap')
dst = s3.Bucket('destinationnabcap')
objs = list(dst.objects.all())
for k in src.objects.all():
    if (k.key != objs[0…
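One way to sketch this with boto3, assuming both buckets are readable with the same credentials: collect the destination keys into a set (rather than comparing each source key against a single list element, as the snippet above does), then server-side-copy whatever is missing. The function and bucket names are placeholders:

```python
def keys_to_copy(src_keys, dst_keys):
    """Keys present in the source listing but absent from the destination."""
    return sorted(set(src_keys) - set(dst_keys))

def sync_missing(src_name, dst_name):
    """Copy every key that exists in `src_name` but not in `dst_name`."""
    import boto3  # imported here so keys_to_copy stays testable offline
    s3 = boto3.resource("s3")
    src = s3.Bucket(src_name)
    dst = s3.Bucket(dst_name)
    dst_keys = {obj.key for obj in dst.objects.all()}
    src_keys = [obj.key for obj in src.objects.all()]
    for key in keys_to_copy(src_keys, dst_keys):
        # server-side copy: the object data never passes through this machine
        dst.copy({"Bucket": src_name, "Key": key}, key)
```

The set gives O(1) membership checks against the full destination listing; note this only detects missing keys, not objects whose content differs (comparing ETags would be a further step).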

Troubles setting up Paperclip + AWS S3 for image storing in our Rails3/Heroku App

Submitted by 断了今生、忘了曾经 on 2020-01-23 00:59:05
Question: We have already built a Rails app that has several users and an image for each of them. Doing all of the dev work on localhost, we have working seeds for users & photos... but now that we are trying to use S3 for image storage, we are running into errors, always during the "seed" step of the migrations, when doing this: rake db:migrate:reset. Apologies for the question, but we have been banging our heads on this for 11 hours, having gone through every related Stack question…

How To Upload Images to Amazon S3 Using Perl?

Submitted by …衆ロ難τιáo~ on 2020-01-23 00:35:47
Question: I'm trying to upload files to S3 using Perl. According to this module: http://metacpan.org/pod/Amazon::S3::Bucket ...the following code will upload text files:

# create resource with meta data (attributes)
my $keyname = 'testing.txt';
my $value = 'T';
$bucket->add_key(
    $keyname, $value,
    {
        content_type        => 'text/plain',
        'x-amz-meta-colour' => 'orange',
    }
);

However, how do you upload images (GIF, JPEG, PNG) to S3? Thanks, Linda

Answer 1: That code won't upload the file - it's simply setting the…
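The key point in the answer is that $value above is the literal object content, not a filename (the Perl module documents an add_key_filename method for uploading from a file). For comparison, the same idea sketched in Python with boto3: read the file from disk and set an image Content-Type. The bucket, key, and path names are placeholders:

```python
import mimetypes

def guess_content_type(path):
    """Guess a Content-Type from the file extension, with a binary fallback."""
    ctype, _encoding = mimetypes.guess_type(path)
    return ctype or "application/octet-stream"

def upload_image(bucket, key, path):
    """Upload a local image file to S3 with a matching Content-Type."""
    import boto3  # imported here so guess_content_type is testable offline
    s3 = boto3.client("s3")
    s3.upload_file(
        path, bucket, key,
        ExtraArgs={"ContentType": guess_content_type(path)},
    )

# Hypothetical usage:
# upload_image("my-bucket", "photos/cat.png", "/tmp/cat.png")
```

Setting the correct Content-Type matters for images: browsers use it to decide whether to render the object or download it.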

AWS S3 and Django returns “An error occurred (AccessDenied) when calling the PutObject operation”

Submitted by 纵然是瞬间 on 2020-01-22 21:35:09
Question: I am trying to set up media and static file storage in an AWS S3 bucket for a Django app, and I am getting the following error when I try to run python manage.py collectstatic to put the static files into the bucket: botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied. I am running boto3 and django-storages. I have trawled through the other answers on here and tried the ideas there first. My access key etc. is correct, as I can…
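AccessDenied on PutObject usually means the credentials django-storages is using lack s3:PutObject on the bucket's objects (a restrictive bucket policy or Block Public Access settings can also deny the write). A minimal IAM policy sketch for this kind of setup; the bucket name is a placeholder, not taken from the question:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListTheBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::YOUR_BUCKET"
    },
    {
      "Sid": "ReadWriteObjects",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::YOUR_BUCKET/*"
    }
  ]
}
```

Note the two ARN forms: s3:ListBucket applies to the bucket ARN itself, while object actions like s3:PutObject need the /* object ARN; mixing these up is a common cause of exactly this error.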

Heroku: Using external mount in local filesystem

Submitted by 烂漫一生 on 2020-01-22 21:21:52
Question: I know it's possible to mount an Amazon S3 bucket using FUSE (s3fs [or the s3fsr Ruby gem?]). My case is specific to Heroku. Heroku's filesystem is read-only for scalability and such, but is there a way to mount an Amazon S3 bucket in Heroku's filesystem? In my case, I use Redmine on Heroku and would like to use Redmine's built-in Git repository management to link code reviews to my issues. Redmine needs to clone the repository to a local directory, which is possible but not persistent on Heroku. I would…

Auto create S3 Buckets on localstack

Submitted by 拟墨画扇 on 2020-01-22 17:28:09
Question: I'm using localstack in my docker-compose setup, mainly to mimic S3. I know I can create buckets; that's not the issue. What I would like to do is automatically create the buckets when I run docker-compose up. Is there something built in already for localstack?

Answer 1: A change that came in with this commit, available since version 0.10.0: when a container is started for the first time, it will execute files with the extension .sh that are found in /docker-entrypoint-initaws.d. Files will be executed in alphabetical…
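A sketch of how such an init hook can be wired up, based on the mechanism the answer describes. The bucket names and host directory are placeholders; awslocal is the AWS CLI wrapper that ships inside the localstack image, pre-configured to point at the local endpoint:

```shell
#!/bin/sh
# Hypothetical file: ./localstack-init/01-create-buckets.sh
#
# Mount the host directory into the container, e.g. in docker-compose.yml:
#
#   localstack:
#     image: localstack/localstack
#     volumes:
#       - ./localstack-init:/docker-entrypoint-initaws.d
#
# On first start, localstack runs every *.sh in that directory in
# alphabetical order, so the buckets exist before the app connects.
awslocal s3 mb s3://my-first-bucket
awslocal s3 mb s3://my-second-bucket
```

Numbering the scripts (01-, 02-, ...) makes the alphabetical execution order explicit when more than one init file is needed.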