boto

Image file cut off when uploading to AWS S3 bucket via Django and Boto3

那年仲夏 submitted on 2019-12-12 22:55:35
Question: When I upload a larger image (3+ MB) to an AWS S3 bucket, only part of the image is saved to the bucket (about the top 10% of the image, with the rest displaying as grey space). These images consistently show a size of 256 KB. There isn't any issue with smaller files. Here's my code:

    s3 = boto3.resource('s3')
    s3.Bucket(settings.AWS_MEDIA_BUCKET_NAME).put_object(Key=fname, Body=data)

...where data is the binary data of the image file. No issues when files are smaller, and in the S3 bucket the …
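A likely culprit for uploads that truncate like this is passing a stream that has already been partially consumed, so only the remaining bytes get written. A minimal sketch of a safer upload, assuming the data comes from a Django file object (`upload_image` and `fileobj` are illustrative names, not from the question):

```python
import boto3
from django.conf import settings

def upload_image(fname, fileobj):
    # If something (e.g. an image validator or a previous read) already
    # consumed part of the stream, uploading from the current position
    # truncates the file. Rewind and read the complete body first.
    fileobj.seek(0)
    data = fileobj.read()
    s3 = boto3.resource('s3')
    s3.Bucket(settings.AWS_MEDIA_BUCKET_NAME).put_object(Key=fname, Body=data)
```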

Boto Credential Error with Python on Windows

☆樱花仙子☆ submitted on 2019-12-12 15:16:27
Question: I have been working on trying to sign in to Boto via Python for the last few hours and can't seem to solve the problem. Python keeps returning the error:

    No handler was ready to authenticate. 1 handlers were checked. ['HmacAuthV1Handler'] Check your Credentials

According to the logger (boto.set_stream_logger('boto')), the problem is: "[DEBUG]: Retrieving credentials from metadata server." This must mean my credentials file cannot be found, and while I am not sure exactly where to place my …
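Falling back to the metadata server means boto (version 2) found no credentials in environment variables or its config files. A sketch of the two usual fixes; the key values are placeholders:

```python
import boto

# Option 1: pass the credentials explicitly instead of relying on discovery.
conn = boto.connect_ec2(
    aws_access_key_id='AKIA...',
    aws_secret_access_key='...',
)

# Option 2: create a config file where boto looks for one. On Windows
# that is %USERPROFILE%\.boto (or the path in the BOTO_CONFIG
# environment variable), containing:
#
#   [Credentials]
#   aws_access_key_id = AKIA...
#   aws_secret_access_key = ...
```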

Boto EC2 and elastic IPs

[亡魂溺海] submitted on 2019-12-12 14:44:06
Question: Is it possible to associate an elastic IP address with an EC2 instance using Python boto? I'm trying to automate a deploy. I searched the API documentation in the EC2 section and found nothing.

Answer 1: Don't know what documentation you were looking at, but it's in there: http://boto.readthedocs.org/en/latest/ref/ec2.html#boto.ec2.address.Address.associate

    associate(instance_id=None, network_interface_id=None, private_ip_address=None, allow_reassociation=False, dry_run=False)

Associate this …
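For context, a minimal sketch of both ways to do the association in boto 2; the region, instance id, and address are placeholders:

```python
import boto.ec2

conn = boto.ec2.connect_to_region('us-east-1')

# Connection-level call:
conn.associate_address(instance_id='i-0123456789abcdef0',
                       public_ip='203.0.113.10')

# Or via the Address object the docs above describe:
addr = conn.get_all_addresses(addresses=['203.0.113.10'])[0]
addr.associate(instance_id='i-0123456789abcdef0')
```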

With the boto library, can I avoid granting list permissions on a base bucket in S3?

可紊 submitted on 2019-12-12 12:28:15
Question: I currently have an IAM role with a policy like so:

    {
      "Version": "2008-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Action": ["s3:ListBucket"],
          "Resource": ["arn:aws:s3:::blah.example.com"]
        },
        {
          "Effect": "Allow",
          "Action": ["s3:GetObject", "s3:GetObjectAcl", "s3:ListBucket",
                     "s3:PutObject", "s3:PutObjectAcl", "s3:DeleteObject"],
          "Resource": ["arn:aws:s3:::blah.example.com/prefix/"]
        }
      ]
    }

Boto seems to require that the ListBucket permission be present on the root of the bucket to do the get …
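In boto 2 the listing call usually comes from bucket validation inside get_bucket(), which can be skipped. A sketch, with a hypothetical key name, assuming the object-level permissions above are already in place:

```python
import boto

conn = boto.connect_s3()

# get_bucket() normally issues a listing request to verify the bucket
# exists, which needs s3:ListBucket on the bucket root. validate=False
# skips that request, so only object-level permissions are exercised.
bucket = conn.get_bucket('blah.example.com', validate=False)
key = bucket.get_key('prefix/some-object')
data = key.get_contents_as_string()
```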

Boto3: Wait for S3 streaming upload to complete

北城余情 submitted on 2019-12-12 12:02:28
Question: I'm using S3.Client.upload_fileobj() with a BytesIO stream as input to upload a file to S3 from a stream. My function should not return before the upload is finished, so I need a way to wait for it. From the documentation there is no obvious way to wait for the transfer to finish, but there are some hints of what could work: Use the callback arg to wait until progress is at 100%. In JavaScript this would be trivial using callbacks or promises, but in Python I'm not so sure. Use an S3.Waiter object …
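As far as I know, upload_fileobj() is a managed transfer that blocks until the upload completes (its concurrency is internal), so no extra synchronization should be needed. A sketch with placeholder bucket and key names:

```python
import io
import boto3

s3 = boto3.client('s3')
buf = io.BytesIO(b'example payload')

# The call only returns once the transfer has finished (or raises an
# exception on failure), even though it may upload parts in threads.
s3.upload_fileobj(buf, 'my-bucket', 'my-key')
print('upload complete')  # reached only after the upload is done
```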

Launch Openstack Instances using python-boto

旧城冷巷雨未停 submitted on 2019-12-12 10:44:46
Question: I am trying to launch instances on an OpenStack setup with multiple networks configured, using python-boto. But I got the following error:

    EC2ResponseError: EC2ResponseError: 400 Bad Request
    <?xml version="1.0"?>
    <Response><Errors><Error><Code>NetworkAmbiguous</Code><Message>Multiple possible networks found, use a Network ID to be more specific.</Message></Error></Errors><RequestID>req-28b5a4e8-3838-4111-95db-337c5048716d</RequestID></Response>

My code is like here:

    from boto import ec2
    ostack = ec2 …
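The error says the request must name which network to use. Through boto's EC2 interface, run_instances() accepts a subnet_id, which should disambiguate the network; whether OpenStack's EC2 compatibility layer honors it may depend on the deployment. A sketch with placeholder endpoint, credentials, and ids:

```python
from boto.ec2.connection import EC2Connection
from boto.regioninfo import RegionInfo

conn = EC2Connection(
    aws_access_key_id='...',
    aws_secret_access_key='...',
    is_secure=False,
    region=RegionInfo(name='RegionOne', endpoint='openstack.example.com'),
    port=8773,
    path='/services/Cloud',
)

# Passing subnet_id tells the API exactly which network to attach the
# instance to, avoiding the NetworkAmbiguous error.
reservation = conn.run_instances(
    'ami-00000001',
    instance_type='m1.small',
    subnet_id='subnet-00000001',
)
```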

Release a message back to SQS

半腔热情 submitted on 2019-12-12 10:29:13
Question: I have some EC2 servers pulling work off of an SQS queue. Occasionally, they encounter a situation where they can't finish the job. I have the process email me about the condition. As it stands now, the message stays "in flight" until it times out. I would like the process to immediately release the message back to the queue after the email is sent, but I'm not sure how to accomplish this. Is there a way? If so, can you please point me to the call or post a code snippet? I'm using Python 2.7.3 and …
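Setting the message's visibility timeout to 0 makes it immediately visible to other consumers again, which releases it back to the queue. A sketch in boto 2; the queue name and the do_work/send_failure_email helpers are hypothetical:

```python
import boto.sqs

conn = boto.sqs.connect_to_region('us-east-1')
queue = conn.get_queue('work-queue')

for msg in queue.get_messages(num_messages=1):
    try:
        do_work(msg)
    except Exception:
        send_failure_email(msg)
        # A visibility timeout of 0 returns the message to the queue
        # immediately instead of waiting for the timeout to expire.
        msg.change_visibility(0)
    else:
        queue.delete_message(msg)
```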

Migrating from Amazon S3 to Azure Storage (Django web app)

两盒软妹~` submitted on 2019-12-12 09:06:09
Question: I maintain this Django web app where users congregate and chat with one another. They can post pictures too if they want. I process these photos (i.e. optimize their size) and store them in an Amazon S3 bucket (like a 'container' in Azure Storage). To do that, I set up the bucket on Amazon and included the following configuration code in my settings.py:

    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    AWS_S3_FORCE_HTTP_URL = True
    AWS_QUERYSTRING_AUTH = False
    AWS_SECRET_ACCESS …
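Since django-storages also ships an Azure Storage backend, the code side of the migration can be mostly a settings swap (existing objects would still need to be copied across separately). A sketch of the equivalent Azure configuration; the setting names follow django-storages, and the account name, key, and container are placeholders:

```python
# settings.py -- replacing the S3 backend with the Azure Storage backend
DEFAULT_FILE_STORAGE = 'storages.backends.azure_storage.AzureStorage'

AZURE_ACCOUNT_NAME = 'mystorageaccount'  # storage account name
AZURE_ACCOUNT_KEY = '...'                # storage account access key
AZURE_CONTAINER = 'media'                # container replacing the S3 bucket
```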

How to count files inside zip in AWS S3 without downloading it?

拟墨画扇 submitted on 2019-12-12 08:59:39
Question: Case: there is a large zip file in an S3 bucket which contains a large number of images. Is there a way, without downloading the whole file, to read the metadata or something to know how many files are inside the zip? When the file is local, in Python I can just open it with zipfile() and call the namelist() method, which returns a list of all the files inside, and I can count that. However, I'm not sure how to do this when the file resides in S3 without having to download it. Also, if …
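One approach is a ranged GET for just the end of the archive: a standard (non-zip64) zip ends with an End Of Central Directory record whose bytes 10-11 hold the total entry count, and that record sits within the last 22 bytes plus an optional comment of up to 65,535 bytes. A sketch along those lines; count_zip_entries is an illustrative helper:

```python
import boto3

def count_zip_entries(bucket, key):
    """Count zip entries by fetching only the archive's tail from S3."""
    s3 = boto3.client('s3')
    size = s3.head_object(Bucket=bucket, Key=key)['ContentLength']

    # The EOCD record is at most 22 + 65535 bytes from the end of the file.
    tail_len = min(size, 22 + 65535)
    tail = s3.get_object(
        Bucket=bucket, Key=key,
        Range='bytes={}-{}'.format(size - tail_len, size - 1),
    )['Body'].read()

    eocd = tail.rfind(b'\x50\x4b\x05\x06')  # EOCD signature
    if eocd == -1:
        raise ValueError('EOCD not found (zip64 or corrupt archive?)')
    # Offsets 10-11 of the EOCD: total number of central directory entries.
    return int.from_bytes(tail[eocd + 10:eocd + 12], 'little')
```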

How to create an S3 bucket using Boto3?

前提是你 submitted on 2019-12-12 08:31:54
Question: I want to enable CloudTrail logs for my account and so need to create an S3 bucket. I wanted to automate this task using Boto3. Currently I am using the following script:

    sess = Session(aws_access_key_id=tmp_access_key,
                   aws_secret_access_key=tmp_secret_key,
                   aws_session_token=security_token)
    s3_conn_boto3 = sess.client(service_name='s3', region_name=region)
    bucket = s3_conn_boto3.create_bucket(Bucket=access_log_bucket_name,
                                         CreateBucketConfiguration={'LocationConstraint': 'us-east-1'},
                                         ACL= …
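One thing worth noting: S3 rejects 'us-east-1' as an explicit LocationConstraint; for that region the CreateBucketConfiguration must be omitted entirely. A sketch of a region-aware helper (create_log_bucket is an illustrative name):

```python
import boto3

def create_log_bucket(sess, bucket_name, region):
    """Create an S3 bucket, handling the us-east-1 special case."""
    s3 = sess.client('s3', region_name=region)
    if region == 'us-east-1':
        # us-east-1 is the default; passing it as a LocationConstraint
        # makes CreateBucket fail, so omit the configuration block.
        return s3.create_bucket(Bucket=bucket_name)
    return s3.create_bucket(
        Bucket=bucket_name,
        CreateBucketConfiguration={'LocationConstraint': region},
    )
```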