boto

How to change metadata on an object in Amazon S3

回眸只為那壹抹淺笑 · submitted on 2019-12-18 10:43:06

Question: If you have already uploaded an object to an Amazon S3 bucket, how do you change its metadata using the API? It is possible to do this in the AWS Management Console, but it is not clear how it could be done programmatically. Specifically, I'm using the boto API in Python, and from reading the source it is clear that key.set_metadata only works before the object is created, as it just affects a local dictionary.

Answer 1: It appears you need to overwrite the object with itself, using a "PUT …

Why are no Amazon S3 authentication handlers ready?

我是研究僧i · submitted on 2019-12-17 23:42:18

Question: I have my $AWS_ACCESS_KEY_ID and $AWS_SECRET_ACCESS_KEY environment variables set properly, and I run this code: import boto; conn = boto.connect_s3() and get this error: boto.exception.NoAuthHandlerFound: No handler was ready to authenticate. 1 handlers were checked. ['HmacAuthV1Handler'] What's happening? I don't know where to start debugging. It seems boto isn't picking up the values from my environment variables. If I pass the key ID and secret key as arguments to the connection constructor …
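The usual cause here is that the variables are not actually visible to the Python process (exported in a different shell, set without export, or set for a different user). A small hypothetical helper to verify this before blaming boto:

```python
import os

def missing_aws_env():
    """Return the names of any unset (or empty) AWS credential variables.

    boto reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from os.environ,
    so if this returns a non-empty list, NoAuthHandlerFound is expected:
    the shell export never reached this process.
    """
    names = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")
    return [n for n in names if not os.environ.get(n)]
```

If the list is empty and the error persists, passing the keys explicitly to connect_s3, as the asker suggests, is a reasonable workaround.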

Unable to connect to an AWS S3 bucket using boto

纵饮孤独 · submitted on 2019-12-17 23:13:52

Question: AWS_ACCESS_KEY_ID = '<access key>' AWS_SECRET_ACCESS_KEY = '<my secret key>' Bucketname = 'Bucket-name' import boto from boto.s3.key import Key import boto.s3.connection conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, host='s3.ap-southeast-1.amazonaws.com', is_secure=True, # uncomment if you are not using ssl calling_format=boto.s3.connection.OrdinaryCallingFormat(), ) bucket = conn.get_bucket(Bucketname) Error: Traceback (most recent call last): File "uploads3.py", line 69 …

Can I use boto3 anonymously?

匆匆过客 · submitted on 2019-12-17 18:52:27

Question: With boto I could connect to public S3 buckets without credentials by passing the anon= keyword argument: s3 = boto.connect_s3(anon=True) Is this possible with boto3? Answer 1: Yes. Your credentials are used to sign all the requests you send out, so what you have to do is configure the client not to perform the signing step at all. You can do that as follows: import boto3 from botocore import UNSIGNED from botocore.client import Config s3 = boto3.client('s3', config=Config(signature_version …

Boto: execute a shell command on an EC2 instance

[亡魂溺海] · submitted on 2019-12-17 17:58:53

Question: I am a newbie to EC2 and boto. I have a running EC2 instance, and I want to execute a shell command such as apt-get update through boto. I searched a lot and found a solution using user_data in the run_instances command, but what if the instance is already launched? I don't even know if it is possible. Any pointer would be a great help. Answer 1: The boto.manage.cmdshell module can be used to do this. To use it, you must have the paramiko package installed. A simple example of its …

Downloading files from S3 recursively using boto in Python

社会主义新天地 · submitted on 2019-12-17 16:04:34

Question: I have a bucket in S3 with a deep directory structure, and I wish I could download it all at once. My files look like this: foo/bar/1 … foo/bar/100 … Is there any way to download these files recursively from the S3 bucket using the boto library in Python? Thanks in advance. Answer 1: You can download all files in a bucket like this (untested): from boto.s3.connection import S3Connection conn = S3Connection('your-access-key', 'your-secret-key') bucket = conn.get_bucket('bucket') for key in bucket …

boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden

本秂侑毒 · submitted on 2019-12-17 07:09:15

Question: I'm trying to get Django to upload static files to S3, but instead I'm getting a 403 Forbidden error, and I'm not sure why. Full stacktrace: Traceback (most recent call last): File "manage.py", line 14, in <module> execute_manager(settings) File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/__init__.py", line 438, in execute_manager utility.execute() File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core …

How to save S3 object to a file using boto3

女生的网名这么多〃 · submitted on 2019-12-17 05:38:07

Question: I'm trying to do a "hello world" with the new boto3 client for AWS. My use case is fairly simple: get an object from S3 and save it to a file. In boto 2.x I would do it like this: import boto key = boto.connect_s3().get_bucket('foo').get_key('foo') key.get_contents_to_filename('/tmp/foo') In boto3, I can't find a clean way to do the same thing, so I'm manually iterating over the "Streaming" object: import boto3 key = boto3.resource('s3').Object('fooo', 'docker/my-image.tar.gz').get() …

How to upload a file to directory in S3 bucket using boto

情到浓时终转凉″ · submitted on 2019-12-17 04:44:32

Question: I want to copy a file into an S3 bucket using Python. For example: I have a bucket named "test", and in the bucket I have two folders, "dump" and "input". Now I want to copy a file from a local directory to the S3 "dump" folder using Python. Can anyone help me? Answer 1: Try this... import boto import boto.s3 import sys from boto.s3.key import Key AWS_ACCESS_KEY_ID = '' AWS_SECRET_ACCESS_KEY = '' bucket_name = AWS_ACCESS_KEY_ID.lower() + '-dump' conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY) bucket …