boto

How can I check that an AWS S3 bucket exists?

别等时光非礼了梦想. 提交于 2019-12-05 12:14:34
Simple question here: how can I check with boto that an AWS S3 bucket exists, preferably by providing the path? Here is the approach I feel like taking:

    def bucket_exists(self, bucket_name):
        connection = boto.s3.connection.S3Connection('<aws access key>', '<aws secret key>')
        buckets = connection.get_all_buckets()
        for bucket in buckets:
            bucket_name = bucket.name
            # Bucket existence logic here
            # submit boto request, i.e.:
            exists = boto.get_bucket(bucket_name, validate=True)
            if exists:
                return True
            else:
                return False

In the code above I am interested to find if a bucket exists amongst the
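For reference, a minimal sketch of one common way to do this with boto 2, assuming credentials are configured in the environment or ~/.boto; S3Connection.lookup returns None for a missing bucket, and get_bucket raises S3ResponseError when validate=True and the bucket does not exist:

    import boto
    from boto.exception import S3ResponseError

    def bucket_exists(bucket_name):
        # lookup() returns the Bucket object, or None if it does not exist
        conn = boto.connect_s3()
        return conn.lookup(bucket_name) is not None

    # Equivalent approach using get_bucket, which raises on a missing bucket
    def bucket_exists_strict(conn, bucket_name):
        try:
            conn.get_bucket(bucket_name, validate=True)
            return True
        except S3ResponseError:
            return False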

How to install custom packages with an Amazon EMR bootstrap action in code?

Submitted by 做~自己de王妃 on 2019-12-05 08:33:55
I need to install some packages and binaries with an Amazon EMR bootstrap action, but I can't find any example that does this in code. Basically, I want to install a Python package and have each Hadoop node use this package for processing the items in an S3 bucket. Here's a sample from boto:

    name='Image to grayscale using SimpleCV python package',
    mapper='s3n://elasticmapreduce/samples/imageGrayScale.py',
    reducer='aggregate',
    input='s3n://elasticmapreduce/samples/input',
    output='s3n://<my output bucket>/output'

I need to make it use the SimpleCV python package, but not sure where to specify this. What
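A minimal sketch of how a bootstrap action can be attached to a job flow with boto 2; the script path, bucket names, region and step name are placeholders, and the assumption is that a shell script in S3 (e.g. one that runs "sudo pip install SimpleCV") is used as the bootstrap action:

    import boto.emr
    from boto.emr.bootstrap_action import BootstrapAction
    from boto.emr.step import StreamingStep

    conn = boto.emr.connect_to_region('us-east-1')

    # Bootstrap action: runs on every node before Hadoop starts
    install_simplecv = BootstrapAction(
        name='Install SimpleCV',
        path='s3://<my bucket>/bootstrap/install_simplecv.sh',
        bootstrap_action_args=[])

    step = StreamingStep(
        name='Image to grayscale using SimpleCV python package',
        mapper='s3n://elasticmapreduce/samples/imageGrayScale.py',
        reducer='aggregate',
        input='s3n://elasticmapreduce/samples/input',
        output='s3n://<my output bucket>/output')

    jobflow_id = conn.run_jobflow(
        name='grayscale job',
        log_uri='s3://<my bucket>/logs',
        bootstrap_actions=[install_simplecv],
        steps=[step])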

Running Boto on Google App Engine (GAE)

Submitted by [亡魂溺海] on 2019-12-05 07:18:53
I'm new to Python and was hoping for help on how to 'import boto.ec2' in a GAE Python application to control Amazon EC2 instances. I'm using PyDev/Eclipse and have installed boto on my Mac, but simply using 'import boto' does not work (I get: No module named boto.ec2). I've read that boto is supported on GAE but I haven't been able to find instructions anywhere. Thanks! It sounds like you haven't copied the boto code to the root of your app engine directory. Boto works with GAE but Google doesn't supply you with the code. Once you copy it into the root of your GAE directory, the dev server
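As a rough sketch of what the answer describes, assuming the boto package directory has been copied (vendored) into the application root alongside app.yaml; the region and credentials below are placeholders:

    # Project layout (boto vendored into the app root):
    #   myapp/
    #     app.yaml
    #     main.py
    #     boto/          <- copied from the boto distribution
    #       __init__.py
    #       ec2/
    #       ...

    import boto.ec2

    def list_instances():
        conn = boto.ec2.connect_to_region(
            'us-east-1',
            aws_access_key_id='<aws access key>',
            aws_secret_access_key='<aws secret key>')
        # get_all_instances() returns reservations, each holding instances
        return [i for r in conn.get_all_instances() for i in r.instances]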

How can I add a tag to a key in boto (Amazon S3)?

Submitted by 一世执手 on 2019-12-05 06:15:11
I am trying to tag a key that I've uploaded to S3. In the sample below I just create a file from a string. Once I have the key, I'm not sure how to tag the file. I've tried Tag as well as TagSet.

    from boto.s3.bucket import Bucket
    from boto.s3.key import Key
    from boto.s3.tagging import Tag, TagSet

    k = Key(bucket)
    k.key = 'foobar/somefilename'
    k.set_contents_from_string('some data in file')
    Tag(k, 'the_tag')

As far as I can see in the docs, a set_tags method is only available at the bucket level and not on individual keys. Therefore you cannot set different tags on your uploaded file, but you would
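For illustration, a minimal sketch of the bucket-level tagging the answer refers to, assuming boto 2's Tags/TagSet classes and the bucket's set_tags method; the tag names and values here are placeholders, and the per-key metadata variant is an alternative assumption, not the asker's code:

    from boto.s3.tagging import Tags, TagSet

    # Tags apply to the whole bucket in boto 2, not to individual keys
    tag_set = TagSet()
    tag_set.add_tag('the_tag', 'some-value')

    tags = Tags()
    tags.add_tag_set(tag_set)
    bucket.set_tags(tags)

    # Per-key alternative: store the information as object metadata instead
    k = bucket.new_key('foobar/somefilename')
    k.set_metadata('the_tag', 'some-value')
    k.set_contents_from_string('some data in file')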

Django, Heroku, boto: direct file upload to Google cloud storage

Submitted by ∥☆過路亽.° on 2019-12-05 04:41:18
In Django projects deployed on Heroku, I used to upload files to Google Cloud Storage via boto. However, recently I have had to upload large files, which cause Heroku timeouts. I am following Heroku's documentation about direct file upload to S3, and customizing it as follows:

Python:

    conn = boto.connect_gs(gs_access_key_id=GS_ACCESS_KEY,
                           gs_secret_access_key=GS_SECRET_KEY)
    presignedUrl = conn.generate_url(expires_in=3600, method='PUT',
                                     bucket=<bucketName>, key=<fileName>,
                                     force_http=True)

JS:

    url = 'https://<bucketName>.storage.googleapis.com/<fileName>?Signature=...&Expires=1471451569
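A sketch of the signing side of this setup, assuming boto's GS connection and placeholder bucket/key names; the key assumption shown is that any headers the browser sends with the PUT (e.g. Content-Type) also need to be included when the URL is signed:

    import boto

    conn = boto.connect_gs(gs_access_key_id=GS_ACCESS_KEY,
                           gs_secret_access_key=GS_SECRET_KEY)

    # If the browser will send a Content-Type header with the PUT,
    # include the same header here so it is covered by the signature
    presigned_url = conn.generate_url(
        expires_in=3600,
        method='PUT',
        bucket='my-bucket',
        key='uploads/myfile.png',
        headers={'Content-Type': 'image/png'})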

How to (properly) use external credentials in an AWS Lambda function?

Submitted by ℡╲_俬逩灬. on 2019-12-05 04:37:24
I have an (extremely basic but perfectly working) AWS Lambda function written in Python that, however, has embedded credentials to connect to: 1) an external web service, 2) a DynamoDB table. What the function does is fairly basic: it POSTs a login against a service (with credentials #1) and then saves part of the response status into a DynamoDB table (with AWS credentials #2). These are the relevant parts of the function:

    h = httplib2.Http()
    auth = base64.encodestring('myuser' + ':' + 'mysecretpassword')
    (response, content) = h.request('https://vca.vmware.com/api/iam/login', 'POST', headers = {
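One common pattern, as a sketch rather than the asker's eventual solution: let the Lambda execution role grant DynamoDB access so no AWS keys appear in the code, and read the external-service credentials from environment variables. The table name, environment variable names and item shape below are placeholders:

    import os
    import base64
    import httplib2  # assumed to be bundled with the function, as in the question
    import boto3     # available in the Lambda runtime

    def lambda_handler(event, context):
        # External-service credentials come from the function's environment,
        # not from the source code
        user = os.environ['SERVICE_USER']
        password = os.environ['SERVICE_PASSWORD']

        h = httplib2.Http()
        auth = base64.b64encode('%s:%s' % (user, password))
        response, content = h.request(
            'https://vca.vmware.com/api/iam/login', 'POST',
            headers={'Authorization': 'Basic ' + auth})

        # DynamoDB access uses the Lambda execution role; no embedded keys
        table = boto3.resource('dynamodb').Table('my-status-table')
        table.put_item(Item={'id': 'login', 'status': response.status})
        return response.status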

Pre-signed URLs and x-amz-acl

Submitted by 蹲街弑〆低调 on 2019-12-05 03:56:50
I want to create a so-called "pre-signed" URL for uploading (PUT) a particular object to an Amazon S3 bucket. So far so good. I am using the Python library boto to create a URL that contains all the necessary parts (expires, signature and so on). The URL looks like this: https://<bucketname>.s3.amazonaws.com/<key>?Signature=<sig>&Expires=<expires>&AWSAccessKeyId=<my key id>&x-amz-acl=public-read Note the last parameter. This, at least as I understand it, limits whoever uses this URL to uploading an object to a particular key in a particular bucket, and also limits the canned ACL that will be set on
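For reference, a minimal sketch of how such a URL can be produced with boto 2, assuming the x-amz-acl header is passed to generate_url so it becomes part of the signature; the bucket, key and credentials are placeholders:

    import boto

    conn = boto.connect_s3('<aws access key>', '<aws secret key>')

    # Headers included here must match what the uploader sends with the PUT
    url = conn.generate_url(
        expires_in=3600,
        method='PUT',
        bucket='<bucketname>',
        key='<key>',
        headers={'x-amz-acl': 'public-read'})

    # The client then uploads with, e.g.:
    #   curl -X PUT -H "x-amz-acl: public-read" --data-binary @file "<url>"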

Django collectstatic boto broken pipe on large file upload

Submitted by 好久不见. on 2019-12-05 02:50:45
I am trying to upload the static files to my S3 bucket with collectstatic, but I'm getting a broken pipe error with a 700k JavaScript file. This is the error:

    Copying '/Users/wedonia/work/asociados/server/asociados/apps/panel/static/panel/js/js.min.js'
    Traceback (most recent call last):
      File "manage.py", line 10, in <module>
        execute_from_command_line(sys.argv)
      File "/Users/wedonia/work/asociados/server/envs/asociados/lib/python2.7/site-packages/django/core/management/__init__.py", line 399, in execute_from_command_line
        utility.execute()
      File "/Users/wedonia/work/asociados/server/envs/asociados
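For context, a sketch of the kind of django-storages configuration typically behind a collectstatic upload to S3 via boto; the bucket name, keys and host value are placeholders and assumptions, not taken from the question:

    # settings.py (sketch)
    INSTALLED_APPS += ['storages']

    STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    AWS_ACCESS_KEY_ID = '<aws access key>'
    AWS_SECRET_ACCESS_KEY = '<aws secret key>'
    AWS_STORAGE_BUCKET_NAME = '<my static bucket>'

    # Pointing boto at the bucket's regional endpoint is sometimes suggested
    # for connection resets on larger uploads (value here is an assumption)
    AWS_S3_HOST = 's3-eu-west-1.amazonaws.com'

    STATIC_URL = 'https://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME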

How do I query a DynamoDB2 table by a global secondary index only, using boto 2.25.0?

Submitted by 血红的双手。 on 2019-12-05 02:44:31
This is a continuation of my quest to switch from regular DynamoDB tables to DynamoDB2 ones with global secondary indexes. So I created my table as shown here and then added the following two elements:

    table.put_item(data={'firstKey': 'key01', 'message': '{"firstKey":"key01", "comments": "mess 1 w/o secondKey"}'})
    table.put_item(data={'firstKey': 'key02', 'secondKey': 'skey01', 'message': '{"firstKey":"key02", "parentId":"skey01", "comments": "mess 2 w/ secondKey"}'})

What I want to do now
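A sketch of what a query against the secondary index might look like with boto's dynamodb2 layer, assuming the table was created with a global index named 'SecondKeyIndex' on the secondKey attribute (the table and index names are assumptions); filter kwargs use the <field>__<operator> form:

    from boto.dynamodb2.table import Table

    table = Table('my_table')

    # Query the GSI by passing its name via index=
    results = table.query(secondKey__eq='skey01', index='SecondKeyIndex')
    for item in results:
        print item['message']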

"Unable to read instance data, giving up" error in Python boto

Submitted by 女生的网名这么多〃 on 2019-12-05 02:28:17
I am trying to access Amazon S3 using the boto library to read the Common Crawl data available in the Amazon 'aws-publicdatasets' bucket. I created an access config file in ~/.boto:

    [Credentials]
    aws_access_key_id = "my key"
    aws_secret_access_key = "my_secret"

While creating the connection with Amazon S3 I see the error below in the logs:

    2014-01-23 16:28:16,318 boto [DEBUG]:Retrieving credentials from metadata server.
    2014-01-23 16:28:17,321 boto [ERROR]:Caught exception reading instance data
    Traceback (most recent call last):
      File "/usr/lib/python2.6/site-packages/boto-2.13.3-py2.6.egg/boto/utils.py", line 211, in retry
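The DEBUG line shows boto falling back to the EC2 instance metadata server, which usually means it did not find usable credentials in the config or environment; one thing worth checking is that boto config values are written without quotes. A sketch of a config and connection that avoids the fallback (the explicit-keys variant and the Common Crawl prefix are assumptions, not taken from the question):

    # ~/.boto (values unquoted)
    # [Credentials]
    # aws_access_key_id = AKIA...
    # aws_secret_access_key = ...

    import boto

    # Either rely on ~/.boto ...
    conn = boto.connect_s3()

    # ... or pass the keys explicitly so no metadata-server lookup is needed
    conn = boto.connect_s3(aws_access_key_id='<my key>',
                           aws_secret_access_key='<my secret>')

    bucket = conn.get_bucket('aws-publicdatasets')
    for key in bucket.list(prefix='common-crawl/'):
        print key.name
        break  # just show the first key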