boto

how to install custom packages on amazon EMR bootstrap action in code?

我只是一个虾纸丫 submitted on 2019-12-07 05:19:34

Question: I need to install some packages and binaries on the Amazon EMR cluster via a bootstrap action, but I can't find any example that uses one. Basically, I want to install a Python package and have each Hadoop node use that package to process items in an S3 bucket. Here's a sample from boto: name='Image to grayscale using SimpleCV python package', mapper='s3n://elasticmapreduce/samples/imageGrayScale.py', reducer='aggregate', input='s3n://elasticmapreduce/samples/input', output='s3n://<my output bucket
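A minimal sketch of what an answer might look like with classic boto's EMR API, wiring a pip-install bootstrap action into the streaming job from the excerpt. The script location s3://mybucket/install-deps.sh and the output bucket are placeholders you would substitute yourself; boto imports are deferred so the sketch reads without boto installed.

```python
def pip_install_args(packages):
    """Argument list handed to a bootstrap shell script that
    pip-installs the given packages on every cluster node."""
    return ["sudo", "pip", "install"] + list(packages)

def launch_grayscale_jobflow():
    """Sketch only: run the streaming step from the question with a
    bootstrap action that installs SimpleCV first. The S3 paths marked
    'placeholder' are assumptions, not values from the question."""
    from boto.emr.connection import EmrConnection
    from boto.emr.bootstrap_action import BootstrapAction
    from boto.emr.step import StreamingStep

    conn = EmrConnection()
    install = BootstrapAction(
        name="Install SimpleCV",
        path="s3://mybucket/install-deps.sh",  # placeholder script you upload
        bootstrap_action_args=pip_install_args(["SimpleCV"]),
    )
    step = StreamingStep(
        name="Image to grayscale using SimpleCV python package",
        mapper="s3n://elasticmapreduce/samples/imageGrayScale.py",
        reducer="aggregate",
        input="s3n://elasticmapreduce/samples/input",
        output="s3n://mybucket/output",  # placeholder output bucket
    )
    return conn.run_jobflow(
        name="grayscale job",
        steps=[step],
        bootstrap_actions=[install],
    )
```

Bootstrap actions run on every node before Hadoop starts, which is what makes the package visible to each mapper.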

Running Boto on Google App Engine (GAE)

纵然是瞬间 submitted on 2019-12-07 03:30:55

Question: I'm new to Python and was hoping for help on how to 'import boto.ec2' in a GAE Python application to control Amazon EC2 instances. I'm using PyDev/Eclipse and have installed boto on my Mac, but a plain 'import boto' does not work (I get: No module named boto.ec2). I've read that boto is supported on GAE, but I haven't been able to find instructions anywhere. Thanks! Answer 1: It sounds like you haven't copied the boto code to the root of your App Engine directory. Boto works with GAE but
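The answer's point is that App Engine has no site-packages: pure-Python dependencies must be copied into the application directory itself. A small sketch of the usual vendoring pattern, assuming you copy the boto/ source tree into a lib/ folder at the app root (the folder name is just a convention):

```python
import os
import sys

def add_vendor_dir(app_root, folder="lib"):
    """Prepend <app_root>/<folder> to sys.path so packages copied
    there (e.g. the boto/ package directory) become importable.
    Call this at the top of the GAE app's entry module, before
    `import boto.ec2`."""
    lib_dir = os.path.join(app_root, folder)
    if lib_dir not in sys.path:
        sys.path.insert(0, lib_dir)
    return lib_dir
```

Installing boto on the Mac with pip only affects the local interpreter; the deployed app sees only the files uploaded with it.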

Django - Getting PIL Image save method to work with Amazon s3boto Storage

自古美人都是妖i submitted on 2019-12-07 02:53:20

Question: In order to resize images upon upload (using PIL), I'm overriding the save method of my Article model like so:

def save(self):
    super(Article, self).save()
    if self.image:
        size = (160, 160)
        image = Image.open(self.image)
        image.thumbnail(size, Image.ANTIALIAS)
        image.save(self.image.path)

This works locally, but in production I get an error: NotImplementedError: This backend doesn't support absolute paths. I tried replacing the image.save line with image.save(self.image.url), but then I get an
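The error comes from S3BotoStorage having no filesystem path, so neither `.path` nor `.url` can be handed to PIL. A common fix is to render the thumbnail into an in-memory buffer and push it back through the storage API, which works for local disk and S3 alike. This is a sketch, not the questioner's code; the helper mirrors the size `Image.thumbnail` would produce:

```python
from io import BytesIO

def thumbnail_dims(width, height, bound=160):
    """Size Image.thumbnail((bound, bound)) shrinks to: fits within
    bound x bound, keeps aspect ratio, never upscales."""
    scale = min(bound / float(width), bound / float(height), 1.0)
    return (int(width * scale), int(height * scale))

def resize_in_storage(image_field, bound=160):
    """Resize a Django ImageField's content via the storage backend
    instead of a filesystem path. Imports are deferred so the sketch
    reads without Django/PIL installed."""
    from django.core.files.base import ContentFile
    from PIL import Image

    img = Image.open(image_field)
    img.thumbnail((bound, bound), Image.ANTIALIAS)
    buf = BytesIO()
    img.save(buf, format=img.format or "JPEG")
    # Re-save through the backend; save=False avoids re-saving the model here.
    image_field.save(image_field.name, ContentFile(buf.getvalue()), save=False)
```

Inside the model's overridden save(), you would call `resize_in_storage(self.image)` and then save the model once more.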

Django, Heroku, boto: direct file upload to Google cloud storage

帅比萌擦擦* submitted on 2019-12-07 01:43:20

Question: In Django projects deployed on Heroku, I used to upload files to Google Cloud Storage via boto. Recently, however, I have had to upload large files, which cause Heroku timeouts. I am following Heroku's documentation on direct file upload to S3 and customizing it as follows. Python:

conn = boto.connect_gs(gs_access_key_id=GS_ACCESS_KEY,
                       gs_secret_access_key=GS_SECRET_KEY)
presignedUrl = conn.generate_url(expires_in=3600, method='PUT',
                                 bucket=<bucketName>, key=<fileName>,
                                 force_http=True)

JS:
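One thing worth checking when the browser-side PUT fails is whether the presigned URL has already expired before the upload completes. A small helper to read the expiry out of a boto-style presigned URL (`Expires` is the query parameter `generate_url` emits; the sample URL in the test is made up):

```python
try:
    from urllib.parse import urlparse, parse_qs  # Python 3
except ImportError:
    from urlparse import urlparse, parse_qs      # Python 2, boto's era

def presigned_expires_at(url):
    """Return the Unix timestamp at which a presigned URL stops
    working, parsed from its Expires query parameter."""
    qs = parse_qs(urlparse(url).query)
    return int(qs["Expires"][0])
```

If large uploads routinely outlive the window, raising `expires_in` past 3600 is simpler than regenerating URLs mid-upload.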

Check if python script is running on an aws instance

偶尔善良 submitted on 2019-12-07 01:40:38

Question: I'm trying to set up a Python logger that sends error emails when it logs an error, but only if the instance has a certain tag set. I quickly ran into the problem of local dev machines that aren't on AWS. Is there an easy and fast way to check whether the script is running on AWS? I was loading the instance data with:

import boto.utils
from boto.ec2.connection import EC2Connection

metadata = boto.utils.get_instance_metadata()
conn = EC2Connection()
instance = conn.get_only_instances(instance_ids=metadata[
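A quick way to answer "am I on AWS?" is to probe the EC2 instance metadata service with a short timeout: the link-local address only answers on EC2, so off AWS the call fails fast instead of hanging the way a default `get_instance_metadata()` call can (boto's version also accepts `timeout` and `num_retries` arguments for the same purpose). A stdlib-only sketch:

```python
import urllib.request  # stdlib; the question's boto code is Python 2 era

EC2_METADATA_URL = "http://169.254.169.254/latest/meta-data/instance-id"

def running_on_ec2(timeout=0.25):
    """True iff the EC2 instance metadata service answers quickly.
    Off AWS, the request errors or times out and we return False."""
    try:
        urllib.request.urlopen(EC2_METADATA_URL, timeout=timeout)
        return True
    except Exception:
        return False
```

Guarding the logger setup with this check lets dev machines skip the tag lookup entirely.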

How to (properly) use external credentials in an AWS Lambda function?

两盒软妹~` submitted on 2019-12-07 00:23:06

Question: I have an (extremely basic but perfectly working) AWS Lambda function written in Python that has embedded credentials to connect to: 1) an external web service, and 2) a DynamoDB table. What the function does is fairly basic: it POSTs a login to a service (with credentials #1) and then saves part of the response status into a DynamoDB table (with AWS credentials #2). These are the relevant parts of the function:

h = httplib2.Http()
auth = base64.encodestring('myuser' + ':' +
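The usual split is: the DynamoDB credentials should not exist at all (grant the table permissions to the Lambda execution role and create the client with no explicit keys), while the external-service login moves out of the code into configuration, e.g. Lambda environment variables, optionally KMS-encrypted. A sketch, with illustrative variable names:

```python
import os

def external_service_credentials():
    """Read the web-service login from environment variables instead of
    hard-coding it. SERVICE_USER / SERVICE_PASS are illustrative names,
    set in the function's configuration, not in source control."""
    return (os.environ["SERVICE_USER"], os.environ["SERVICE_PASS"])
```

With the execution role carrying the DynamoDB permissions, `boto3.resource('dynamodb')` inside the function needs no keys at all.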

Django collectstatic boto broken pipe on large file upload

ぐ巨炮叔叔 submitted on 2019-12-06 22:01:07

Question: I am trying to upload my static files to an S3 bucket with collectstatic, but I'm getting a broken pipe error on a 700 KB JavaScript file. This is the error:

Copying '/Users/wedonia/work/asociados/server/asociados/apps/panel/static/panel/js/js.min.js'
Traceback (most recent call last):
  File "manage.py", line 10, in <module>
    execute_from_command_line(sys.argv)
  File "/Users/wedonia/work/asociados/server/envs/asociados/lib/python2.7/site-packages/django/core/management/__init__.py", line 399, in
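Broken pipes on S3 uploads have several reported causes (proxy limits, stale connections), so this is not claimed as the fix for this traceback; but one general mitigation for files that fail in a single long PUT is to upload in parts with classic boto's multipart API. A sketch, where `bucket` is an already-opened boto Bucket:

```python
import math
import os

CHUNK_BYTES = 5 * 1024 * 1024  # S3's minimum multipart part size

def part_count(total_bytes, chunk=CHUNK_BYTES):
    """Number of parts a multipart upload of total_bytes needs."""
    return max(1, int(math.ceil(total_bytes / float(chunk))))

def upload_large(bucket, key_name, path):
    """Upload one file in 5 MB parts; each part is retried/aborted
    independently instead of one long, fragile PUT. boto usage is a
    sketch, not verified against this question's setup."""
    mp = bucket.initiate_multipart_upload(key_name)
    try:
        with open(path, "rb") as fp:
            for part in range(1, part_count(os.path.getsize(path)) + 1):
                mp.upload_part_from_file(fp, part_num=part, size=CHUNK_BYTES)
        mp.complete_upload()
    except Exception:
        mp.cancel_upload()
        raise
```

Note a 700 KB file fits in a single part anyway, which is why connection-level causes are worth ruling out first here.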

Amazon EC2 AutoScaling CPUUtilization Alarm- INSUFFICIENT DATA

拜拜、爱过 submitted on 2019-12-06 20:58:06

Question: So I've been using boto in Python to try to configure autoscaling based on CPUUtilization, more or less exactly as specified in this example: http://boto.readthedocs.org/en/latest/autoscale_tut.html However, both alarms in CloudWatch just report: State Details: State changed to 'INSUFFICIENT_DATA' at 2012/11/12 16:30 UTC. Reason: Unchecked: Initial alarm creation. Autoscaling is working fine, but the alarms aren't picking up any CPUUtilization data at all. Any ideas for things I can try? Edit:
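INSUFFICIENT_DATA usually means the alarm's dimensions match no emitted metric, or the alarm period is finer than the instances' monitoring (basic monitoring reports every 5 minutes, so a 60-second alarm period never fills). A hedged sketch of an alarm that avoids both pitfalls with classic boto; names and thresholds are illustrative:

```python
def alarm_dimensions(group_name):
    """CPUUtilization for an autoscaling group is matched on the
    AutoScalingGroupName dimension, not on individual InstanceIds."""
    return {"AutoScalingGroupName": group_name}

def create_scale_up_alarm(group_name, policy_arn):
    """Sketch with classic boto (deferred import). period=300 matches
    basic 5-minute monitoring; a 60 s period with basic monitoring is
    a classic source of INSUFFICIENT_DATA."""
    from boto.ec2.cloudwatch import CloudWatchConnection, MetricAlarm

    alarm = MetricAlarm(
        name="scale-up-on-cpu",
        namespace="AWS/EC2",
        metric="CPUUtilization",
        statistic="Average",
        comparison=">",
        threshold=70,
        period=300,
        evaluation_periods=2,
        alarm_actions=[policy_arn],
        dimensions=alarm_dimensions(group_name),
    )
    CloudWatchConnection().create_alarm(alarm)
```

Enabling detailed monitoring on the launch configuration is the alternative if a 60-second period is actually wanted.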

Unable to install boto in python3

微笑、不失礼 submitted on 2019-12-06 19:54:47

Question: I am trying to install boto from source / PyPI, but I am unable to install it with Python 3.2. Why is it failing?

c:\boto>..\Python32\python.exe setup.py install
Traceback (most recent call last):
  File "setup.py", line 35, in <module>
    from boto import __version__
  File "c:\boto\boto\__init__.py", line 26, in <module>
    from boto.pyami.config import Config, BotoConfigLocations
  File "c:\boto\boto\pyami\config.py", line 185
    print s.getvalue()
SyntaxError: invalid syntax

Answer 1: print s
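The failing line is the giveaway: `print s.getvalue()` is Python 2 print-statement syntax, which Python 3 rejects at parse time, so classic boto of that vintage simply does not support Python 3.2; the usual advice is to install under Python 2 or use boto3 on Python 3. A trivial guard, phrased as a sketch:

```python
import sys

def classic_boto_supported():
    """Classic boto (the library failing above) targets Python 2;
    its print statements cannot even be parsed by Python 3."""
    return sys.version_info[0] == 2
```

On Python 3, `pip install boto3` is the supported path.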

PyCharm intellisense for boto3

十年热恋 submitted on 2019-12-06 17:17:34

Question: I'm having problems seeing full IntelliSense (code completion) options in PyCharm, working with Python 3.4 on Windows. The suggestions are partially working:

import boto3
s = boto3.Session()      # typing "boto3." brings up the list of methods/params of the boto3 object
ec2 = s.resource('ec2')  # resource is a suggested method!
ec2.                     # <<<< this brings up nothing

For some reason PyCharm can't detect what the ec2 object would have. While I can work off documentation alone, IntelliSense is just such a nice feature to have! I've
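The likely cause: boto3 builds resources like `ec2` dynamically at runtime from JSON service models, so there is nothing for PyCharm's static analysis to inspect. Third-party stub packages (e.g. boto3-stubs, if available for your setup) can restore completion; failing that, runtime introspection in the console is a workable substitute, sketched here:

```python
def public_methods(obj):
    """List an object's public callable attributes -- a runtime stand-in
    for the completion popup when static analysis sees nothing."""
    return sorted(
        name for name in dir(obj)
        if not name.startswith("_") and callable(getattr(obj, name))
    )
```

Running `public_methods(ec2)` in a PyCharm console shows what the dynamically generated resource actually offers.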