boto

dynamodb boto put_item of type Map “M”

Submitted by 六眼飞鱼酱① on 2019-12-11 08:32:16
Question: Has anyone successfully performed a put operation of a map into DynamoDB using boto (Python)? I basically need to put a JSON object. So far I have only been able to put it as a JSON string, but I cannot find an example of inserting a map anywhere. Thanks a lot.

Answer 1: Since it does not look like boto supports JSON in its high-level API, you have to use the low-level API and annotate your JSON object into the DynamoDB-supported wire format, as such: "time": { "M": { "creation
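To make the truncated answer concrete, here is a minimal sketch of that low-level approach, assuming boto 2's dynamodb2 layer1 connection and a hypothetical table named "events"; the "M" annotation marks a Map attribute, and every scalar inside it carries its own type tag:

    import boto.dynamodb2

    conn = boto.dynamodb2.connect_to_region("us-east-1")

    item = {
        "id": {"S": "42"},                  # hash key, wire-format string
        "time": {                           # a Map attribute ("M")
            "M": {
                "creation": {"S": "2019-12-11T08:32:16Z"},
                "ttl_days": {"N": "30"},    # numbers are sent as strings
            }
        },
    }
    conn.put_item("events", item)           # table name is hypothetical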

How do I set an alarm to terminate an EC2 instance using boto?

Submitted by 谁都会走 on 2019-12-11 08:23:33
Question: I have been unable to find a simple example showing how to use boto to terminate an Amazon EC2 instance with an alarm (without using Auto Scaling). I want to terminate the specific instance whose CPU usage stays below 1% for 10 minutes. Here is what I've tried so far:

    import boto.ec2
    import boto.ec2.cloudwatch
    from boto.ec2.cloudwatch import MetricAlarm

    conn = boto.ec2.connect_to_region("us-east-1",
                                      aws_access_key_id=ACCESS_KEY,
                                      aws_secret_access_key=SECRET_KEY)
    cw = boto.ec2
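For reference, a complete version might look like the sketch below (not from the thread): boto 2's MetricAlarm accepts an alarm_actions list, and EC2 exposes a built-in terminate action ARN; the instance ID and thresholds here are assumptions.

    import boto.ec2.cloudwatch
    from boto.ec2.cloudwatch import MetricAlarm

    cw = boto.ec2.cloudwatch.connect_to_region("us-east-1")

    alarm = MetricAlarm(
        name="terminate-idle-instance",
        namespace="AWS/EC2",
        metric="CPUUtilization",
        statistic="Average",
        comparison="<",
        threshold=1.0,
        period=300,
        evaluation_periods=2,     # two 5-minute periods = 10 minutes
        dimensions={"InstanceId": ["i-12345678"]},   # hypothetical ID
        alarm_actions=["arn:aws:automate:us-east-1:ec2:terminate"],
    )
    cw.create_alarm(alarm)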

Heroku + S3 + Django: Static Files not Cached

Submitted by ☆樱花仙子☆ on 2019-12-11 07:35:43
Question: I currently have a project deployed on Heroku with static files loaded from S3. I'm using boto/django-storages to manage my S3 content, but if I call the same view or load the same page repeatedly, all the images/static content load twice and are not cached. I've placed

    AWS_HEADERS = { 'Cache-Control': 'max-age=2592000' }

in my settings.py, but could the reason be that the exact same images (refreshed and loaded twice) have different signatures in their URLs? I've tried multiple headers, but the browser
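One plausible explanation, offered as an assumption rather than the thread's confirmed answer: with django-storages' default query-string authentication, every generated URL carries a fresh expiring signature, so the browser sees a different URL each time and cannot cache it. Disabling it for public assets gives stable URLs:

    # settings.py (sketch, assuming the S3BotoStorage backend)
    AWS_QUERYSTRING_AUTH = False          # stable, cacheable URLs
    AWS_HEADERS = {
        "Cache-Control": "max-age=2592000",   # 30 days
    }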

Faster way to make S3 “folder hierarchy” than parsing of filenames?

Submitted by 烈酒焚心 on 2019-12-11 06:59:25
Question: I want to make a relatively basic tool to browse a bucket in S3 as a file hierarchy rather than simply a list of filenames with slashes in them. Currently I am using boto to get the list of key names in a bucket and then parsing the key names to build a nested dictionary of the "folders" and files. However, that process takes very long! Even just going through each key to get a list of all the higher-level folders takes 15+ minutes. How do tools such as Cyberduck produce a list of folders so quickly?
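A likely answer, sketched below assuming boto 2 and a hypothetical bucket name: S3 itself can group keys server-side when you list with delimiter="/", returning one Prefix entry per immediate "subfolder" instead of every key beneath it, which is how browsers like Cyberduck show a single level almost instantly:

    import boto
    from boto.s3.prefix import Prefix

    conn = boto.connect_s3()
    bucket = conn.get_bucket("my-bucket")    # hypothetical name

    # Lists only the top level under "photos/": files, plus one
    # Prefix object per immediate "subfolder".
    for entry in bucket.list(prefix="photos/", delimiter="/"):
        if isinstance(entry, Prefix):
            print("folder:", entry.name)
        else:
            print("file:", entry.name)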

S3ResponseError: 403 Forbidden using boto

Submitted by 你离开我真会死。 on 2019-12-11 06:29:08
Question: I have a script that copies files from one S3 account to another S3 account. It was definitely working before, but when I tried it today it no longer works; it gives me the error S3ResponseError: 403 Forbidden. I'm 100% sure the credentials are correct, and I can download keys from both accounts manually using the AWS console. Code:

    def run(self):
        while True:
            # Remove and return an item from the queue
            key_name = self.q.get()
            k = Key(self.s_bucket, key_name)
            d_key = Key(self.d_bucket, k.key)
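The excerpt doesn't show how the connections were built, so as one hypothesis (an assumption, not the thread's resolution): each bucket must be opened on a connection authenticated with its own account's credentials, roughly like this boto 2 sketch with hypothetical names:

    import boto
    from boto.s3.key import Key

    src = boto.connect_s3(SRC_ACCESS_KEY, SRC_SECRET_KEY)
    dst = boto.connect_s3(DST_ACCESS_KEY, DST_SECRET_KEY)

    s_bucket = src.get_bucket("source-bucket")   # hypothetical buckets
    d_bucket = dst.get_bucket("dest-bucket")

    for key in s_bucket.list():
        data = key.get_contents_as_string()      # read with source creds
        Key(d_bucket, key.name).set_contents_from_string(data)  # write with dest creds

A 403 can also come from clock skew on the client machine, since S3 rejects requests whose timestamp is too far off the server's.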

How to tell if my AWS account “owns” a given IP address in Python boto

Submitted by 匆匆过客 on 2019-12-11 04:24:22
Question: So I have a list of public IP addresses and I'd like to see whether they are public IPs associated with our account. I know I can simply paste each IP into the search box in the AWS EC2 console, but I would like to automate this process with a Python program. I'm told that anything you can do in the console you can do via the CLI or a program, but which function do I use to simply return a result or not, based on whether a public IP is associated with our account? I understand
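A sketch of one way to automate this with boto 2 (an assumed approach, with hypothetical IPs): collect the account's Elastic IPs and the public IPs of its instances, then test membership. Real code would repeat this for every region.

    import boto.ec2

    def owned_public_ips(region="us-east-1"):
        conn = boto.ec2.connect_to_region(region)
        # Elastic IPs allocated to the account
        ips = {addr.public_ip for addr in conn.get_all_addresses()}
        # Public IPs of the account's instances
        ips.update(i.ip_address for i in conn.get_only_instances() if i.ip_address)
        return ips

    owned = owned_public_ips()
    for ip in ["203.0.113.10", "198.51.100.7"]:   # example addresses
        print(ip, "ours" if ip in owned else "not ours")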

Query dynamoDB with non hash key field (with boto / python)

Submitted by 倖福魔咒の on 2019-12-11 04:19:10
Question: I'm using DynamoDB with boto and having a bit of a problem with the design/querying of my table. I'd like my data to look something like this:

    +----------+------------+----------+
    | hash_key | account_id | mykey    |
    +----------+------------+----------+
    | 1        | 12345      | myvalue1 |
    | 2        | 12345      | myvalue2 |
    | 3        | 12345      | myvalue3 |
    | 4        | 123456     | myvalue4 |
    +----------+------------+----------+

And then retrieve all the data for account 12345. Looking at the boto docs, I always need to have the hash_key available. I know how I would query
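The two standard options, sketched with boto 2's dynamodb2 Table API and a hypothetical table name (an assumption, since the excerpt cuts off before any answer): scan with a filter, which reads the whole table, or query a secondary index on account_id, which is the efficient route.

    from boto.dynamodb2.table import Table

    table = Table("mytable")    # hypothetical table

    # Option 1: scan with a filter -- simple, but reads every item
    for item in table.scan(account_id__eq=12345):
        print(item["mykey"])

    # Option 2: query a global secondary index on account_id
    # (the index, here "AccountIndex", must be defined on the table)
    for item in table.query_2(account_id__eq=12345, index="AccountIndex"):
        print(item["mykey"])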

Search files(key) in s3 bucket takes longer time

Submitted by 跟風遠走 on 2019-12-11 04:17:56
Question: I have 10,000 files in an S3 bucket. When I list all the files it takes 10 minutes. I want to implement a search module using boto (the Python interface to AWS) which searches files based on user input. Is there a way I can search for specific files in less time?

Answer 1: AFAIK the best you can do is filter the results based on a key prefix, using the prefix named parameter.

Answer 2: There are two ways to implement the search... Case 1: As suggested by John, you can specify the prefix of the S3 key file in
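A minimal sketch of the prefix filter from Answer 1, assuming boto 2 and a hypothetical bucket. Note that S3 can only filter server-side by key prefix; substring or suffix matches still require listing everything, which is why tools often keep an external index of key names for arbitrary searches.

    import boto

    conn = boto.connect_s3()
    bucket = conn.get_bucket("my-bucket")    # hypothetical name

    user_input = "reports/2019"
    for key in bucket.list(prefix=user_input):   # filtered server-side
        print(key.name)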

gsutil not working in GCE

Submitted by 落爺英雄遲暮 on 2019-12-11 03:57:24
Question: When I bring up a GCE instance using the standard Debian 7 image and issue a "gsutil config" command, it fails with the following message:

    jcortez@master:~$ gsutil config
    Failure: No handler was ready to authenticate. 4 handlers were checked.
    ['ComputeAuth', 'OAuth2Auth', 'OAuth2ServiceAccountAuth', 'HmacAuthV1Handler']
    Check your credentials.

I've tried it on the Debian 6 and CentOS instances and got the same results. Issuing "gcutil config" works fine, however. I gather I need to set up
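The excerpt cuts off, but the usual cause (stated here as an assumption) is that the instance was created without a Cloud Storage scope on its service account, so gsutil's ComputeAuth handler finds no usable credentials on the metadata server. With the gcutil-era CLI the question mentions, the scope is granted at instance creation, along these lines (flag spelling may vary by version):

    # hypothetical instance name; grants full Cloud Storage access
    gcutil addinstance my-instance \
        --service_account_scopes=https://www.googleapis.com/auth/devstorage.full_control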

Send Raw Email (with attachment) to Multiple Recipients

Submitted by 别来无恙 on 2019-12-11 03:57:12
Question: I am currently using Python 2.7 and trying to send a raw email with an attachment (a CSV, to be exact) to multiple addresses with boto SES. I can send a normal email with send_email(), but I keep getting an error when trying to send to multiple people via send_raw_email(). This is the error I get with a comma-separated string of recipients:

    Error sending email: SESIllegalAddressError: 400 Illegal address
    <ErrorResponse xmlns="http://ses.amazonaws.com/doc/2010-12-01/">
      <Error>
        <Type>Sender
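A sketch of the fix that usually matches this symptom (assumed, since the excerpt ends mid-error): send_raw_email's destinations parameter must be a Python list of addresses, while the comma-joined string belongs only in the MIME To: header. Addresses and filename below are hypothetical.

    import boto.ses
    from email.mime.multipart import MIMEMultipart
    from email.mime.text import MIMEText
    from email.mime.application import MIMEApplication

    recipients = ["a@example.com", "b@example.com"]

    msg = MIMEMultipart()
    msg["Subject"] = "Report"
    msg["From"] = "sender@example.com"
    msg["To"] = ", ".join(recipients)       # header: one joined string
    msg.attach(MIMEText("CSV attached."))
    with open("report.csv", "rb") as f:
        msg.attach(MIMEApplication(f.read(), Name="report.csv"))

    conn = boto.ses.connect_to_region("us-east-1")
    conn.send_raw_email(msg.as_string(),
                        source=msg["From"],
                        destinations=recipients)   # API: a list, not a string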