boto

How to batch_get_item many items at once given a list of primary partition key values

[亡魂溺海] submitted on 2019-12-10 17:23:11
Question: I have a DynamoDB table with a primary partition key column, foo_id, and no primary sort key. I have a list of foo_id values, and I want to get the observations associated with this list of ids. I figured the best way to do this is to use batch_get_item(), but it's not working out for me.

    # python code
    import boto3
    client = boto3.client('dynamodb')
    # ppk_values = list of `foo_id` values (strings) (< 100 in this example)
    x = client.batch_get_item(
        RequestItems={
            'my_table_name': {
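For reference, a minimal sketch of how the RequestItems/Keys structure for the low-level batch_get_item client call is usually shaped; the table name 'my_table_name' and the foo_id attribute are taken from the question, and the sample id values are placeholders:

    import boto3

    client = boto3.client('dynamodb')

    # Assumed: ppk_values is the list of foo_id strings (at most 100 per call).
    ppk_values = ['id-1', 'id-2', 'id-3']

    response = client.batch_get_item(
        RequestItems={
            'my_table_name': {
                'Keys': [{'foo_id': {'S': v}} for v in ppk_values]
            }
        }
    )

    items = response['Responses']['my_table_name']
    # Keys DynamoDB could not process in this call come back here and
    # should be retried in a follow-up request.
    unprocessed = response.get('UnprocessedKeys', {})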

How do I translate an AWS S3 url into a bucket name for boto?

不打扰是莪最后的温柔 submitted on 2019-12-10 17:11:33
Question: I'm trying to access the http://s3.amazonaws.com/commoncrawl/parse-output/segment/ bucket with boto. I can't figure out how to translate this into a name for boto.s3.bucket.Bucket(). This is the gist of what I'm going for:

    s3 = boto.connect_s3()
    cc = boto.s3.bucket.Bucket(connection=s3, name='commoncrawl/parse-output/segment')
    requester = {'x-amz-request-payer': 'requester'}
    contents = cc.list(headers=requester)
    for i, item in enumerate(contents):
        print item.__repr__()

I get "boto.exception
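For context, in a URL of the form s3.amazonaws.com/commoncrawl/parse-output/segment/, only the first path component is the bucket name; the rest is a key prefix. A sketch under that assumption, reusing the requester-pays header from the question:

    import boto

    s3 = boto.connect_s3()
    requester = {'x-amz-request-payer': 'requester'}

    # The bucket is just 'commoncrawl'; everything after it is a key prefix.
    cc = s3.get_bucket('commoncrawl', headers=requester)
    for i, item in enumerate(cc.list(prefix='parse-output/segment/', headers=requester)):
        print(item)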

Verify if a topic exists based on topic name

折月煮酒 submitted on 2019-12-10 15:30:14
Question: I'm trying to verify whether a topic exists based on its name. Do you know if this is possible? For example, I want to verify whether a topic named "test" already exists. Below is what I'm trying, but it doesn't work because topicsList contains topic ARNs, not topic names.

    topics = sns.get_all_topics()
    topicsList = topics['ListTopicsResponse']['ListTopicsResult']['Topics']
    if "test" in topicsList:
        print("true")

Answer 1: This is kind of a hack but it should work:

    topics = sns.get_all_topics()
    topic_list =
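A sketch of the ARN-suffix check the answer is heading toward, assuming boto 2's SNS connection; the region is an assumption, and the topic name "test" comes from the question. The topic name is the last colon-separated component of each ARN:

    import boto.sns

    sns = boto.sns.connect_to_region('us-east-1')  # region is an assumption
    topics = sns.get_all_topics()
    topic_list = topics['ListTopicsResponse']['ListTopicsResult']['Topics']

    # Each entry is a dict with a 'TopicArn' such as
    # arn:aws:sns:us-east-1:123456789012:test
    exists = any(t['TopicArn'].split(':')[-1] == 'test' for t in topic_list)
    print(exists)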

boto3: how to create an object with metadata?

风格不统一 submitted on 2019-12-10 14:55:58
Question: In the example below I want to set a timestamp metadata attribute when creating an S3 object. How do I do that? The documentation is not clear.

    import uuid
    import json
    import glob
    import boto3
    import botocore
    import time
    from boto3.session import Session

    session = Session(aws_access_key_id='XXX', aws_secret_access_key='XXX')
    s3 = session.resource('s3')
    bucket = s3.Bucket('blah')
    for filename in glob.glob('json/*.json'):
        with open(filename, 'rb') as f:
            data = f.read().decode('utf-8')
            timestamp = str(round
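A minimal sketch of attaching user metadata at upload time with boto3; the bucket name 'blah' is from the question, the key and body are placeholders, and S3 exposes these entries as x-amz-meta-* headers on the object:

    import time
    import boto3

    s3 = boto3.resource('s3')
    bucket = s3.Bucket('blah')

    timestamp = str(round(time.time()))
    bucket.put_object(
        Key='json/example.json',          # placeholder key
        Body=b'{"hello": "world"}',       # placeholder body
        Metadata={'timestamp': timestamp} # stored as x-amz-meta-timestamp
    )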

Boto “get byte range” returns more than expected

纵饮孤独 submitted on 2019-12-10 14:53:12
Question: This is my first question here, as I'm fairly new to this world! I've spent a few days trying to figure this out for myself, but haven't so far been able to find any useful info. I'm trying to retrieve a byte range from a file stored in S3, using something like:

    S3Key.get_contents_to_file(tempfile, headers={'Range': 'bytes=0-100000'})

The file that I'm trying to restore from is a video file, specifically an MXF. When I request a byte range, I get back more info in the tempfile than requested.
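For reference, a byte-range read with boto 2 is usually shaped like the sketch below; the bucket and key names are placeholders, and the range is the one from the question (note the range is inclusive, so bytes=0-100000 asks for 100001 bytes):

    import boto

    conn = boto.connect_s3()
    bucket = conn.get_bucket('my-bucket')   # placeholder bucket
    key = bucket.get_key('video.mxf')       # placeholder key

    with open('/tmp/chunk', 'wb') as tempfile:
        # Request only the first 100001 bytes of the object.
        key.get_contents_to_file(tempfile, headers={'Range': 'bytes=0-100000'})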

django-storages using boto - cannot upload mp3, but can upload an image. Also, suffering HTTP 307 pain

僤鯓⒐⒋嵵緔 submitted on 2019-12-10 13:28:39
Question: I am using the boto (2.2.1) backend for django-storages (1.1.4) to upload files to an S3 bucket. It works fine for images, but when I try to upload movie files (a small .mov, a small .avi) or an mp3, I get a broken pipe error. This is weird. Digging into the Django traceback, I see the following exception: boto.https_connection.InvalidCertificateException. Which kind of fits the experience I've been having using Cyberduck to inspect the bucket directly: sometimes it complains that I'm getting a

Connect to a bucket having an uppercase letter

混江龙づ霸主 submitted on 2019-12-10 13:13:05
Question: I am not able to connect to a bucket if the bucket name has an uppercase letter. I have several buckets that have capital letters in them.

    >>> mybucket = conn.get_bucket('Vig_import')
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/usr/lib/python2.6/site-packages/boto/s3/connection.py", line 391, in get_bucket
        bucket.get_all_keys(headers, maxkeys=0)
      File "/usr/lib/python2.6/site-packages/boto/s3/bucket.py", line 360, in get_all_keys
        '', headers, **params)
      File "/usr
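A hedged sketch of the usual workaround: uppercase bucket names are not valid DNS labels, so boto's default subdomain-style addressing fails for them, and path-style addressing via OrdinaryCallingFormat typically works instead. The bucket name is the one from the question; whether this resolves the asker's exact error is an assumption:

    import boto
    from boto.s3.connection import OrdinaryCallingFormat

    # Use path-style requests (s3.amazonaws.com/Vig_import/...) rather than
    # bucket-subdomain requests, which break for non-DNS-safe bucket names.
    conn = boto.connect_s3(calling_format=OrdinaryCallingFormat())
    mybucket = conn.get_bucket('Vig_import')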

Getting a list of unique hash key values from dynamodb using boto

扶醉桌前 submitted on 2019-12-10 13:05:01
Question: I want to get a list of unique hash key values for a DynamoDB table. The only way I know to do it currently is to scan the entire table and then iterate over the scan. Is there a better way?

Answer 1:

    rs = list(table.scan(range__eq="rangevalue"))
    for i in rs:
        print i['primarykey']

should do the trick. I'd love to hear cheaper ways to do the same thing.

Source: https://stackoverflow.com/questions/25438715/getting-a-list-of-unique-hash-key-values-from-dynamodb-using-boto
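A sketch of the scan-and-collect approach, using boto 2's dynamodb2 Table and a plain scan with no range-key filter; the table name is a placeholder and 'primarykey' stands in for the table's hash key attribute, as in the answer:

    from boto.dynamodb2.table import Table

    table = Table('my_table_name')  # hypothetical table name

    # A scan returns every item, so collect hash key values into a set
    # to keep only the unique ones.
    unique_keys = set()
    for item in table.scan():
        unique_keys.add(item['primarykey'])
    print(unique_keys)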

Boto SES - send_raw_email() to multiple recipients

我是研究僧i submitted on 2019-12-09 17:15:49
Question: I'm having big-time problems with this issue. Another question on SO that didn't solve it is here: Send Raw Email (with attachment) to Multiple Recipients. My code (that works) is simple:

    def send_amazon_email_with_attachment(html, subject, now, pre):
        dummy = 'test@example.com'
        recipients = ['test1@example.com', 'test2@example.com', 'test3@example.com']
        connS3 = S3Connection('IDENTIFICATION', 'PASSWORD')
        b = connS3.get_bucket('BUCKET_NAME')
        key = b.get_key('FILE_NAME.pdf')
        temp = key.get
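For reference, a hedged sketch of how boto's SES send_raw_email is typically called for several recipients: the To header gets the comma-joined address list, while the destinations argument gets the Python list itself. Addresses, subject, and body are placeholders, and the attachment handling from the question is omitted:

    import boto
    from email.mime.multipart import MIMEMultipart
    from email.mime.text import MIMEText

    recipients = ['test1@example.com', 'test2@example.com', 'test3@example.com']

    msg = MIMEMultipart()
    msg['Subject'] = 'Report'
    msg['From'] = 'test@example.com'
    msg['To'] = ', '.join(recipients)          # header is a single string
    msg.attach(MIMEText('<p>hello</p>', 'html'))

    ses = boto.connect_ses()
    ses.send_raw_email(raw_message=msg.as_string(),
                       source=msg['From'],
                       destinations=recipients)  # actual delivery list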

Create and download an AWS ec2 keypair using python boto

一世执手 submitted on 2019-12-09 11:54:22
Question: I'm having difficulty figuring out a way (if possible) to create a new AWS key pair with the Python boto library and then download that key pair.

Answer 1: The KeyPair object returned by the create_key_pair method in boto has a "save" method. So, basically you can do something like this:

    >>> import boto
    >>> ec2 = boto.connect_ec2()
    >>> key = ec2.create_key_pair('mynewkey')
    >>> key.save('/path/to/keypair/dir')

If you want a more detailed example, check out https://github.com/garnaat/paws/blob/master/ec2
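If you prefer to write the file yourself rather than use save(), the private key text is available on the KeyPair object's material attribute; a sketch under that assumption (the key name matches the answer, the path is a placeholder):

    import os
    import boto

    ec2 = boto.connect_ec2()
    key = ec2.create_key_pair('mynewkey')

    # key.material holds the PEM-encoded private key; AWS only returns it
    # from the create call, not from later lookups of the key pair.
    path = os.path.expanduser('~/.ssh/mynewkey.pem')
    with open(path, 'w') as f:
        f.write(key.material)
    os.chmod(path, 0o600)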