boto

Amazon S3 boto - how to create a folder?

梦想的初衷 submitted on 2019-12-17 03:02:08
Question: How can I create a folder under a bucket using the boto library for Amazon S3? I followed the manual and created keys with permissions, metadata, etc., but nowhere does boto's documentation describe how to create a folder under a bucket, or a folder inside another folder in a bucket. Answer 1: There is no concept of folders or directories in S3. You can create file names like "abc/xys/uvw/123.jpg", which many S3 access tools like S3Fox display as a directory structure, but it's actually just a…
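The answer's point can be sketched in a few lines of boto 2.x: a "folder" is just a zero-byte object whose key ends in "/". The helper below is illustrative; the commented boto calls and the bucket name are assumptions that require valid AWS credentials to actually run.

```python
def folder_key(name):
    """Normalize a name into the trailing-slash form S3 tools render as a folder."""
    return name if name.endswith("/") else name + "/"

# Usage sketch (requires boto and AWS credentials; bucket name is made up):
# from boto.s3.connection import S3Connection
# bucket = S3Connection().get_bucket("my-bucket")
# key = bucket.new_key(folder_key("photos"))
# key.set_contents_from_string("")  # zero-byte object that tools like S3Fox show as a folder
```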

Boto - AWS SNS how to extract topic's ARN number

戏子无情 submitted on 2019-12-14 01:59:58
Question: When creating an AWS SNS topic: a = conn.create_topic(topicname) or getting the topics already created: a = conn.get_all_topics() the result is: {u'CreateTopicResponse': {u'ResponseMetadata': {u'RequestId': u'42b46710-degf-52e6-7d86-2ahc8e1c738c'}, u'CreateTopicResult': {u'TopicArn': u'arn:aws:sns:eu-west-1:467741034465:exampletopic'}}} The question is: how do you get the topic's ARN as a string, i.e. arn:aws:sns:eu-west-1:467741034465:exampletopic? Answer 1: When you create a new topic, boto returns a Python…
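Since boto returns the response as a plain nested dict, the ARN can be pulled out with ordinary key lookups. A small sketch, using the exact response shape quoted above:

```python
def topic_arn(response):
    """Extract the TopicArn string from boto's create_topic response dict."""
    return response["CreateTopicResponse"]["CreateTopicResult"]["TopicArn"]

# This dict mirrors the response quoted in the question.
resp = {
    "CreateTopicResponse": {
        "ResponseMetadata": {"RequestId": "42b46710-degf-52e6-7d86-2ahc8e1c738c"},
        "CreateTopicResult": {
            "TopicArn": "arn:aws:sns:eu-west-1:467741034465:exampletopic"
        },
    }
}
print(topic_arn(resp))  # arn:aws:sns:eu-west-1:467741034465:exampletopic
```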

Difficulty in finding the Region names in AWS rds

ε祈祈猫儿з submitted on 2019-12-13 18:45:31
Question: How can I get the names of all the RDS instances in AWS using a boto script? I want to write a Python script that fetches all the regions and then displays their DB instances. Answer 1: The following should give you all of the available regions for RDS: import boto.rds regions = boto.rds.regions() This returns a list of RegionInfo objects like this: [RegionInfo:us-east-1, RegionInfo:cn-north-1, RegionInfo:ap-northeast-1, RegionInfo:eu-west-1, RegionInfo:ap-southeast-1, RegionInfo:ap-southeast-2,…
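The region-then-instances loop the question asks for might look like this. The function takes the region list as an argument so the sketch stays self-contained; the commented usage (boto.rds.regions(), region.connect(), get_all_dbinstances()) assumes boto 2.x with AWS credentials configured.

```python
def list_db_instances(regions):
    """Yield (region_name, db_instance_id) for every RDS instance in each region."""
    for region in regions:
        conn = region.connect()          # an RDSConnection scoped to that region
        for db in conn.get_all_dbinstances():
            yield region.name, db.id

# Usage sketch (requires boto and AWS credentials):
# import boto.rds
# for region_name, db_id in list_db_instances(boto.rds.regions()):
#     print(region_name, db_id)
```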

Creating mTurk HIT from Layout with parameters using boto and python

不羁岁月 submitted on 2019-12-13 13:11:14
Question: I am attempting to use boto to generate a HIT on Mechanical Turk. The goal is to use a common layout already created on my mTurk account and pass it URLs of images to iteratively create HITs. The issue is that even with the parameter for the image URLs named correctly, boto does not succeed. My example code to create the HIT is: from boto.mturk.connection import MTurkConnection from boto.s3.connection import S3Connection from boto.mturk.layoutparam import LayoutParameter…
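A hedged sketch of the layout-based call in boto 2.x: the common pitfall is passing bare LayoutParameter objects instead of wrapping them in a LayoutParameters collection. The placeholder name "image_url" is a made-up assumption and must match the layout's placeholder exactly; a real call also needs the usual HIT fields (title, reward, and so on).

```python
def create_layout_hit(conn, layout_id, image_url):
    """Create one HIT from an existing mTurk layout (boto 2.x sketch).

    'image_url' is a hypothetical placeholder name: it must exactly match
    the placeholder defined in the layout on the mTurk account, and the
    LayoutParameter objects must be wrapped in a LayoutParameters
    collection before being passed to create_hit.
    """
    from boto.mturk.layoutparam import LayoutParameter, LayoutParameters
    params = LayoutParameters([LayoutParameter("image_url", image_url)])
    return conn.create_hit(hit_layout=layout_id, layout_params=params)
```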

Using boto, set content_type on files which are already present on s3

↘锁芯ラ submitted on 2019-12-13 11:43:19
Question: I'm using django-storages with the s3boto backend. As a result of this issue, http://code.larlet.fr/django-storages/issue/5/s3botostorage-set-content-type-header-acl-fixed-use-http-and-disable-query-auth-by I have a bunch of files (all of them) whose content type is 'application/octet-stream'. Given that I have an instance of <class 'boto.s3.key.Key'>, how can I set the content_type? In [29]: a.file.file.key.content_type Out[29]: 'application/octet-stream' In [30]: mimetypes.guess_type(a.file.file…
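S3 object metadata can't be edited in place, so the usual fix is to copy the key onto itself with replacement metadata. A hedged boto 2.x sketch; the copy call assumes a live boto.s3.key.Key backed by valid credentials.

```python
import mimetypes

def fix_content_type(key):
    """Re-copy an existing boto Key onto itself with a guessed Content-Type.

    S3 metadata is immutable once written, so an in-place copy with new
    metadata is the standard way to change it after upload. Sketch only:
    'key' is assumed to be a live boto.s3.key.Key.
    """
    content_type, _ = mimetypes.guess_type(key.name)
    if content_type:
        key.copy(key.bucket.name, key.name,
                 metadata={"Content-Type": content_type},
                 preserve_acl=True)
    return content_type
```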

aws/credentials setup and boto.exception

半腔热情 submitted on 2019-12-13 08:33:27
Question: Heroku is not serving the images from my AWS S3 bucket. I am using Django as the backend and have already migrated my app. Does Heroku automatically serve all the images from my PC (the way the local development server does), or do I have to upload all the images again? When I run heroku run python manage.py migrate I get this error; any ideas? boto.exception.S3ResponseError: S3ResponseError: 400 Bad Request My settings.py file is: """ Django settings for ecommerce project. Generated by 'django-admin startproject' using…

Can't get a file from Amazon S3 storage using curl with query string request authentication

℡╲_俬逩灬. submitted on 2019-12-13 07:15:40
Question: I'm trying to download a file from my S3 storage using a simple bash script that I found on the Internet. #!/bash/sh bucket='my_bucket_name' file_path='path_to_my_file' resource="/${bucket}/${file_path}" # set url time to expire expires=$(date +%s -d '4000 seconds') stringtoSign="GET\n\n\n${expires}\n${resource}" s3Key='s3Key_here' s3Secret='s3SecretKey_here' signature=`echo -en ${stringtoSign} | openssl sha1 -hmac ${s3Key} -binary | base64` curl -G https://${bucket}.s3.amazonaws.com/${file_path} \…
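For comparison, here is the same query-string signature (AWS Signature Version 2, which the script implements) computed in Python. One thing worth checking in scripts like the one above: the HMAC must be keyed with the secret key, not the access key ID. Names and values below are illustrative.

```python
import base64
import hashlib
import hmac

def sign_s3_get(secret_key, bucket, file_path, expires):
    """Return the base64 HMAC-SHA1 signature for a presigned S3 GET (SigV2).

    string_to_sign mirrors the bash script: method, two blank headers,
    expiry timestamp, and the /bucket/key resource path.
    """
    string_to_sign = "GET\n\n\n%d\n/%s/%s" % (expires, bucket, file_path)
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

# The signed URL then carries AWSAccessKeyId, Expires, and Signature as
# query parameters, as the curl -G invocation in the script does.
```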

EC2ResponseError: 401 Unauthorized using Saltstack boto_vpc module

馋奶兔 submitted on 2019-12-13 06:49:36
Question: I'm trying to create a VPC using SaltStack and the boto_vpc module. This is my state: vpc_create: module.run: - name: boto_vpc.create - cidr_block: '10.0.0.0/24' - vpc_name: 'myVpc' - region: 'us-east-1' - key: 'ADJJDNEJFJGNFKFKFKIW' - keyid: 'SJDJNFNEJUWLLLCLCLENNRBFLGSLSLKEMFUHE' The keys that I'm using are correct, but I get this error: [INFO ] Running state [boto_vpc.create] at time 14:25:35.839797 [INFO ] Executing state module.run for boto_vpc.create [ERROR ] EC2ResponseError: 401…

Getting AttributeError when trying to create DynamoDB table with global index using boto v2.25.0

无人久伴 submitted on 2019-12-13 04:35:15
Question: I am trying to create a DynamoDB table with a global secondary index, following the example here (the "# The full, minimum-extra-calls case." block under class boto.dynamodb2.table.Table...). I am using boto version 2.25.0. The exact code is: import boto from boto import dynamodb2 table = boto.dynamodb2.table.Table('myTable', schema=[HashKey('firstKey')], throughput={'read':5,'write':2}, global_indexes=[GlobalAllIndex('secondKeyIndex',parts=[HashKey('secondKey')],throughput={'read':5,'write':3}…
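An AttributeError here is consistent with HashKey and GlobalAllIndex never being imported: they live in boto.dynamodb2.fields, and new tables are created with Table.create rather than the Table constructor. A hedged sketch for boto 2.x; actually running it requires AWS credentials.

```python
def create_table_with_gsi(table_name):
    """Create a DynamoDB table with a global secondary index (boto 2.x sketch).

    Imports are done lazily so the sketch stays self-contained; the schema
    mirrors the question's firstKey / secondKeyIndex layout.
    """
    from boto.dynamodb2.fields import HashKey, GlobalAllIndex
    from boto.dynamodb2.table import Table
    return Table.create(
        table_name,
        schema=[HashKey("firstKey")],
        throughput={"read": 5, "write": 2},
        global_indexes=[
            GlobalAllIndex(
                "secondKeyIndex",
                parts=[HashKey("secondKey")],
                throughput={"read": 5, "write": 3},
            )
        ],
    )
```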

Can't get max result of spot price history - US-EAST region

ぐ巨炮叔叔 submitted on 2019-12-13 03:46:29
Question: When I retrieve the spot price history for "us-east-1f" or any zone in "us-east-1", the result is always fewer than 200 prices, even though I only need a single region and a single instance type. How can I retrieve a larger number of results? Example: ec2 = boto3.client('ec2') t=datetime.datetime.now() - datetime.timedelta(0) f=datetime.datetime.now() - datetime.timedelta(90) response= ec2.describe_spot_price_history(InstanceTypes =['c3.4xlarge'],ProductDescriptions = ['Linux/UNIX'], AvailabilityZone = 'us-east-1a',…
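describe_spot_price_history returns one page at a time and signals more data via NextToken, so a single call caps out early; boto3's paginator follows the tokens for you. A sketch, with the client passed in as an argument; the commented usage assumes boto3 and AWS credentials.

```python
def spot_price_history(ec2, **query):
    """Yield every SpotPriceHistory record, letting a paginator follow NextToken."""
    paginator = ec2.get_paginator("describe_spot_price_history")
    for page in paginator.paginate(**query):
        for record in page["SpotPriceHistory"]:
            yield record

# Usage sketch (requires boto3 and AWS credentials):
# import boto3
# ec2 = boto3.client("ec2")
# prices = list(spot_price_history(
#     ec2,
#     InstanceTypes=["c3.4xlarge"],
#     ProductDescriptions=["Linux/UNIX"],
#     AvailabilityZone="us-east-1a"))
```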