boto

PermanentRedirect when calling the PutObject operation

女生的网名这么多〃 submitted on 2019-12-08 02:54:18
Question: The code below works locally and uploads files from a directory to S3. It's using Boto3 with Python 3.

    s3 = boto3.resource('s3', aws_access_key_id=AWS_ACCESS_KEY_ID,
                        aws_secret_access_key=AWS_ACCESS_KEY_SECRET)
    bucket = s3.Bucket(bucket_name)
    uploadFileNames = []
    for (sourceDir, dirname, filenames) in os.walk(sourceDir):
        for filename in filenames:
            bucket.put_object(Key=filename, Body=open("{}{}".format(sourceDir, filename), "rb"))
        break

My problem is that when I run the same code on my
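The question is cut off above, but a PermanentRedirect on PutObject usually means the request was sent to an S3 endpoint in a different region than the bucket. A minimal sketch of one common fix, assuming the bucket lives in eu-west-1 and reusing the question's credential variables:

    import boto3

    # credentials and bucket_name are the same variables used in the question's snippet
    s3 = boto3.resource(
        's3',
        region_name='eu-west-1',  # assumed region; use the bucket's real region
        aws_access_key_id=AWS_ACCESS_KEY_ID,
        aws_secret_access_key=AWS_ACCESS_KEY_SECRET,
    )
    bucket = s3.Bucket(bucket_name)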

Why are my compressed files on S3 returning a 403 Forbidden error?

混江龙づ霸主 submitted on 2019-12-08 02:53:34
Question: I'm using django-compressor and django-storages to serve my compressed files on S3 (using these instructions: http://django_compressor.readthedocs.org/en/latest/remote-storages/#using-staticfiles). It works great initially after running the "compress" management command, but after about one hour the compressed CSS and JS files return a 403 Forbidden error even though I haven't made any changes to the files. I can't seem to isolate the problem, so any help would be appreciated. Here are the
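The post is truncated above, but the roughly one-hour expiry matches a known django-storages behaviour: generated URLs are signed query-string URLs that expire. A hedged sketch of the usual settings change, assuming the compressed assets may be publicly readable:

    # settings.py -- hedged sketch, not taken from the original post
    AWS_QUERYSTRING_AUTH = False     # emit plain URLs instead of expiring signed ones
    AWS_DEFAULT_ACL = 'public-read'  # assumption: the compressed assets can be world-readable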

Assign ansible vars based on AWS tags

♀尐吖头ヾ submitted on 2019-12-07 20:19:22
Question: I'm trying to figure out a way to assign variables in Ansible based on tags I have in AWS. I was experimenting with ec2_remote_tags, but it's returning a lot more information than I need. It seems like there should be an easier way to do this and I'm just not thinking of it. For example, I have a tag called function that creates the tag_function_api group via dynamic inventory, and I want to assign a variable named function the value api. Any ideas on an efficient way to do this? Answer 1: I've
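The answer is cut off above. As a hedged side note on the boto side: if the goal is to read just one tag instead of everything ec2_remote_tags returns, a small boto3 sketch (region and instance id are placeholders) can fetch only the function tag:

    import boto3

    ec2 = boto3.client('ec2', region_name='us-east-1')  # placeholder region
    resp = ec2.describe_tags(Filters=[
        {'Name': 'resource-id', 'Values': ['i-0123456789abcdef0']},  # placeholder instance id
        {'Name': 'key', 'Values': ['function']},
    ])
    function = resp['Tags'][0]['Value'] if resp['Tags'] else None  # e.g. "api"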

Boto: Dynamically get aws_access_key_id and aws_secret_access_key in Python code from config?

强颜欢笑 submitted on 2019-12-07 19:44:22
Question: I have my aws_access_key_id and aws_secret_access_key stored in ~/.boto and was wondering if there is a way to retrieve these values in my Python code using Boto, as I need to insert them into my SQL statement to copy a CSV file from S3.

Answer 1: This should work:

    import boto
    access_key = boto.config.get_value('Credentials', 'aws_access_key_id')
    secret_key = boto.config.get_value('Credentials', 'aws_secret_access_key')

Answer 2: Here's a helper that will look in ~/.aws/credentials if boto
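Answer 2 is cut off above. As an additional hedged sketch, boto3 can expose whatever credentials its standard lookup chain resolves (environment variables, ~/.aws/credentials, and so on):

    import boto3

    session = boto3.Session()
    creds = session.get_credentials()  # None if no credentials were found
    access_key = creds.access_key
    secret_key = creds.secret_key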

Getting Credentials File in the boto.cfg for Python

好久不见. submitted on 2019-12-07 16:23:53
Question: I'm using AWS for the first time and have just installed boto for Python. I'm stuck at the step where it advises: "You can place this file either at /etc/boto.cfg for system-wide use or in the home directory of the user executing the commands as ~/.boto." Honestly, I have no idea what to do. First, I can't find boto.cfg, and second, I'm not sure which command to execute for the second option. Also, when I deploy the application to my server, I'm assuming I need to do the same thing there
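A minimal sketch of what that step amounts to: ~/.boto is an INI file you create yourself with a [Credentials] section (the key values below are placeholders). It is written from Python here, but a text editor works just as well:

    import os

    config_text = (
        "[Credentials]\n"
        "aws_access_key_id = YOUR_ACCESS_KEY_ID\n"
        "aws_secret_access_key = YOUR_SECRET_ACCESS_KEY\n"
    )
    with open(os.path.expanduser("~/.boto"), "w") as f:
        f.write(config_text)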

How do I loop over all items in a DynamoDB table using boto?

梦想的初衷 submitted on 2019-12-07 11:36:30
Question: I'd like to query a DynamoDB table, retrieve all the items, and loop over them using boto. How do I structure a query or scan that returns everything in the table? Answer 1: Preliminary support for the Scan API was added to boto's layer2 for DynamoDB by Chris Moyer in commit 522e0548 ("Added scan to layer2 and Table") and has since been updated by Mitch Garnaat in commit adeb7151 ("Cleaned up the scan method on Layer2 and Table.") to hide the layer1 details and enable intuitive querying
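The answer is cut off above. As an illustrative sketch with boto3 rather than layer2, a full scan has to follow LastEvaluatedKey because each Scan call returns at most 1 MB ('my-table' is a placeholder name):

    import boto3

    table = boto3.resource('dynamodb').Table('my-table')
    response = table.scan()
    items = response['Items']
    while 'LastEvaluatedKey' in response:
        response = table.scan(ExclusiveStartKey=response['LastEvaluatedKey'])
        items.extend(response['Items'])

    for item in items:
        print(item)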

Python/Boto 3: How to retrieve/download files from AWS S3?

浪尽此生 submitted on 2019-12-07 09:23:15
Question: In Python/Boto 3, I found out that to download a single file from S3 to local I can do the following:

    bucket = self._aws_connection.get_bucket(aws_bucketname)
    for s3_file in bucket.list():
        if filename == s3_file.name:
            self._downloadFile(s3_file, local_download_directory)
            break

And to download all files under one chosen directory:

    else:
        bucket = self._aws_connection.get_bucket(aws_bucketname)
        for s3_file in bucket.list():
            self._downloadFile(s3_file, local_download_directory)

And helper
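The snippets in the question are actually boto 2 style (get_bucket, bucket.list). A hedged Boto 3 sketch of both cases, with the bucket name, key, and prefix as placeholders:

    import os
    import boto3

    s3 = boto3.resource('s3')
    bucket = s3.Bucket('my-bucket')  # placeholder bucket name

    # download a single object
    bucket.download_file('remote/key.txt', '/tmp/key.txt')

    # download everything under a prefix
    for obj in bucket.objects.filter(Prefix='remote/'):
        if obj.key.endswith('/'):
            continue  # skip "directory" placeholder keys
        bucket.download_file(obj.key, os.path.join('/tmp', os.path.basename(obj.key)))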

How can I check that an AWS S3 bucket exists?

与世无争的帅哥 submitted on 2019-12-07 07:25:33
Question: Simple question here ... how can I check with boto that an AWS bucket exists? ... preferably by providing the path? ... Here is the approach I feel like taking:

    def bucket_exists(self, bucket_name):
        connection = boto.s3.connection.S3Connection('<aws access key>', '<aws secret key>')
        buckets = connection.get_all_buckets()
        for bucket in buckets:
            bucket_name = bucket.name
            # Bucket existence logic here
            # submit boto request, i.e.:
            exists = boto.get_bucket(bucket_name, validate=True)
            if exists:
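The snippet above is truncated. A shorter hedged sketch in the same boto 2 style: S3Connection.lookup() returns None instead of raising when the bucket does not exist (credentials are assumed to come from ~/.boto or the environment):

    import boto

    def bucket_exists(bucket_name):
        connection = boto.connect_s3()
        return connection.lookup(bucket_name) is not None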

Can a CloudWatch alarm be defined for a metric over many dimensions

谁说胖子不能爱 submitted on 2019-12-07 07:05:10
Question: I'm using Python and boto for CloudWatch metrics. I would like to be able to define an alarm for a MetricName that will be active across all the other dimensions. For instance, I have a metric in the sandbox namespace with a MetricName of MemoryUsage and an InstanceId of i-xxx. Is it possible to define a single alarm that will be triggered for MemoryUsage across all InstanceId dimensions? Answer 1: Yes, you can create an alarm for any single metric. In this case, the single metric has a dimension that
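The answer is truncated above. As an illustrative boto3 sketch: because a metric with an InstanceId dimension is a distinct metric per instance, a common approach is one alarm per instance (region, instance ids, threshold, and period below are placeholders):

    import boto3

    cloudwatch = boto3.client('cloudwatch', region_name='us-east-1')  # placeholder region
    for instance_id in ['i-aaa111', 'i-bbb222']:  # placeholder instance ids
        cloudwatch.put_metric_alarm(
            AlarmName='MemoryUsage-high-{}'.format(instance_id),
            Namespace='sandbox',
            MetricName='MemoryUsage',
            Dimensions=[{'Name': 'InstanceId', 'Value': instance_id}],
            Statistic='Average',
            Period=300,
            EvaluationPeriods=1,
            Threshold=80.0,
            ComparisonOperator='GreaterThanThreshold',
        )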

Setting specific permission in amazon s3 boto bucket

风格不统一 submitted on 2019-12-07 06:30:38
Question: I have a bucket called 'ben-bucket', and inside that bucket I have multiple files. I want to be able to set permissions for each file's URL. I'm not too sure, but I'm assuming that if I wanted the URL for each file inside the bucket, the URL would look like this:

    https://ben-bucket.s3.amazonaws.com/<file_name>

So basically, I want to set public access on that URL. How would I do it? I tried this and it doesn't work:

    bucket = s3.Bucket('ben-bucket').Object('db.sqlite')
    bucket.BucketAcl('public-read')
    print bucket
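A hedged sketch of a fix: a boto3 Object exposes its ACL through the Acl() sub-resource, and put() applies a canned ACL. The bucket and key names match the question:

    import boto3

    s3 = boto3.resource('s3')
    obj = s3.Bucket('ben-bucket').Object('db.sqlite')
    obj.Acl().put(ACL='public-read')  # make just this object publicly readable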