boto

How can I programmatically check Amazon S3 permissions with boto?

梦想与她 submitted on 2019-12-12 08:00:08
Question: We have a bushy tree in a bucket on Amazon S3 with a large number of files. I just discovered that while some files have two permission entries (as seen by clicking a file in the AWS Management Console, then Properties -> Permissions), one line being "everyone" and the other some specific user, other files have just one entry for that user. As a result, we're having issues downloading those files to Amazon EC2 instances using boto or curl. What I need to do is go over all files in the …
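A minimal sketch of one way to audit this with boto 2, assuming the "everyone" entry corresponds to the AllUsers group grant (the bucket name is a placeholder):

import boto

conn = boto.connect_s3()
bucket = conn.get_bucket('my-bucket')  # placeholder bucket name
for key in bucket.list():
    policy = key.get_acl()
    uris = [grant.uri for grant in policy.acl.grants]
    if 'http://acs.amazonaws.com/groups/global/AllUsers' not in uris:
        # This key lacks the "everyone" entry seen in the console.
        print(key.name)

From there, key.add_user_grant() or a bucket policy could bring the stragglers into line, but the loop above is the audit step the question asks for.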

How to retrieve AMI platform?

心不动则不痛 submitted on 2019-12-12 05:53:20
Question: How do I retrieve the platform (Windows, Linux/Unix) value for an instance/AMI? The AWS Console shows the platform value for an AMI. I have dumped the __dict__ of the AMI object, but it does not show the correct platform. Answer 1: There is an attribute of the Image object called platform. This attribute has the value 'windows' if the image is a Windows image, or None if it is Linux. For example:

import boto.ec2

c = boto.ec2.connect_to_region('us-east-1')
images = c.get_all_images(owners=['self'])
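The answer is cut off; a sketch of how the loop over those images presumably continues, mapping None to Linux as the answer describes:

import boto.ec2

c = boto.ec2.connect_to_region('us-east-1')
for image in c.get_all_images(owners=['self']):
    # image.platform is 'windows' for Windows AMIs, None otherwise
    print(image.id, image.platform or 'linux/unix')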

Backup DynamoDB Table with dynamic columns to S3

江枫思渺然 submitted on 2019-12-12 05:29:56
Question: I have read several other posts about this, in particular this question with an answer by greg about how to do it in Hive. How do I account for DynamoDB tables with a variable number of columns, though? That is, the original DynamoDB table has rows that were added dynamically with different columns. I have tried to view the exportDynamoDBToS3 script that Amazon uses in their Data Pipeline service, but it has code like the following, which does not seem to map the columns: -- …
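Outside of Hive, one schema-agnostic workaround is to serialize every item to JSON, since JSON lines need no fixed column mapping. A rough sketch with boto 2 (the table, bucket, and key names are hypothetical, and dict(item) is assumed to expose the item's attributes):

import json
import boto
from boto.dynamodb2.table import Table

table = Table('my-table')  # hypothetical table name
# default=str papers over Decimal and set values DynamoDB may return
lines = [json.dumps(dict(item), default=str) for item in table.scan()]

s3 = boto.connect_s3()
bucket = s3.get_bucket('my-backup-bucket')  # hypothetical bucket
key = bucket.new_key('backups/my-table.json')
key.set_contents_from_string('\n'.join(lines))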

Amazon S3 - Unable to create a datasource

雨燕双飞 submitted on 2019-12-12 05:05:56
Question: I tried creating a datasource using boto for Amazon Machine Learning but ended up with an error. Here's my code:

import boto

bucketname = 'mybucket'
filename = 'myfile.csv'
schema = 'myfile.csv.schema'
conn = boto.connect_s3()
datasource = 'my_datasource'
ml = boto.connect_machinelearning()

# create a data source
ds = ml.create_data_source_from_s3(
    data_source_id=datasource,
    data_spec={
        'DataLocationS3': 's3://' + bucketname + '/' + filename,
        'DataSchemaLocationS3': 's3://' + bucketname + '/' + schema},
    data …
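For reference, a hedged completion of that call as I'd expect it to look with boto 2's layer-1 client (compute_statistics is an assumption based on the API's optional parameters; the names are the question's placeholders):

import boto

bucketname = 'mybucket'
filename = 'myfile.csv'
schema = 'myfile.csv.schema'
datasource = 'my_datasource'

ml = boto.connect_machinelearning()
ds = ml.create_data_source_from_s3(
    data_source_id=datasource,
    data_spec={
        'DataLocationS3': 's3://%s/%s' % (bucketname, filename),
        'DataSchemaLocationS3': 's3://%s/%s' % (bucketname, schema),
    },
    compute_statistics=True)
print(ds)  # the response carries the new data source's metadata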

How to dynamically select storage option for models.FileField?

谁都会走 submitted on 2019-12-12 04:07:22
Question: Depending on the file extension, I want the file to be stored in a specific AWS bucket. I tried passing a function to the storage option, similar to how upload_to is dynamically defined. However, this doesn't give the desired result: in my template, when I href to document.docfile.url, the link doesn't work. Checking in the shell, I see this:

Document.objects.all()[0].docfile.storage.bucket
<Bucket: <function aws_bucket at 0x110672050>>
Document.objects.all()[0].docfile.storage.bucket …
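That repr shows the function object itself ended up as the bucket, i.e. the callable was stored rather than called. One workaround is a small Storage subclass that routes each operation to one of several S3 storages by extension. A sketch, assuming django-storages' S3BotoStorage accepts a bucket argument and using two hypothetical buckets:

from django.core.files.storage import Storage
from storages.backends.s3boto import S3BotoStorage  # assumes django-storages

class ExtensionRoutedStorage(Storage):
    # Delegate each operation to a bucket chosen by file extension.
    def __init__(self):
        self.docs = S3BotoStorage(bucket='my-docs-bucket')    # hypothetical
        self.other = S3BotoStorage(bucket='my-media-bucket')  # hypothetical

    def _route(self, name):
        return self.docs if name.lower().endswith(('.doc', '.docx')) else self.other

    def _open(self, name, mode='rb'):
        return self._route(name).open(name, mode)

    def _save(self, name, content):
        return self._route(name).save(name, content)

    def exists(self, name):
        return self._route(name).exists(name)

    def url(self, name):
        return self._route(name).url(name)

Declaring docfile = models.FileField(upload_to='documents/', storage=ExtensionRoutedStorage()) should then yield working .url values, since the routing happens per call rather than at field definition time.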

filtering ec2 instances by associated IAM role with boto

风流意气都作罢 submitted on 2019-12-12 04:02:32
Question: I have a few instances on AWS that are associated with the same IAM role. I'm looking to write code that returns these instances. Based on this document: http://docs.aws.amazon.com/AWSEC2/latest/APIReference/API_DescribeInstances.html, I see that there is an available filter, iam-instance-profile.arn. I'm just not sure how to use it, or whether it is what I should be using. Here is an example where instances are filtered by tags:

conn = boto.ec2.connect_to_region('ap …
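Since get_only_instances() takes the same filters dict used for tag filtering, the instance-profile filter from that document should slot straight in; a sketch (region and ARN are placeholders):

import boto.ec2

conn = boto.ec2.connect_to_region('ap-southeast-2')  # placeholder region
role_arn = 'arn:aws:iam::123456789012:instance-profile/my-role'  # placeholder

instances = conn.get_only_instances(
    filters={'iam-instance-profile.arn': role_arn})
for instance in instances:
    print(instance.id, instance.state)

Note the filter matches the instance profile's ARN, not the role name itself, so the profile ARN has to be looked up (or constructed) first.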

Unable to Create a CloudWatch Healthcheck via Ansible

爱⌒轻易说出口 submitted on 2019-12-12 03:47:03
Question: I have an inventory file which has an RDS endpoint:

[ems_db]
syd01-devops.ce4l9ofvbl4z.ap-southeast-2.rds.amazonaws.com

I wrote the following playbook to create a CloudWatch alarm:

---
- name: Get instance ec2 facts
  debug: var=groups.ems_db[0].split('.')[0]
  register: ems_db_name

- name: Display
  debug: var=ems_db_name

- name: Create CPU utilization metric alarm
  ec2_metric_alarm:
    state: present
    region: "{{aws_region}}"
    name: "{{ems_db_name}}-cpu-util"
    metric: "CPUUtilization"
    namespace: …
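One likely culprit: register on a debug task stores the whole task result dictionary, so "{{ems_db_name}}-cpu-util" renders that dict rather than the hostname. A hedged sketch of the usual fix, assigning the string with set_fact instead (the namespace, threshold, and actions below are assumptions, not taken from the question):

- name: Extract the RDS instance name from the endpoint
  set_fact:
    ems_db_name: "{{ groups.ems_db[0].split('.')[0] }}"

- name: Create CPU utilization metric alarm
  ec2_metric_alarm:
    state: present
    region: "{{ aws_region }}"
    name: "{{ ems_db_name }}-cpu-util"
    metric: CPUUtilization
    namespace: AWS/RDS
    statistic: Average
    comparison: ">="
    threshold: 80.0
    period: 300
    evaluation_periods: 1
    dimensions:
      DBInstanceIdentifier: "{{ ems_db_name }}"
    alarm_actions: []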

How to Launch an exact replica of an EC2 instance in a VPC from the AMI of a previous EC2 instance

寵の児 submitted on 2019-12-12 03:39:20
Question: I have an EC2 instance A which must NOT be rebooted, but it's going to be down for maintenance. I create an AMI of this instance with the following code:

import boto.ec2
import time
import sys

conn = boto.ec2.connect_to_region("ap-southeast-1")
image_id = conn.create_image(sys.argv[1], "nits", description="Testing",
                             no_reboot=True, block_device_mapping=None,
                             dry_run=False)
image = conn.get_all_images(image_ids=[image_id])[0]
while image.state != 'available':
    time …
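A sketch of how the wait loop might finish and how the replica could then be launched into a VPC with boto 2 (the key name, instance type, and subnet ID are placeholders):

import time
import boto.ec2

conn = boto.ec2.connect_to_region('ap-southeast-1')
image = conn.get_all_images(image_ids=['ami-12345678'])[0]  # the AMI from above

while image.state != 'available':
    time.sleep(10)
    image.update()  # refresh the image's state from the API

reservation = conn.run_instances(
    image.id,
    key_name='my-key',            # placeholder
    instance_type='t2.micro',     # placeholder
    subnet_id='subnet-abcd1234')  # placeholder VPC subnet
print(reservation.instances[0].id)

With no_reboot=True the source instance stays up while the image is taken, at the cost of the AMI not being guaranteed filesystem-consistent.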

EC2 Instance without any attached Volume?

∥☆過路亽.° submitted on 2019-12-12 03:36:25
Question: Is it possible to have instances without any volume (root device or attached volume)? Let's say the instance IDs are [i-120cd3fe, i-23e46634]. Is it possible for any instance in AWS to have no attached volume?

conn = get_ec2_connection(region=region)
instances = conn.get_only_instances(instance_ids=instance_ids)
volumes_list = []
for instance in instances:
    dev_mappings = instance.block_device_mapping
    for block_device in dev_mappings.keys():
        volume_id = dev_mappings[block_device].volume …
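Instance-store-backed instances carry no EBS volumes, so an empty mapping is possible and the loop should guard against it. A defensive sketch with boto 2 (the region is a placeholder):

import boto.ec2

conn = boto.ec2.connect_to_region('us-east-1')  # placeholder region
for instance in conn.get_only_instances():
    mappings = instance.block_device_mapping or {}
    if not mappings:
        # e.g. an instance-store-backed instance: nothing EBS-attached
        print(instance.id, 'has no EBS volumes; root device type:',
              instance.root_device_type)
    else:
        for device, bdt in mappings.items():
            print(instance.id, device, bdt.volume_id)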

How to combine boto with fabric

好久不见. submitted on 2019-12-12 02:22:56
Question: I should mention that I use Windows. I know how to use boto, but I've hit the problem that I can't run "sudo" through boto:

status, stdout, stderr = ssh_client.run('sudo python killerparser.py')

The error is:

sudo: sorry, you must have a tty to run sudo

Then I tried:

status, stdout, stderr = ssh_client.run('ssh -t localhost sudo python killerparser.py')

But now the error becomes:

'Pseudo-terminal will not be allocated because stdin is not a terminal.\r\nHost key verification …
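The first error is the classic symptom of a Defaults requiretty line in the remote /etc/sudoers: sudo refuses to run because no pseudo-terminal was allocated. Fabric's sudo() requests a pty by default, which is one way around it without editing sudoers. A sketch with the Fabric 1.x API (the host, user, and key path are placeholders):

from fabric.api import env, sudo

env.host_string = 'ec2-user@ec2-203-0-113-10.compute-1.amazonaws.com'  # placeholder
env.key_filename = 'C:/keys/mykey.pem'  # placeholder; works from Windows too

def run_parser():
    # Fabric allocates a pseudo-terminal by default, satisfying requiretty.
    sudo('python killerparser.py')

Running fab run_parser then executes the command under sudo, with boto left to handle provisioning and Fabric handling the SSH layer.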