boto

Update DynamoDB Atomic Counter with Python / Boto

北战南征 submitted on 2019-12-09 09:53:59
Question: I am trying to update an atomic counter with Python Boto 2.3.0, but I can find no documentation for the operation. It seems there is no direct interface, so I tried to do "raw" updates using the layer1 interface, but I was unable to complete even a simple update. I tried the following variations, all with no luck: dynoConn.update_item(INFLUENCER_DATA_TABLE, {'HashKeyElement': "9f08b4f5-d25a-4950-a948-0381c34aed1c"}, {'new': {'Value': {'N':"1"}, 'Action': "ADD"}}) dynoConn.update_item
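For reference, a minimal sketch of an atomic counter increment using the newer boto3 API rather than the layer1 calls above; the table name, key attribute, and counter attribute here are hypothetical:

import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("influencers")  # hypothetical table name

# ADD performs the increment server-side, so no read-modify-write race
table.update_item(
    Key={"id": "9f08b4f5-d25a-4950-a948-0381c34aed1c"},
    UpdateExpression="ADD #c :inc",
    ExpressionAttributeNames={"#c": "views"},   # hypothetical counter attribute
    ExpressionAttributeValues={":inc": 1},
)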

With boto, how can I name a newly spawned EC2 instance?

时光毁灭记忆、已成空白 submitted on 2019-12-09 07:44:42
Question: I'm using boto to spawn a new EC2 instance based on an AMI. The ami.run method has a number of parameters, but none for "name" - maybe it's called something different? Answer 1: import boto c = boto.connect_ec2(ec2_key, ec2_secret) image = c.get_image(ec2_ami) reservation = image.run(key_name=ec2_keypair, security_groups=ec2_secgroups, instance_type=ec2_instancetype) instance = reservation.instances[0] c.create_tags([instance.id], {"Name": instance_name}) Answer 2: In EC2 there's no API to change the
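The same approach in boto3, for comparison: the "name" shown in the console is just a Name tag on the instance. The AMI id, key pair, and tag value below are placeholders:

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI
    InstanceType="t3.micro",
    KeyName="my-keypair",              # placeholder key pair
    MinCount=1,
    MaxCount=1,
)
instance_id = resp["Instances"][0]["InstanceId"]

# The EC2 console displays the "Name" tag as the instance name
ec2.create_tags(Resources=[instance_id],
                Tags=[{"Key": "Name", "Value": "my-instance"}])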

How do I get the most recent Cloudwatch metric data for an instance using Boto?

情到浓时终转凉″ submitted on 2019-12-09 04:16:37
Question: I'm trying to get the most recent CPU utilization data for an instance (actually several instances, but just one to start with); however, the following call doesn't return any data: cw = boto.cloudwatch.connect_to_region(Region) cw.get_metric_statistics( 300, datetime.datetime.now() - datetime.timedelta(seconds=600), datetime.datetime.now(), 'CPUUtilization', 'AWS/EC2', 'Average', dimensions={'InstanceId':['i-11111111']} # for stats across multiple instances: # dimensions={'InstanceId':[
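A common pitfall with this call is passing local timestamps: CloudWatch expects UTC, so datetime.now() on a machine whose clock is offset from UTC can produce a query window containing no datapoints. A boto3 sketch that requests the last ten minutes in UTC and picks the newest datapoint (the instance id is the placeholder from the question):

import boto3
from datetime import datetime, timedelta, timezone

cw = boto3.client("cloudwatch", region_name="us-east-1")
end = datetime.now(timezone.utc)
start = end - timedelta(minutes=10)

resp = cw.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-11111111"}],
    StartTime=start,
    EndTime=end,
    Period=300,
    Statistics=["Average"],
)
# Datapoints are not guaranteed to be ordered; sort and take the newest
datapoints = sorted(resp["Datapoints"], key=lambda d: d["Timestamp"])
latest = datapoints[-1] if datapoints else None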

Boto s3 error. BucketAlreadyOwnedByYou

瘦欲@ submitted on 2019-12-09 03:39:15
Question: Why do I get this error with S3 and boto? <Error><Code>BucketAlreadyOwnedByYou</Code><Message>Your previous request to create the named bucket succeeded and you already own it.</Message><BucketName>rtbhui</BucketName><RequestId>84115D3E9513F3C9</RequestId><HostId>+3TxrA34xHcSx0ecOD3pseRnE+LwUv3Ax1Pvp3PFoE8tHfOcn5BXyihc9V/oJx2g</HostId></Error> s3 = boto.connect_s3(parms['AWS_ACCESS_KEY_ID'], parms['AWS_SECRET_ACCESS_KEY']) bucket = s3.create_bucket(bucket_name) k = Key(bucket) #bucket is
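The error itself means the bucket already exists and is owned by the caller, so a repeated create_bucket call fails in regions other than us-east-1. One way to make the call safe to repeat, sketched with boto3 and the bucket name from the question (the region is a placeholder):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3", region_name="eu-west-1")  # placeholder region

try:
    s3.create_bucket(
        Bucket="rtbhui",
        CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
    )
except ClientError as e:
    # Re-creating a bucket you already own raises this outside us-east-1;
    # treating it as "already exists" makes the call idempotent.
    if e.response["Error"]["Code"] != "BucketAlreadyOwnedByYou":
        raise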

How to change aws-ec2 instance type?

邮差的信 submitted on 2019-12-08 21:00:40
I want to change the AWS EC2 instance type (e.g. from micro to large, or vice versa) using Boto3. What factors need to be taken care of while changing the instance type of EC2 instances? Here is my code: def get_ec2_boto3_connection(region, arn): sess = Boto3Connecton.get_boto3_session(arn) ec2_conn = sess.client(service_name='ec2', region_name=region) return ec2_conn def change_instance_type(arn, region): ec2_conn = get_ec2_boto3_connection(region, arn) ec2_conn.modify_instance_attribute(InstanceId=id, Attribute='instanceType', InstanceType={ 'Value': 'm4.large' }) What are the accountable
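A sketch of the usual flow in boto3: the instance has to be stopped before its type can be changed, then started again. The instance id and target type below are placeholders:

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
instance_id = "i-0123456789abcdef0"  # placeholder

ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

# InstanceType is passed directly; no separate Attribute argument is needed
ec2.modify_instance_attribute(
    InstanceId=instance_id,
    InstanceType={"Value": "m4.large"},
)

ec2.start_instances(InstanceIds=[instance_id])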

Amazon S3 Python S3Boto 403 Forbidden When Signature Has '+' sign

做~自己de王妃 submitted on 2019-12-08 19:42:18
Question: I am using Django and S3Boto, and whenever a signature has a '+' sign in it, I get a 403 Forbidden. If there is no '+' sign in the signature, I get the resource just fine. What could be wrong here? UPDATE: The repo is at https://github.com/boto/boto and the files concerned are boto/utils.py and boto/s3/connection.py. NOTE: I am quite new to Python. I tried modifying the code, but I still can't get the encoding done properly. Answer 1: In a nutshell, the problem is not in S3Boto but in some call to
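For illustration only, the kind of encoding issue involved: a base64 signature can contain '+', and if it is placed in a query string unencoded the server decodes it as a space, so the signature no longer matches. A small sketch (the signature value and URL are made up):

from urllib.parse import quote

signature = "abc+def/ghi="            # made-up base64 signature
encoded = quote(signature, safe="")   # 'abc%2Bdef%2Fghi%3D'

# Left unencoded, the '+' would arrive at S3 as a space and fail with 403
url = ("https://example-bucket.s3.amazonaws.com/key"
       "?AWSAccessKeyId=EXAMPLE&Expires=1700000000&Signature=" + encoded)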

How to connect to S3 in python and download a csv

我是研究僧i submitted on 2019-12-08 14:08:43
Question: I want to connect to a private S3 bucket and download a CSV in Python. How do I do this? I see a lot of comments about boto3, so this is what I've tried, and it is failing: from boto3.session import Session import pandas as pd import boto3 ACCESS_KEY='A' SECRET_KEY='s/' session = Session(aws_access_key_id=ACCESS_KEY, aws_secret_access_key=SECRET_KEY) s3 = session.resource('s3') obj = s3.get_object(Bucket='sp-dps', Key='da-la/hp/hp_co/current') df = pd.read_csv(obj['Body']) Answer 1: import
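A sketch of the corrected call: get_object is a method of the S3 client, not the resource, which is why the snippet above fails. Credentials are kept as placeholders, and the bucket/key are those from the question:

import boto3
import pandas as pd

s3 = boto3.client(
    "s3",
    aws_access_key_id="ACCESS_KEY_PLACEHOLDER",
    aws_secret_access_key="SECRET_KEY_PLACEHOLDER",
)

# get_object exists on the client; with the resource you would use
# s3.Object(bucket, key).get() instead
obj = s3.get_object(Bucket="sp-dps", Key="da-la/hp/hp_co/current")
df = pd.read_csv(obj["Body"])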

Copying/using Python files from S3 to Amazon Elastic MapReduce at bootstrap time

孤街浪徒 submitted on 2019-12-08 13:37:43
Question: I've figured out how to install Python packages (numpy and such) at the bootstrapping step using boto, as well as how to copy files from S3 to my EC2 instances, still with boto. What I haven't figured out is how to distribute Python scripts (or any file) from S3 buckets to each EMR instance using boto. Any pointers? Answer 1: If you are using boto, I recommend packaging all your Python files in an archive (.tar.gz format) and then using the cacheArchive directive in Hadoop/EMR to access it. This is
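One way to wire this up with boto3 (the question predates it) is a bootstrap action that runs a small shell script stored in S3, which in turn copies whatever Python files each node needs. Everything below is a placeholder sketch, not the poster's setup, and most cluster configuration is omitted:

import boto3

emr = boto3.client("emr", region_name="us-east-1")

emr.run_job_flow(
    Name="example-cluster",
    ReleaseLabel="emr-6.10.0",
    Instances={
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 3,
    },
    BootstrapActions=[{
        "Name": "copy-python-scripts",
        "ScriptBootstrapAction": {
            # copy_scripts.sh might simply run:
            #   aws s3 cp s3://my-bucket/scripts/ . --recursive
            "Path": "s3://my-bucket/bootstrap/copy_scripts.sh",
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)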

Django won't serve static files from Amazon S3 with custom domain

孤街浪徒 submitted on 2019-12-08 04:43:08
Question: I set up my Django project, DNS, and bucket on Amazon S3, but python manage.py collectstatic does not work, and neither do files uploaded manually. AWS S3 settings: Bucket name: files.domain.com Bucket policy: { "Id": "Policy1483363850641", "Version": "2012-10-17", "Statement": [ { "Sid": "Stmt1483363848944", "Action": "s3:*", "Effect": "Allow", "Resource": "arn:aws:s3:::files.domain.com/*", "Principal": "*" } ] } DNS settings: files.domain.com -> CNAME -> files.domain.com.s3.amazonaws.com
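For context, a typical django-storages configuration for serving static files from a custom-domain bucket, assuming the S3Boto3 backend; only the bucket/domain name comes from the question, the rest is a generic sketch:

# settings.py (sketch)
AWS_STORAGE_BUCKET_NAME = "files.domain.com"
AWS_S3_CUSTOM_DOMAIN = "files.domain.com"  # the CNAME from the DNS settings
STATICFILES_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
STATIC_URL = "https://%s/" % AWS_S3_CUSTOM_DOMAIN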

Correct way to get output of run_pty from a boto sshclient

人走茶凉 submitted on 2019-12-08 03:57:48
Question: I am trying to execute a remote command on an EC2 instance that needs sudo. Example code snippet: conn = boto.ec2.connect_to_region(....) instance = conn.get_only_instances(instance_ids=instance_id)[0] ssh_client = sshclient_from_instance(instance, ssh_key_file='path.to.pem', user_name='ec2-user') chan = ssh_client.run_pty('sudo ls /root') Using just ssh_client.run() returns a tuple that was easy to deal with but doesn't allow sudo. run_pty returns a paramiko.channel.Channel, and I can use
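Since run_pty hands back a paramiko Channel, the output has to be read from the channel itself. A minimal sketch, assuming the ssh_client from the snippet above and paramiko's Channel API:

chan = ssh_client.run_pty('sudo ls /root')
chan.settimeout(10.0)          # avoid blocking forever if nothing arrives

output = b""
while True:
    data = chan.recv(1024)     # returns b"" once the remote side closes
    if not data:
        break
    output += data

print(output.decode())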