boto3

How to query AMI images from the AWS console based on their status (Available), using Python boto3?

霸气de小男生 submitted on 2020-01-14 05:44:08
Question: I need to get the details of AMI images from the AWS console based on their state: Available. When I try the code below, it gets stuck and never prints a line.

    import boto3

    conn = boto3.resource('ec2')
    image = conn.describe_images()
    print(image)  # prints nothing

    image_count = []
    for img in image:
        image_count.append(img)
    print("img count ->" + str(len(image_count)))  # prints nothing

Are there exact keywords for querying these AMI images? Please correct me.

Answer 1: An important thing to realize about AMIs is that every AMI is…
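
The excerpt above is cut off, but the usual explanation is scope: describe_images() called without filters returns metadata for every public AMI in the region (a very large response), which is why the call can appear to hang; note also that describe_images is a client method, while the question creates a resource. A minimal sketch that restricts the query to the account's own images in the available state (Owners=['self'] is an assumption; adjust if shared or public images are wanted):

    import boto3

    ec2 = boto3.client('ec2')  # describe_images lives on the client, not the resource

    # Owners=['self'] limits the call to AMIs owned by this account; without it,
    # describe_images returns every public AMI in the region.
    response = ec2.describe_images(
        Owners=['self'],
        Filters=[{'Name': 'state', 'Values': ['available']}],
    )

    for img in response['Images']:
        print(img['ImageId'], img.get('Name'))
    print("img count ->" + str(len(response['Images'])))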

Accessing Meta Data from AWS S3 with AWS Lambda

牧云@^-^@ submitted on 2020-01-13 09:13:46
Question: I would like to retrieve some metadata I added (via the console, as x-amz-meta-my_variable) every time I upload an object to S3. I have set up a Lambda function through the console that triggers whenever an object is uploaded to my bucket. I am wondering if I can use something like

    variable = event['Records'][0]['s3']['object']['my_variable']

to retrieve this data, or if I have to connect back to S3 with the bucket and key and then call some function to retrieve it. Below is the code:

    from __future__ import …
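
The S3 event notification does not carry user-defined object metadata, so the handler has to go back to S3 with the bucket and key from the event and call head_object. A minimal sketch, assuming the metadata key my_variable from the question (S3 strips the x-amz-meta- prefix and lower-cases the key in the returned Metadata dict):

    import boto3

    s3 = boto3.client('s3')

    def lambda_handler(event, context):
        # The event record contains the bucket and key, but not user metadata.
        record = event['Records'][0]['s3']
        bucket = record['bucket']['name']
        key = record['object']['key']

        # head_object returns user-defined metadata under 'Metadata'.
        head = s3.head_object(Bucket=bucket, Key=key)
        my_variable = head['Metadata'].get('my_variable')
        print(my_variable)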

How to create and attach an ELB properly in Boto3

♀尐吖头ヾ submitted on 2020-01-12 11:40:10
Question: I'm new to Amazon's Boto3 API. I created a basic diagram of my sample architecture, with an ELB, 4 instances, 2 subnets, and 2 target groups in 2 different Availability Zones (2 instances in each target group). I know how to create an EC2 instance, a target group, subnets, and an ELB, but which ELB functions to use is not clear to me. How can I attach the ELB to the other components? Basically, how do I add instances to the ELB? I'm not sure what next steps and functions are needed…
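
The excerpt stops before any answer, but with an Application Load Balancer the wiring goes through the elbv2 client: instances are put into target groups with register_targets, and the load balancer is connected to a target group through a listener. A hedged sketch under those assumptions (all ARNs, instance IDs, and ports below are placeholders):

    import boto3

    elbv2 = boto3.client('elbv2')

    # 1. Register instances into a target group (repeat for the second group).
    elbv2.register_targets(
        TargetGroupArn='arn:aws:elasticloadbalancing:...:targetgroup/my-tg-1/abc123',
        Targets=[{'Id': 'i-0123456789abcdef0'}, {'Id': 'i-0fedcba9876543210'}],
    )

    # 2. Attach the target group to the load balancer via a listener.
    elbv2.create_listener(
        LoadBalancerArn='arn:aws:elasticloadbalancing:...:loadbalancer/app/my-elb/def456',
        Protocol='HTTP',
        Port=80,
        DefaultActions=[{
            'Type': 'forward',
            'TargetGroupArn': 'arn:aws:elasticloadbalancing:...:targetgroup/my-tg-1/abc123',
        }],
    )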

Is there a Python API for submitting batch get requests to AWS DynamoDB?

回眸只為那壹抹淺笑 submitted on 2020-01-12 03:25:08
Question: The package boto3, Amazon's official AWS API wrapper for Python, has great support for uploading items to DynamoDB in bulk. It looks like this:

    import boto3

    db = boto3.resource("dynamodb", region_name="my_region").Table("my_table")
    with db.batch_writer() as batch:
        for item in my_items:
            batch.put_item(Item=item)

Here my_items is a list of Python dictionaries, each of which must have the table's primary key(s). The situation isn't perfect; for instance, there is no safety mechanism to prevent you from…
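
There is a read-side counterpart: batch_get_item, exposed on both the low-level client and the service resource. A minimal sketch against the same hypothetical table, including the retry loop for UnprocessedKeys that the API leaves to the caller (each call accepts at most 100 keys):

    import boto3

    dynamodb = boto3.resource("dynamodb", region_name="my_region")

    def batch_get(keys):
        # Fetch items for the given key dicts, retrying anything left unprocessed.
        items = []
        request = {"my_table": {"Keys": keys}}
        while request:
            response = dynamodb.batch_get_item(RequestItems=request)
            items.extend(response["Responses"].get("my_table", []))
            # Throttled or oversized requests come back here and must be re-sent.
            request = response.get("UnprocessedKeys") or {}
        return items

    items = batch_get([{"id": "item-1"}, {"id": "item-2"}])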

Changes when using boto3 for connections to AWS services

╄→尐↘猪︶ㄣ submitted on 2020-01-11 14:10:11
Question: What changes have to be made to a function that was written against boto2, and how should it be converted to boto3? Below is one such function, written with boto2, that needs to be changed to boto3:

    def aws(serviceName, module=boto):
        conn = connections.get(serviceName)
        if conn is None:
            service = getattr(module, serviceName)
            conn = service.connect_to_region(region)
            connections[serviceName] = conn
        return conn

Answer 1: That code doesn't seem to be doing much. It is simply connecting to an…
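
The boto2 pattern of picking a service module and calling connect_to_region() collapses in boto3 into a single boto3.client(service_name, region_name=...) factory call. A sketch of the same caching helper ported to boto3 (the connections dict and region variable mirror the question's globals and are assumptions about the surrounding code):

    import boto3

    connections = {}
    region = "us-east-1"  # placeholder; the original snippet reads this from a global

    def aws(serviceName):
        # Return a cached boto3 client for the service, creating it on first use.
        conn = connections.get(serviceName)
        if conn is None:
            # boto3 replaces per-service connect_to_region() with one uniform factory.
            conn = boto3.client(serviceName, region_name=region)
            connections[serviceName] = conn
        return conn

    ec2 = aws("ec2")  # likewise aws("s3"), aws("dynamodb"), ...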

Shutdown EC2 instances that do not have a certain tag using Python

半世苍凉 submitted on 2020-01-11 11:36:02
Question: I'm using this script by mlapida, posted here: https://gist.github.com/mlapida/1917b5db84b76b1d1d55#file-ec2-stopped-tagged-lambda-py The script by mlapida does the opposite of what I need, and I'm not familiar enough with Python to know how to restructure it to make this work. I need to shut down all EC2 instances that do not have a special tag identifying them. The logic would be (sketched below):

1.) Identify all running instances.
2.) Strip out any instances from that list that have the special tag.
3.) Process the…
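
A minimal sketch of that logic, assuming the special tag is a key named KeepAlive (the tag name is an assumption; the truncated step 3 presumably stops whatever remains):

    import boto3

    ec2 = boto3.client('ec2')
    KEEP_TAG = 'KeepAlive'  # assumed name for the special tag

    def lambda_handler(event, context):
        # 1.) Identify all running instances.
        reservations = ec2.describe_instances(
            Filters=[{'Name': 'instance-state-name', 'Values': ['running']}]
        )['Reservations']

        # 2.) Strip out instances that carry the special tag.
        to_stop = []
        for reservation in reservations:
            for instance in reservation['Instances']:
                tags = {t['Key'] for t in instance.get('Tags', [])}
                if KEEP_TAG not in tags:
                    to_stop.append(instance['InstanceId'])

        # 3.) Stop everything that remains.
        if to_stop:
            ec2.stop_instances(InstanceIds=to_stop)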

How to download Amazon S3 files into a local folder using Python and boto3?

半城伤御伤魂 submitted on 2020-01-07 08:25:12
Question: I am trying to download a file from Amazon S3 to a predefined folder on the local machine. This is the code, and it works fine, but when the file is saved, it is saved under only the last component of its path. How should I correct this?

    import boto3
    import os

    S3_Object = boto3.client('s3', aws_access_key_id='##', aws_secret_access_key='##')
    BUCKET_NAME = '##'
    filename2 = []
    Key2 = []
    bucket = S3_Object.list_objects(Bucket=BUCKET_NAME)['Contents']
    download_path = target_file_path = os.path.join('..', 'data', 'lz…
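
The snippet is cut off, but the symptom (files landing under just the last part of the key) usually means the local target path is built from the key's basename alone. A sketch that mirrors each key's folder structure under a local root before calling download_file (the bucket name and local root are placeholders):

    import os
    import boto3

    s3 = boto3.client('s3')
    BUCKET_NAME = 'my-bucket'                 # placeholder
    LOCAL_ROOT = os.path.join('..', 'data')   # placeholder

    for obj in s3.list_objects(Bucket=BUCKET_NAME)['Contents']:
        key = obj['Key']
        # Recreate the key's folder structure locally instead of collapsing
        # everything down to the key's basename.
        target = os.path.join(LOCAL_ROOT, *key.split('/'))
        os.makedirs(os.path.dirname(target), exist_ok=True)
        if not key.endswith('/'):             # skip "folder" placeholder keys
            s3.download_file(BUCKET_NAME, key, target)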

Amazon AWS Kinesis Video Boto GetMedia/PutMedia

亡梦爱人 submitted on 2020-01-06 05:38:25
Question: Does anybody know of a complete sample showing how to send video to a Kinesis video stream using the boto3 SDK? This question was initially asked for both GetMedia and PutMedia. I now have this sample code for the GetMedia part:

    client = boto3.client('kinesisvideo')
    response = client.get_data_endpoint(
        StreamName='my-test-stream',
        APIName='GET_MEDIA'
    )
    print(response)
    endpoint = response.get('DataEndpoint', None)
    print("endpoint %s" % endpoint)
    if endpoint is not None:
        client2 = boto3…
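
The snippet cuts off at the second client, but the GetMedia side can plausibly be completed with the kinesis-video-media client pointed at the returned endpoint, as sketched below (stream name reused from the question; the output file is a placeholder). Note that PutMedia has no boto3 operation, so the sending side is typically handled with the Kinesis Video Streams producer SDK instead:

    import boto3

    client = boto3.client('kinesisvideo')
    endpoint = client.get_data_endpoint(
        StreamName='my-test-stream',
        APIName='GET_MEDIA'
    )['DataEndpoint']

    # The media client must be built against the stream-specific endpoint.
    media = boto3.client('kinesis-video-media', endpoint_url=endpoint)
    response = media.get_media(
        StreamName='my-test-stream',
        StartSelector={'StartSelectorType': 'NOW'},
    )

    # Payload is a streaming MKV container; read it in chunks.
    with open('stream.mkv', 'wb') as f:
        for chunk in iter(lambda: response['Payload'].read(8192), b''):
            f.write(chunk)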

“errorMessage”: “string argument without an encoding”, [duplicate]

强颜欢笑 submitted on 2020-01-06 05:26:08
Question: This question already has answers here: TypeError: string argument without an encoding (3 answers). Closed yesterday. I'm trying to save a password string, encrypted, in DynamoDB, and I get this error. Response:

    {
      "errorMessage": "string argument without an encoding",
      "errorType": "TypeError",
      "stackTrace": [
        " File \"/var/task/lambda_function.py\", line 25, in lambda_handler\n encrypted_password = encrypt(session, plain_text_password, key_alias)\n",
        " File \"/var/task/lambda_function.py\", line 11,…
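
In Python 3, bytes(some_string) raises exactly this TypeError unless an encoding is supplied, and KMS's encrypt call expects bytes for Plaintext. A hedged sketch of the likely fix inside the question's encrypt helper (the function body is inferred from the stack trace, not shown in the excerpt):

    import boto3

    def encrypt(session, plain_text_password, key_alias):
        kms = session.client('kms')
        # bytes(plain_text_password) fails in Python 3 with
        # "string argument without an encoding"; encode explicitly instead.
        response = kms.encrypt(
            KeyId=key_alias,  # e.g. 'alias/my-key'
            Plaintext=plain_text_password.encode('utf-8'),
        )
        return response['CiphertextBlob']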

Unable To Parse .csv in Python

雨燕双飞 submitted on 2020-01-06 04:54:25
Question: I am doing a lab on the website LinuxAcademy.com. The course name is Automating AWS with Lambda, Python, and Boto3, and the specific lab I am having trouble with is the lecture Importing CSV Files into DynamoDB. In this lab we upload a .csv file to S3; an S3 event is generated in the specified bucket, which then kicks off the Lambda function shown below. The function parses the .csv and uploads the contents to DynamoDB. I was originally having issues with line 23: items = read_csv(download…
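
The lab's function itself is cut off, but its shape (S3 event, download to /tmp, parse, batch-write) can be sketched as below. The table name and download path are assumptions, not the lab's actual values, and read_csv here is a guess at what the truncated line 23 calls:

    import csv
    import boto3

    s3 = boto3.client('s3')
    table = boto3.resource('dynamodb').Table('my_table')  # assumed table name

    def read_csv(path):
        # Parse the downloaded CSV into a list of dicts keyed by the header row.
        with open(path, newline='') as f:
            return list(csv.DictReader(f))

    def lambda_handler(event, context):
        record = event['Records'][0]['s3']
        bucket = record['bucket']['name']
        key = record['object']['key']

        download_path = '/tmp/' + key.split('/')[-1]  # Lambda can only write under /tmp
        s3.download_file(bucket, key, download_path)

        items = read_csv(download_path)
        with table.batch_writer() as batch:
            for item in items:
                batch.put_item(Item=item)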