boto3

Saving an image to bytes and uploading via boto3 returns a Content-MD5 mismatch

怎甘沉沦 submitted on 2019-12-22 01:46:02
Question: I'm trying to pull an image from S3, quantize/manipulate it, and then store it back into S3 without saving anything to disk (entirely in memory). I was able to do it once, but on returning to the code and trying it again it did not work. The code is as follows:

    import boto3
    import io
    from PIL import Image

    client = boto3.client('s3', aws_access_key_id='', aws_secret_access_key='')
    cur_image = client.get_object(Bucket='mybucket', Key='2016-03-19 19.15.40.jpg')['Body'].read()
    loaded_image =
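A minimal in-memory round trip, assuming placeholder bucket/key names and a quantize step standing in for whatever manipulation is done, is sketched below; one common cause of a Content-MD5 mismatch (or an empty upload) is forgetting to rewind the buffer before uploading it.

    import io
    import boto3
    from PIL import Image

    client = boto3.client('s3')

    # Download the object into memory
    body = client.get_object(Bucket='mybucket', Key='input.jpg')['Body'].read()
    image = Image.open(io.BytesIO(body))

    # Manipulate the image entirely in memory (quantize is just an example)
    processed = image.quantize(64).convert('RGB')

    # Serialize back to bytes and rewind before uploading
    buffer = io.BytesIO()
    processed.save(buffer, format='JPEG')
    buffer.seek(0)  # without this, put_object may see an empty/partial body
    client.put_object(Bucket='mybucket', Key='output.jpg', Body=buffer)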

Where is the API documentation for boto3 resources?

≡放荡痞女 submitted on 2019-12-21 17:22:55
Question: I have learned that boto3 offers two levels of abstraction: a low-level API called client, which is a thin wrapper around the AWS HTTP API, and a high-level API called resource, which offers real Python objects. My question is: where is the API documentation for the resource API? I found this: https://boto3.readthedocs.io/en/stable/reference/services/ec2.html#client But that describes the client API, and there is no 1-to-1 mapping to the resource API. For example, enumerating instances is
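For comparison, a minimal sketch of enumerating instances through both layers (the filter is illustrative); in the readthedocs pages, the resource classes are typically documented further down the same service page, under the "Service Resource" section rather than the "Client" section.

    import boto3

    # Client API: returns nested dictionaries that you unpack yourself
    client = boto3.client('ec2')
    reservations = client.describe_instances()['Reservations']

    # Resource API: yields Instance objects with attributes and actions
    ec2 = boto3.resource('ec2')
    for instance in ec2.instances.filter(
            Filters=[{'Name': 'instance-state-name', 'Values': ['running']}]):
        print(instance.id, instance.instance_type)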

Boto3 EMR - Hive step

百般思念 submitted on 2019-12-21 12:46:36
Question: Is it possible to carry out Hive steps using boto3? I have been doing so using the AWS CLI, but from the docs (http://boto3.readthedocs.org/en/latest/reference/services/emr.html#EMR.Client.add_job_flow_steps) it seems like only JARs are accepted. If Hive steps are possible, where are the resources? Thanks

Answer 1: I was able to get this to work using boto3:

    # First create your hive command line arguments
    hive_args = "hive -v -f s3://user/hadoop/hive.hql"

    # Split the hive args to a list
    hive_args = hive_args.split()
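A plausible completion of that approach, assuming an existing cluster (the JobFlowId and script path below are placeholders) and an EMR release that ships command-runner.jar, passes the split command as the step's Args:

    import boto3

    emr = boto3.client('emr')

    # The Hive invocation as a single command line, split into an argument list
    hive_args = "hive -v -f s3://user/hadoop/hive.hql".split()

    response = emr.add_job_flow_steps(
        JobFlowId='j-XXXXXXXXXXXX',           # placeholder cluster id
        Steps=[{
            'Name': 'Run Hive script',
            'ActionOnFailure': 'CONTINUE',
            'HadoopJarStep': {
                'Jar': 'command-runner.jar',  # available on EMR 4.x and later
                'Args': hive_args,
            },
        }],
    )
    print(response['StepIds'])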

AWS: Boto3: AssumeRole example which includes role usage

ぐ巨炮叔叔 submitted on 2019-12-21 07:37:39
Question: I'm trying to use AssumeRole in such a way that I'm traversing multiple accounts and retrieving assets for those accounts. I've made it to this point:

    import boto3

    sts_client = boto3.client('sts')
    assumedRoleObject = sts_client.assume_role(
        RoleArn="arn:aws:iam::account-of-role-to-assume:role/name-of-role",
        RoleSessionName="AssumeRoleSession1")

Great, I have the assumedRoleObject. But now I want to use that to list things like ELBs or something that isn't a built-in low-level resource. How
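The usual pattern is to feed the temporary credentials from the assume_role response into a new Session, and then create clients or resources from that session; a minimal sketch, with the role ARN as a placeholder:

    import boto3

    sts_client = boto3.client('sts')
    assumed = sts_client.assume_role(
        RoleArn="arn:aws:iam::123456789012:role/name-of-role",   # placeholder
        RoleSessionName="AssumeRoleSession1")
    creds = assumed['Credentials']

    # Requests made through this session are signed with the assumed role's temporary credentials
    session = boto3.Session(
        aws_access_key_id=creds['AccessKeyId'],
        aws_secret_access_key=creds['SecretAccessKey'],
        aws_session_token=creds['SessionToken'])

    # Any client or resource built from the session acts in the target account
    elb = session.client('elbv2')
    for lb in elb.describe_load_balancers()['LoadBalancers']:
        print(lb['LoadBalancerName'])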

Any way to write files DIRECTLY to S3 using boto3?

亡梦爱人 submitted on 2019-12-21 04:22:18
Question: I wrote a Python script to process very large files (a few TB in total), which I'll run on an EC2 instance. Afterwards, I want to store the processed files in an S3 bucket. Currently, my script first saves the data to disk and then uploads it to S3. Unfortunately, this will be quite costly given the extra time spent waiting for the instance to first write to disk and then upload. Is there any way to use boto3 to write files directly to an S3 bucket? Edit: to clarify my question, I'm asking if I
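Assuming the processed output can be produced in memory, one way to skip the intermediate file is to upload straight from an in-memory buffer; bucket and key names below are placeholders:

    import io
    import boto3

    s3 = boto3.client('s3')

    # Processed output kept in memory instead of being written to disk
    processed_bytes = b"example processed data"

    buffer = io.BytesIO(processed_bytes)
    s3.upload_fileobj(buffer, 'my-bucket', 'results/part-0001')

    # Small payloads can also be sent directly with put_object
    s3.put_object(Bucket='my-bucket', Key='results/part-0002', Body=processed_bytes)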

List directory contents of an S3 bucket using Python and Boto3?

与世无争的帅哥 submitted on 2019-12-21 03:33:28
Question: I am trying to list all directories within an S3 bucket using Python and Boto3. I am using the following code:

    s3 = session.resource('s3')  # I already have a boto3 Session object

    bucket_names = [
        'this/bucket/',
        'that/bucket/'
    ]

    for name in bucket_names:
        bucket = s3.Bucket(name)
        for obj in bucket.objects.all():  # this raises an exception
            # handle obj

When I run this I get the following exception stack trace:

    File "botolist.py", line 67, in <module>
      for obj in bucket.objects.all():
    File "
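If 'this/bucket/' and 'that/bucket/' are actually key prefixes inside one bucket (which the trailing slashes suggest, and which would explain the exception, since those are not valid bucket names), the "directories" are the common prefixes returned by the list call; a sketch with a placeholder bucket name:

    import boto3

    client = boto3.client('s3')
    paginator = client.get_paginator('list_objects_v2')

    # Delimiter='/' groups keys so each top-level "directory" appears once in CommonPrefixes
    for page in paginator.paginate(Bucket='my-bucket', Delimiter='/'):
        for prefix in page.get('CommonPrefixes', []):
            print(prefix['Prefix'])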

Update a nested map in DynamoDB

﹥>﹥吖頭↗ submitted on 2019-12-20 13:32:30
Question: I have a DynamoDB table with an attribute containing a nested map, and I would like to update a specific inventory item, selected via a filter expression that yields a single item from this map. How do I write an update expression that sets the location to "in place three" for the item with name=opel, whose tags include "x1" (and possibly also "f3")? This should update just the first list element's location attribute.

    {
        "inventory": [
            {
                "location": "in place one",   # I want to update this
                "name":
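An update expression cannot select a list element by the value of its fields, only by index, so a common workaround is to read the item, find the index of the entry whose name matches, and update that index explicitly; a sketch assuming a placeholder table name and key schema:

    import boto3

    table = boto3.resource('dynamodb').Table('my-table')   # placeholder table name
    key = {'id': 'some-id'}                                 # placeholder key

    # Find the index of the inventory entry to change
    item = table.get_item(Key=key)['Item']
    index = next(i for i, entry in enumerate(item['inventory'])
                 if entry['name'] == 'opel')

    # Update only that element's location attribute
    table.update_item(
        Key=key,
        UpdateExpression=f"SET inventory[{index}].#loc = :loc",
        ExpressionAttributeNames={'#loc': 'location'},
        ExpressionAttributeValues={':loc': 'in place three'})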

AWS: Publish SNS message for Lambda function via boto3 (Python2)

女生的网名这么多〃 submitted on 2019-12-20 10:02:47
Question: I am trying to publish to an SNS topic which will then notify a Lambda function as well as an SQS queue. My Lambda function does get called, but the CloudWatch logs state that my "event" object is None. The boto3 docs say to use the kwarg MessageStructure='json', but that throws a ClientError. Hopefully I've supplied enough information. Example code:

    import json
    import boto3

    message = {"foo": "bar"}

    client = boto3.client('sns')
    response = client.publish(
        TargetArn=arn,
        Message=json.dumps
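When MessageStructure='json' is used, Message itself must be a JSON string containing at least a "default" key, which is the usual cause of that ClientError; a sketch with the topic ARN as a placeholder:

    import json
    import boto3

    message = {"foo": "bar"}
    topic_arn = 'arn:aws:sns:us-east-1:123456789012:my-topic'   # placeholder
    client = boto3.client('sns')

    # Simplest form: every subscriber (Lambda, SQS, ...) receives the same JSON string
    client.publish(TargetArn=topic_arn, Message=json.dumps(message))

    # Per-protocol form: MessageStructure='json' requires a "default" entry
    client.publish(
        TargetArn=topic_arn,
        MessageStructure='json',
        Message=json.dumps({
            'default': json.dumps(message),
            'lambda': json.dumps(message)}))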

Accessing local SQS service from another docker container using environment variables

巧了我就是萌 submitted on 2019-12-20 07:10:58
Question: I have a Flask application which needs to interact with an SQS service whenever an endpoint is hit. I'm mimicking the SQS service locally using the docker image sukumarporeddy/sqs:fp, whose base image is https://github.com/vsouza/docker-SQS-local, with two more queues added in its configuration. I need to access this service from another app, which runs as app_service. These two services are run from a docker-compose.yml file in which I define the two services: app_service and sqs_service. While building the
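One way to wire this up is to pass the local endpoint to the Flask app through an environment variable in docker-compose.yml and hand it to boto3 as endpoint_url; the sketch below assumes the container is reachable as sqs_service on the compose network, listens on 9324 (the ElasticMQ default used by that base image), and the queue name is a placeholder:

    import os
    import boto3

    # SQS_ENDPOINT is assumed to be set in docker-compose.yml, e.g. http://sqs_service:9324
    sqs = boto3.client(
        'sqs',
        endpoint_url=os.environ.get('SQS_ENDPOINT', 'http://sqs_service:9324'),
        region_name='us-east-1',
        aws_access_key_id='dummy',        # the local SQS does not validate credentials
        aws_secret_access_key='dummy')

    queue_url = sqs.get_queue_url(QueueName='my-queue')['QueueUrl']
    sqs.send_message(QueueUrl=queue_url, MessageBody='hello from app_service')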

Add tag while creating EBS snapshot using boto3

时光总嘲笑我的痴心妄想 submitted on 2019-12-19 09:08:31
Question: Is it possible to add a tag when invoking the create_snapshot() method in boto3? When I run the following code:

    client = boto3.client('ec2')
    root_snap_resp = client.create_snapshot(
        Description='My snapshot description',
        VolumeId='vol-123456',
        Tags=[{'Key': 'Test_Key', 'Value': 'Test_Value'}]
    )

I get the following error:

    botocore.exceptions.ParamValidationError: Parameter validation failed:
    Unknown parameter in input: "Tags", must be one of: DryRun, VolumeId, Description

Is the only way to
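create_snapshot does not accept a Tags parameter; more recent boto3 releases accept TagSpecifications, and on older versions you can tag the snapshot immediately after creating it with create_tags. A sketch, with the volume ID as a placeholder:

    import boto3

    client = boto3.client('ec2')
    tags = [{'Key': 'Test_Key', 'Value': 'Test_Value'}]

    # Newer boto3 / EC2 API: tag at creation time via TagSpecifications
    snap = client.create_snapshot(
        VolumeId='vol-123456',                    # placeholder
        Description='My snapshot description',
        TagSpecifications=[{'ResourceType': 'snapshot', 'Tags': tags}])

    # Older boto3: create the snapshot first, then tag the returned snapshot id
    # client.create_tags(Resources=[snap['SnapshotId']], Tags=tags)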