boto3

Ungzipping chunks of bytes from S3 using iter_chunks()

夙愿已清 submitted on 2020-06-13 05:00:30

Question: I am encountering issues ungzipping chunks of bytes that I am reading from S3 using the iter_chunks() method from boto3. The strategy of ungzipping the file chunk-by-chunk originates from this issue. The code is as follows:

dec = zlib.decompressobj(32 + zlib.MAX_WBITS)
for chunk in app.s3_client.get_object(Bucket=bucket, Key=key)["Body"].iter_chunks(2 ** 19):
    data = dec.decompress(chunk)
    print(len(chunk), len(data))
    # 524288 65505
    # 524288 0
    # 524288 0
    # ...

This code initially prints out
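For reference, a minimal sketch of how a chunk-by-chunk gzip decompressor over iter_chunks() can be structured (the function name and the handling of concatenated gzip members are my own assumptions, not taken from the question; multiple gzip members are one situation in which decompress() starts returning empty byte strings):

import zlib

import boto3

def iter_decompressed(bucket, key, chunk_size=2 ** 19):
    # Stream a gzipped S3 object and yield decompressed pieces as they arrive.
    body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"]
    dec = zlib.decompressobj(32 + zlib.MAX_WBITS)  # 32 + MAX_WBITS auto-detects the gzip header
    for chunk in body.iter_chunks(chunk_size):
        data = dec.decompress(chunk)
        if data:
            yield data
        # If a gzip member ended inside this chunk, hand the leftover bytes to a
        # fresh decompressor so concatenated members keep decompressing.
        while dec.eof and dec.unused_data:
            leftover = dec.unused_data
            dec = zlib.decompressobj(32 + zlib.MAX_WBITS)
            data = dec.decompress(leftover)
            if data:
                yield data
    tail = dec.flush()
    if tail:
        yield tail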

How to mock AWS DynamoDB service?

随声附和 submitted on 2020-06-12 03:39:35

Question: My service uses AWS DynamoDB as a dependency. I want to write unit tests, but I don't know how to mock the DynamoDB service. Could anybody help me with that?

Answer 1: You can use the moto Python library to mock AWS DynamoDB: https://github.com/spulec/moto. moto uses a simple system based on Python decorators describing the AWS services. Here is an example:

import unittest
import boto3
from moto import mock_dynamodb2

class TestDynamo(unittest.TestCase):
    def setUp(self):
        pass

    @mock_dynamodb2
    def test
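To round that out, a self-contained sketch of what a moto-backed test can look like (the table name, key schema, and region are illustrative; newer moto releases rename the decorator to mock_dynamodb and, later, mock_aws):

import unittest

import boto3
from moto import mock_dynamodb2

class TestDynamo(unittest.TestCase):
    @mock_dynamodb2
    def test_put_and_get_item(self):
        # Inside the decorator every boto3 call hits moto's in-memory backend.
        dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
        table = dynamodb.create_table(
            TableName="users",
            KeySchema=[{"AttributeName": "user_id", "KeyType": "HASH"}],
            AttributeDefinitions=[{"AttributeName": "user_id", "AttributeType": "S"}],
            ProvisionedThroughput={"ReadCapacityUnits": 1, "WriteCapacityUnits": 1},
        )
        table.put_item(Item={"user_id": "42", "name": "alice"})
        item = table.get_item(Key={"user_id": "42"})["Item"]
        self.assertEqual(item["name"], "alice")

if __name__ == "__main__":
    unittest.main()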

DEVICE_PASSWORD_VERIFIER challenge response in Amazon Cognito using boto3 and warrant

浪子不回头ぞ submitted on 2020-05-27 13:09:15

Question: I'm using both the boto3 and warrant libraries to try to get a device authenticated so that it can skip multi-factor authentication once it has been recognized. I've gotten through user/password auth but can't seem to figure out the right way to authenticate the device. The following is my code:

from warrant import aws_srp
from warrant.aws_srp import AWSSRP
import boto3
client = boto3.client('cognito-idp')
import datetime
username = 'xxx'
password = 'xxx'
client_id = 'xxx'
aws = AWSSRP(username=username,
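For context, a sketch of the user/password SRP leg with warrant's AWSSRP helper (pool_id and region are placeholders; this is the part the question says already works). The device steps that follow — a DEVICE_SRP_AUTH challenge and then the DEVICE_PASSWORD_VERIFIER response — go through client.respond_to_auth_challenge, and producing that device signature is exactly what the question is asking about:

import boto3
from warrant.aws_srp import AWSSRP

client = boto3.client('cognito-idp', region_name='us-east-1')
aws = AWSSRP(
    username='xxx',
    password='xxx',
    pool_id='us-east-1_xxxxxxxxx',  # placeholder user pool id
    client_id='xxx',
    client=client,
)
# Runs USER_SRP_AUTH and answers the PASSWORD_VERIFIER challenge.
tokens = aws.authenticate_user()
access_token = tokens['AuthenticationResult']['AccessToken']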

I would like to export DynamoDB Table to S3 bucket in CSV format using Python (Boto3)

风流意气都作罢 submitted on 2020-05-24 05:00:30

Question: This question has been asked earlier in the following link: How to write dynamodb scan data's in CSV and upload to s3 bucket using python? I have amended the code as advised in the comments. The code now looks as follows:

import csv
import boto3
import json

dynamodb = boto3.resource('dynamodb')
db = dynamodb.Table('employee_details')

def lambda_handler(event, context):
    AWS_BUCKET_NAME = 'session5cloudfront'
    s3 = boto3.resource('s3')
    bucket = s3.Bucket(AWS_BUCKET_NAME)
    path = '/tmp/' +
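For reference, a rough sketch of the overall shape such a handler can take (the table and bucket names are reused from the snippet above; the pagination loop and the CSV header handling are my own additions, not the asker's code):

import csv

import boto3

dynamodb = boto3.resource('dynamodb')
s3 = boto3.resource('s3')

def lambda_handler(event, context):
    table = dynamodb.Table('employee_details')
    local_path = '/tmp/employee_details.csv'

    # Scan the whole table, following LastEvaluatedKey pagination.
    response = table.scan()
    items = response['Items']
    while 'LastEvaluatedKey' in response:
        response = table.scan(ExclusiveStartKey=response['LastEvaluatedKey'])
        items.extend(response['Items'])

    # Use the union of all attribute names as the CSV header.
    fieldnames = sorted({name for item in items for name in item})
    with open(local_path, 'w', newline='') as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(items)

    # Upload the temporary file to the bucket.
    s3.Bucket('session5cloudfront').upload_file(local_path, 'employee_details.csv')
    return {'statusCode': 200, 'body': 'exported %d items' % len(items)}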

Python boto3 multipart upload video to aws s3

感情迁移 submitted on 2020-05-17 08:49:45

Question: I want to upload a video in multiple parts. I have a function that reads a video and does some processing on each frame; now, instead of writing the result to local disk, I want to send it to AWS S3. I have figured out a way to upload in parts using the boto3 client:

import cv2
import boto3

multipart_upload = MultipartUpload(<AWS_BUCKET_NAME>, <AWS_KEY>)
multipart_upload.open()
video_path = "video.mp4"
cap = cv2.VideoCapture(video_path)
ok = cap.isOpened()
while ok:
    ok, frame = cap.read()
    # do some processing on the
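MultipartUpload above is the asker's own helper class. For reference, a sketch of the low-level multipart calls boto3 exposes (bucket and key are placeholders, and turning the processed frames into video bytes is left out): every part except the last must be at least 5 MB, and the ETag of each part is collected and handed to complete_multipart_upload at the end.

import io

import boto3

PART_SIZE = 5 * 1024 * 1024  # minimum size for every part except the last

def multipart_upload_stream(chunks, bucket='my-bucket', key='video.mp4'):
    # Upload an iterable of byte strings to S3 as a single multipart object.
    s3 = boto3.client('s3')
    upload_id = s3.create_multipart_upload(Bucket=bucket, Key=key)['UploadId']
    parts, buffer, part_number = [], io.BytesIO(), 1

    def flush():
        nonlocal buffer, part_number
        resp = s3.upload_part(
            Bucket=bucket, Key=key, UploadId=upload_id,
            PartNumber=part_number, Body=buffer.getvalue(),
        )
        parts.append({'ETag': resp['ETag'], 'PartNumber': part_number})
        part_number += 1
        buffer = io.BytesIO()

    try:
        for chunk in chunks:
            buffer.write(chunk)
            if buffer.tell() >= PART_SIZE:
                flush()
        if buffer.tell():  # final, possibly smaller, part
            flush()
        s3.complete_multipart_upload(
            Bucket=bucket, Key=key, UploadId=upload_id,
            MultipartUpload={'Parts': parts},
        )
    except Exception:
        s3.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=upload_id)
        raise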

Mock: Client does not have the attribute 'get_object'

非 Y 不嫁゛ submitted on 2020-05-17 06:27:07

Question: I am trying to patch the S3 get_object method from the boto3 module, but I keep getting the following error:

AttributeError: <function client at 0x104570200> does not have the attribute 'get_object'

This is baffling because I am able to successfully patch boto3.client but not boto3.client.get_object, even though the boto3 documentation states that it is one of the methods of the client. Here is my code:

import boto3
from mock import patch

@pytest.mark.parametrize(
    'response, expected',
    [
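The attribute error arises because boto3.client is only a factory function; get_object is generated at runtime on the client instance it returns, so there is nothing named get_object on boto3.client to patch. One alternative, sketched here with assumed names, is to build a real client and stub it with botocore's Stubber:

import io

import boto3
from botocore.response import StreamingBody
from botocore.stub import Stubber

def read_object(s3_client, bucket, key):
    # Code under test: return the object's bytes.
    return s3_client.get_object(Bucket=bucket, Key=key)['Body'].read()

def test_read_object():
    s3 = boto3.client('s3', region_name='us-east-1')
    stubber = Stubber(s3)
    payload = b'hello'
    stubber.add_response(
        'get_object',
        {'Body': StreamingBody(io.BytesIO(payload), len(payload))},
        {'Bucket': 'my-bucket', 'Key': 'my-key'},
    )
    with stubber:
        assert read_object(s3, 'my-bucket', 'my-key') == payload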

How to use asyncio to download files from an S3 bucket

ぐ巨炮叔叔 submitted on 2020-05-16 12:57:48

Question: I'm using the following code to download all of the files in an S3 bucket:

def main(bucket_name, destination_dir):
    bucket = boto3.resource('s3').Bucket(bucket_name)
    for obj in bucket.objects.all():
        if obj.key.endswith('/'):
            continue
        destination = '%s/%s' % (bucket_name, obj.key)
        if not os.path.exists(destination):
            os.makedirs(os.path.dirname(destination), exist_ok=True)
        bucket.download_file(obj.key, destination)

I would like to know how to make this asynchronous, if possible. Thank you in advance.
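boto3 has no native asyncio support, so one common approach (sketched below with assumed names; aioboto3 is another option entirely) is to keep the blocking download_file calls but fan them out to worker threads with asyncio.to_thread (Python 3.9+), bounded by a semaphore:

import asyncio
import os

import boto3

async def download_all(bucket_name, destination_dir, max_concurrency=8):
    s3 = boto3.client('s3')
    semaphore = asyncio.Semaphore(max_concurrency)

    async def download_one(key):
        destination = os.path.join(destination_dir, key)
        os.makedirs(os.path.dirname(destination), exist_ok=True)
        async with semaphore:
            # download_file blocks, so run it on a worker thread.
            await asyncio.to_thread(s3.download_file, bucket_name, key, destination)

    # Collect every key in the bucket, skipping "directory" placeholder keys.
    keys = []
    for page in s3.get_paginator('list_objects_v2').paginate(Bucket=bucket_name):
        for obj in page.get('Contents', []):
            if not obj['Key'].endswith('/'):
                keys.append(obj['Key'])

    await asyncio.gather(*(download_one(key) for key in keys))

# asyncio.run(download_all('my-bucket', 'downloads'))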

Boto3: use 'NOT IN' for Scan in DynamoDB

拟墨画扇 submitted on 2020-05-15 21:45:50

Question: I've managed to write a filter expression for filtering items in a Scan. Something like:

users = [1, 2, 3]
table.scan(
    FilterExpression=Attr('user_id').is_in(users)
)

Can I somehow convert it from filtering to excluding, so that I get all users except those with ids 1, 2, 3?

Answer 1: The only way I have found so far is to use boto3.client instead of the table resource and the low-level syntax. Something like this:

lst_elements = ''
attr_elements = {}
for id in user_ids:
    lst_element += 'user' + str(id)
    attr_elements['user' +