amazon-s3

Is there a way to trigger a Lambda only after multiple files have been uploaded to S3?

半城伤御伤魂 submitted on 2020-05-24 07:54:10
Question: A user uploads multiple files into my S3 bucket, with the current day as the prefix for all the files. I need to trigger a Lambda function only after I have received all the files under that prefix. How can I do that?

Answer 1: Create a DynamoDB table to keep track of the uploaded parts. You should use a HASH key to store the prefix of the files, or something like that. Another attribute could be a count of parts. On each part uploaded, a Lambda will be called and it will update the record at the table

I would like to export a DynamoDB table to an S3 bucket in CSV format using Python (Boto3)

风流意气都作罢 submitted on 2020-05-24 05:00:30
Question: This question has been asked earlier in the following link: How to write dynamodb scan data's in CSV and upload to s3 bucket using python? I have amended the code as advised in the comments. The code looks as follows:

    import csv
    import boto3
    import json

    dynamodb = boto3.resource('dynamodb')
    db = dynamodb.Table('employee_details')

    def lambda_handler(event, context):
        AWS_BUCKET_NAME = 'session5cloudfront'
        s3 = boto3.resource('s3')
        bucket = s3.Bucket(AWS_BUCKET_NAME)
        path = '/tmp/' +

Is aws:SourceVpc condition key present in the request context when interacting with S3 over web console?

巧了我就是萌 submitted on 2020-05-24 03:58:08
Question: I have a bucket policy (listed below) that is supposed to prevent access to an S3 bucket when it is accessed from anywhere other than a specific VPC. I launched an EC2 instance in the VPC, tested, and confirmed that S3 access works fine. Now, when I access the same S3 bucket over the web console, I get an 'Error - Access Denied' message. Does this mean that the aws:SourceVpc condition key is present in the request context when interacting with S3 over the web console as well? My assumption is that it is present in
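The policy in the question is not shown; one with the described effect might look like the sketch below (bucket name and VPC id are placeholders). Note that aws:SourceVpc is only populated for requests that arrive through a VPC endpoint; for console traffic the key is absent, and negated operators like StringNotEquals evaluate to true when the key is missing, so the Deny fires -- which would explain the Access Denied.

```python
import json

# Placeholder bucket and VPC id. A Deny on an aws:SourceVpc mismatch blocks
# every request that does not come through the VPC -- including web-console
# requests, which travel over the public AWS endpoints.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyOutsideVpc",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": ["arn:aws:s3:::my-bucket", "arn:aws:s3:::my-bucket/*"],
        "Condition": {"StringNotEquals": {"aws:SourceVpc": "vpc-0123456789"}},
    }],
}

policy_json = json.dumps(policy)
# boto3.client("s3").put_bucket_policy(Bucket="my-bucket", Policy=policy_json)
```

Using StringNotEqualsIfExists instead would skip the condition when the key is absent, which is one way to keep console access working.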

AWS CLI: aws s3 sync between two different S3 providers

£可爱£侵袭症+ submitted on 2020-05-23 23:51:53
Question: Does anybody have a solution to sync a bucket between two different S3 providers? For example, one is Amazon S3 and the second is Wasabi S3. That involves two different endpoints and two different sets of credentials. Preferably without storing the data locally first; we are talking about 1+ million files.

Answer 1: This would not be possible. In Amazon S3, it is possible to copy directly between two buckets, even in different regions, because the S3 services in each region communicate with each other. This wouldn't be

How to fix image upload to S3 using Laravel

一曲冷凌霜 submitted on 2020-05-22 10:00:53
Question: I try to upload an image to S3 using Laravel but I receive a runtime error. Using Laravel 5.8, PHP 7, and a REST API tested with Postman, I send a base64-encoded image in the request body; I must upload it to S3 and get the resulting URL.

    public function store(Request $request)
    {
        $s3Client = new S3Client([
            'region' => 'us-east-2',
            'version' => 'latest',
            'credentials' => [
                'key' => $key,
                'secret' => $secret
            ]
        ]);
        $base64_str = substr($input['base64'], strpos($input['base64'], ",") + 1);
        $image = base64

Python boto3 multipart upload video to aws s3

感情迁移 submitted on 2020-05-17 08:49:45
Question: I want to upload a video in multiple parts. I have a function that reads a video and does some processing on each frame; now, instead of writing the result to local disk, I want to send it to AWS S3. I have figured out a way to upload in parts using the boto3 client:

    import cv2
    import boto3

    multipart_upload = MultipartUpload(<AWS_BUCKET_NAME>, <AWS_KEY>)
    multipart_upload.open()
    video_path = "video.mp4"
    cap = cv2.VideoCapture(video_path)
    ok = cap.isOpened()
    while ok:
        ok, frame = cap.read()
        # do some processing on the

Amazon S3 Signed Url is not working with Office Web Apps Viewer (encodeURIComponent not working)

筅森魡賤 submitted on 2020-05-17 07:42:07
Question: I am trying to embed the 'Office Web Apps Viewer' with an iframe tag to show a spreadsheet preview on my website. I tried encoding the URL with encodeURIComponent, but it shows the "we are fetching your file" loading bar and then nothing happens. Thanks in advance.

    const originalUrl = "https://exampleDomain.amazonaws.com/Folder/Filename.xlsx?algorithm=algorithmName&credential=region&date=date&expires=time&token=encryptedToken&signature=encryptedSignature&headers=example"
    const encodedUrl =
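The encoding step on its own can be demonstrated with Python's urllib.parse.quote, which with safe="" behaves roughly like encodeURIComponent for the characters that matter here. The viewer endpoint below is the public Office embed URL; whether encoding is the actual cause of the hang cannot be confirmed from the snippet.

```python
from urllib.parse import quote

VIEWER = "https://view.officeapps.live.com/op/embed.aspx?src="

def viewer_url(signed_url):
    """Percent-encode the ENTIRE presigned URL -- including '&' and '=' --
    so the signature query string reaches the viewer intact rather than
    being parsed as the viewer's own parameters."""
    return VIEWER + quote(signed_url, safe="")
```

Even with correct encoding, a presigned URL that expires before the viewer fetches the file produces the same endless "fetching" behavior, so the expiry window is worth checking too.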

How do I get images uploaded to S3 and get the link URL as a response?

坚强是说给别人听的谎言 submitted on 2020-05-17 06:28:08
Question: Please, I am looking for real help. I followed the 'Image s3 upload | Node.JS | WYSIWYG Javascript HTML Editor | Froala' tutorial to be able to upload images to my S3 bucket, but all my efforts proved abortive. I could upload the images locally on my computer and get the link URL to display them right in the editor; I can't do this with S3. I am using Express (Node.js) and the Pug template engine. The following is my Froala script file that initiates the s3Hash and the editor:

    $.get('/s3/posts-photos' , {})