boto3

Serve static files in Flask from private AWS S3 bucket

痞子三分冷 submitted on 2020-02-23 11:48:05
Question: I am developing a Flask app running on Heroku that allows users to upload images. The app has a page displaying the user's images in a table. For development purposes, I am saving the uploaded files to Heroku's ephemeral file system, and everything works fine: the images are correctly loaded and displayed (I am using the last method shown here, which relies on send_from_directory()). Now I have moved the storage to S3 and I am trying to adapt the code. I use boto3 to upload the files to the…
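A common way to serve images from a private bucket without proxying every file through Flask is to hand the browser short-lived presigned URLs. A minimal sketch, assuming a hypothetical bucket name and key layout:

    import boto3
    from flask import Flask, render_template_string

    app = Flask(__name__)
    s3 = boto3.client("s3")
    BUCKET = "my-private-bucket"  # hypothetical bucket name

    @app.route("/images/<path:key>")
    def show_image(key):
        # Generate a temporary signed GET URL for the private object;
        # the browser then fetches the image directly from S3.
        url = s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": BUCKET, "Key": key},
            ExpiresIn=3600,  # URL valid for one hour
        )
        return render_template_string('<img src="{{ url }}">', url=url)

This keeps the bucket private while avoiding the memory cost of streaming each image through the dyno.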

Accessing Oracle from AWS Lambda in Python

為{幸葍}努か submitted on 2020-02-23 07:51:10
Question: I am writing (hopefully) a simple AWS Lambda that will run an RDS Oracle SQL SELECT and email the results. So far I have been using the Lambda Management Console, but all the examples I've run across talk about making a Lambda deployment package. So my first question is: can I do this from the Lambda Management Console? My next question is what to import for the Oracle DB API. In all the examples I have seen, they download and build a package with pip, but that would then seem to imply…
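For what it's worth, the console's inline editor cannot include compiled drivers, so cx_Oracle and the Oracle Instant Client shared libraries generally have to ship in a deployment package or a Lambda layer. A minimal handler sketch, assuming those libraries are bundled alongside it (hostname, credentials, and query are placeholders):

    import cx_Oracle  # must be packaged with the Oracle Instant Client libraries

    def lambda_handler(event, context):
        # Connection details are hypothetical; in practice, read them from
        # environment variables or Secrets Manager.
        dsn = cx_Oracle.makedsn("mydb.example.com", 1521, service_name="ORCL")
        conn = cx_Oracle.connect(user="report_user", password="secret", dsn=dsn)
        try:
            with conn.cursor() as cur:
                cur.execute("SELECT owner, table_name FROM all_tables")
                rows = cur.fetchall()
        finally:
            conn.close()
        return {"row_count": len(rows)}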

How to check if a non-key attribute already exists in DynamoDB using ConditionExpression?

回眸只為那壹抹淺笑 submitted on 2020-02-21 11:08:43
Question: I want to insert into the users table only if userId, email, and username do not already exist (I want these to be unique). userId is the primary key (hash key, data type Number). username and email are non-key attributes (both strings). Here is what I tried:

    response = userTable.put_item(
        Item={
            'userId': userIdNext,
            'accType': 0,
            'username': usernameInput,
            'pwd': hashedPwd,
            'email': emailInput
        },
        ConditionExpression="(attribute_not_exists(userIdNext)) AND (NOT (contains (email, :v_email))) AND…
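One point worth spelling out: a ConditionExpression is evaluated only against the single item addressed by the key, so it can make userId unique but cannot check email or username across the whole table (that usually requires separate uniqueness-marker items written in a transaction). A sketch of the part that does work, with placeholder values standing in for the question's variables:

    import boto3
    from botocore.exceptions import ClientError

    userTable = boto3.resource("dynamodb").Table("users")
    userIdNext, usernameInput, hashedPwd, emailInput = 42, "alice", "not-a-real-hash", "a@example.com"

    try:
        userTable.put_item(
            Item={
                "userId": userIdNext,
                "accType": 0,
                "username": usernameInput,
                "pwd": hashedPwd,
                "email": emailInput,
            },
            # Refers to the attribute name, not the Python variable; fails
            # the write if an item with this userId already exists.
            ConditionExpression="attribute_not_exists(userId)",
        )
    except ClientError as e:
        if e.response["Error"]["Code"] == "ConditionalCheckFailedException":
            print("userId already taken")
        else:
            raise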

What is the difference between upload_file() and put_object() when uploading files to S3 using boto3?

孤街醉人 submitted on 2020-02-20 06:23:09
Question: I'm using boto3 and trying to upload files. It would be helpful if someone could explain the exact difference between the upload_file() and put_object() S3 methods in boto3. Is there any performance difference? Does either of them handle multipart uploads behind the scenes? What are the best use cases for each? Answer 1: The upload_file method is handled by the S3 Transfer Manager; this means that it will automatically handle multipart uploads behind the scenes for you, if necessary.
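To make the contrast concrete, a short sketch using a hypothetical bucket and file:

    import boto3

    s3 = boto3.client("s3")

    # upload_file: a managed transfer (retries, threads, automatic multipart
    # for large files); it takes a filename on disk and returns nothing.
    s3.upload_file(Filename="backup.tar.gz", Bucket="my-bucket", Key="backup.tar.gz")

    # put_object: a single PUT request (no multipart, 5 GB cap); Body can be
    # bytes or a file-like object, and the full response metadata comes back.
    with open("backup.tar.gz", "rb") as f:
        resp = s3.put_object(Bucket="my-bucket", Key="backup.tar.gz", Body=f)
    print(resp["ETag"])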

Boto3: upload file from base64 to S3

只谈情不闲聊 submitted on 2020-02-04 04:09:05
Question: How can I directly upload a base64-encoded file to S3 with boto3?

    object = s3.Object(BUCKET_NAME, email + "/" + save_name)
    object.put(Body=base64.b64decode(file))

I tried to upload the base64-encoded file like this, but the file ends up broken. Directly uploading the string without base64-decoding it doesn't work either. Is there anything similar to set_contents_from_string() from boto2? Answer 1: I just fixed the problem and found out that the way of uploading was correct, but the base64 string was…
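The usual culprit, and presumably where that truncated answer is heading, is a data-URI header such as "data:image/png;base64," left on the string, which must be stripped before decoding. A sketch under that assumption:

    import base64
    import boto3

    s3 = boto3.resource("s3")

    def upload_base64(bucket_name, key, data):
        # Browsers often send data URIs; drop the "data:...;base64," prefix
        # (if present) or the decoded bytes will be garbage.
        if "," in data:
            data = data.split(",", 1)[1]
        s3.Object(bucket_name, key).put(Body=base64.b64decode(data))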

Boto3 get only S3 buckets of specific region

安稳与你 submitted on 2020-02-03 10:11:46
Question: The following code unfortunately lists the buckets of all regions, not only those from "eu-west-1" as specified. How can I change that?

    import boto3

    s3 = boto3.client("s3", region_name="eu-west-1")
    for bucket in s3.list_buckets()["Buckets"]:
        bucket_name = bucket["Name"]
        print(bucket["Name"])

Answer 1: s3 = boto3.client("s3", region_name="eu-west-1") connects to the S3 API endpoint in eu-west-1. It doesn't limit the listing to eu-west-1 buckets. One solution is to query each bucket's location and filter. s3 = boto3…
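A sketch of that filtering approach, continuing where the answer is cut off:

    import boto3

    s3 = boto3.client("s3", region_name="eu-west-1")

    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        # LocationConstraint is None for us-east-1 buckets and the region
        # name otherwise, so a plain comparison works for eu-west-1.
        location = s3.get_bucket_location(Bucket=name)["LocationConstraint"]
        if location == "eu-west-1":
            print(name)

Note this still makes one get_bucket_location call per bucket, so it can be slow on accounts with many buckets.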

Is boto3.Bucket.upload_file blocking or non-blocking?

巧了我就是萌 submitted on 2020-02-03 05:17:05
Question: Is boto3.Bucket.upload_file blocking or non-blocking? I.e., if I were to run the following:

    bucket = session.Bucket(bucket_name)
    bucket.upload_file(Key=s3_key, Filename=source_path)
    os.remove(source_path)

do I have a race condition, depending on the size of the file? Or is the upload guaranteed to complete before the file is deleted? Answer 1: The current boto3 upload_file is blocking. As mootmoot said, you should definitely implement some error handling to be safe if you delete the file. Answer 2: The fact upload…
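A sketch of the suggested error handling, assuming a hypothetical bucket, so the local file survives a failed upload:

    import os

    import boto3
    from botocore.exceptions import ClientError

    bucket = boto3.resource("s3").Bucket("my-bucket")  # hypothetical bucket

    def upload_then_delete(s3_key, source_path):
        # upload_file blocks until the transfer finishes or raises, so the
        # local file is only removed after a successful upload.
        try:
            bucket.upload_file(Filename=source_path, Key=s3_key)
        except ClientError as e:
            print(f"Upload failed, keeping local file: {e}")
            return
        os.remove(source_path)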

Not-null GCS bucket returned 'can only concatenate str (not "bytes") to str'

让人想犯罪 __ submitted on 2020-01-24 20:09:20
Question: I am very new to Google Cloud Storage. I am following the tutorial https://cloud.google.com/storage/docs/boto-plugin to create a bucket on Google Cloud Storage using boto. Please find the code below:

    import boto
    import gcs_oauth2_boto_plugin
    import time

    GOOGLE_STORAGE = 'gs'
    LOCAL_FILE = 'file'
    CLIENT_ID = "hnsdndsjsksoasjmoadsj"
    CLIENT_SECRET = "jdijeroerierper-er0erjfdkdf"
    gcs_oauth2_boto_plugin.SetFallbackClientIdAndSecret(CLIENT_ID, CLIENT_SECRET)
    now = time.time()
    # Your project ID can be found at…
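In case it helps anyone landing here: that TypeError is what the Python-2-era boto library tends to raise when run under Python 3. For completeness, the tutorial's next step creates the bucket through a gs:// storage URI, roughly as sketched below; the project ID is a placeholder.

    # Continuing the tutorial's snippet: create a bucket via a gs:// storage URI.
    project_id = "my-project-id"  # hypothetical; copy yours from the Cloud Console
    header_values = {"x-goog-project-id": project_id}

    bucket_name = "my-bucket-%d" % now
    uri = boto.storage_uri(bucket_name, GOOGLE_STORAGE)
    uri.create_bucket(headers=header_values)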

Django storages S3 - Store existing file

谁说胖子不能爱 submitted on 2020-01-24 14:12:48
Question: I have Django 1.11 with the latest django-storages, set up with the S3 backend. I am trying to programmatically instantiate an ImageFile, using the AWS image link as a starting point. I cannot figure out how to do this from the source / documentation. I assume I need to create a file and give it the path derived from the URL without the domain, but I can't find out exactly how. The final aim is to programmatically create wagtail Image objects that point to S3 images (so pass the new…
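One plausible approach, assuming the default storage backend is the S3 one from django-storages: open the object by its key (the URL path with the domain and media prefix stripped) and wrap it in an ImageFile. The key below is hypothetical:

    from django.core.files.images import ImageFile
    from django.core.files.storage import default_storage

    key = "original_images/photo.jpg"  # hypothetical key inside the bucket

    with default_storage.open(key) as f:
        image_file = ImageFile(f, name=key)
        # e.g. hand it to a wagtail Image:
        # from wagtail.images.models import Image
        # Image.objects.create(title="photo", file=image_file)

Be aware that saving a model with this file writes it back through the storage backend, so it may create a copy rather than reference the original key.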

Amazon SNS - Sending SMS, delivery status

*爱你&永不变心* submitted on 2020-01-24 03:47:29
Question: I am trying to send messages using Amazon SNS, but it's behaving inconsistently: it delivers messages to some of the numbers but may or may not deliver to others.

    import boto3
    client = boto3.client('sns', .....)
    client.publish(PhoneNumber, Message)

I am using the publish API to send SMS OTPs directly, without using topics. Is there a way I can get the delivery status for them? Would region or DND (Do Not Disturb) settings affect delivery? This is for Indian numbers. I am using Transactional messages. Answer 1: On…
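Regarding delivery status: SNS can log a per-message delivery record for SMS to CloudWatch Logs once you opt in via the account-level SMS attributes. A sketch, with the region and role ARN as placeholders:

    import boto3

    sns = boto3.client("sns", region_name="ap-south-1")  # hypothetical region

    # Opt in to delivery-status logging: SNS writes a success/failure record
    # per SMS to CloudWatch Logs using this IAM role (ARN is a placeholder).
    sns.set_sms_attributes(
        attributes={
            "DefaultSMSType": "Transactional",
            "DeliveryStatusIAMRole": "arn:aws:iam::123456789012:role/SNSSMSDeliveryStatus",
            "DeliveryStatusSuccessSamplingRate": "100",
        }
    )

    resp = sns.publish(
        PhoneNumber="+911234567890",  # E.164 format
        Message="Your OTP is 123456",
    )
    print(resp["MessageId"])  # correlate with the CloudWatch delivery log entry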