boto3

boto3 RDS: get all database names of a DB instance

Submitted by 耗尽温柔 on 2021-01-28 07:24:38
Question: I have two database names in the same MySQL Aurora instance, with the same host, username, and password:

{"host": "aaaa....rds.amazonaws.com", "username": "test", "password": "test", "database": "db1"}

and similarly

{"host": "aaaa....rds.amazonaws.com", "username": "test", "password": "test", "database": "db2"}

But I cannot find the second database name when I try to list the RDS database instances with boto3 as follows:

rds.describe_db_instances()['DBInstances']

It only returns db1, not db2.
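A note on why this happens, plus a minimal sketch: describe_db_instances() reports DB instances, and its DBName field only ever holds the initial database created with the instance, so additional schemas such as db2 never appear there. One common approach is to ask the engine itself with a MySQL client. The sketch below assumes the pymysql package and reuses the credentials from the question; it is illustrative, not the accepted answer.

```python
import boto3
import pymysql  # assumed client library; any MySQL driver would do

# describe_db_instances() lists DB *instances*; DBName is at most the initial
# database created with the instance, so db2 will never show up here.
rds = boto3.client('rds')
for inst in rds.describe_db_instances()['DBInstances']:
    print(inst['DBInstanceIdentifier'], inst.get('DBName'))

# To enumerate every database inside the instance, connect and ask the engine.
conn = pymysql.connect(host='aaaa....rds.amazonaws.com', user='test', password='test')
with conn.cursor() as cur:
    cur.execute('SHOW DATABASES')
    print([row[0] for row in cur.fetchall()])
conn.close()
```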

How do you perform a credentialed download from s3 using boto3 without saving a file?

Submitted by 眉间皱痕 on 2021-01-28 04:15:38
Question: It is simple to perform a credentialed download from S3 to a file using

import boto3
s3 = boto3.resource('s3')

def save_file_from_s3(bucket_name, key_name, file_name):
    b = s3.Bucket(bucket_name)
    b.download_file(key_name, file_name)

It is easy to download from S3 to a file-like object using

from StringIO import StringIO
import urllib
file_like_object = StringIO(urllib.urlopen(url).read())

(see How do I read image data from a URL in Python?) But how do you perform a credentialed download
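A minimal sketch of the in-memory variant using standard boto3 calls: download_fileobj into an io.BytesIO buffer, or get_object and read the streaming body. No filesystem is touched; credentials are resolved the usual way (environment, config file, or instance role).

```python
import io
import boto3

s3 = boto3.resource('s3')  # picks up credentials as usual

def load_object_from_s3(bucket_name, key_name):
    """Return the object's contents as an in-memory buffer."""
    buf = io.BytesIO()
    s3.Bucket(bucket_name).download_fileobj(key_name, buf)
    buf.seek(0)
    return buf

def read_object_bytes(bucket_name, key_name):
    """Alternative: a single get_object call, reading the streaming body."""
    client = boto3.client('s3')
    return client.get_object(Bucket=bucket_name, Key=key_name)['Body'].read()
```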

AWS Lake Formation: grant_permissions: Unknown parameter in Resource.Table: “TableWildcard”

Submitted by 江枫思渺然 on 2021-01-28 01:09:52
Question: I am trying to grant Lake Formation permissions from a Lambda function (Python 3.8). As far as I can see, my code follows the documentation, yet I keep hitting a barrage of errors about parameters being incorrect. Could it be that I just need an optician? Or is it some nuance of which way the Amazon wind blows today?

import boto3
import json
from botocore.exceptions import ClientError

def main(event, context):
    client = boto3.client('lakeformation')
    response = client.grant_permissions(
        Principal={
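For context, a hedged sketch of the call shape: "Unknown parameter in Resource.Table: TableWildcard" typically means the botocore bundled with the runtime is too old to know the TableWildcard member, in which case packaging a newer boto3/botocore with the function (or via a layer) is the usual remedy. The principal ARN and database name below are placeholders, not values from the question.

```python
import boto3

client = boto3.client('lakeformation')

# Grant SELECT on every table in a database. TableWildcard is only understood
# by reasonably recent botocore releases; an outdated boto3 shipped with the
# Lambda runtime rejects it as an "unknown parameter".
response = client.grant_permissions(
    Principal={
        'DataLakePrincipalIdentifier': 'arn:aws:iam::123456789012:role/example-role'  # placeholder
    },
    Resource={
        'Table': {
            'DatabaseName': 'example_db',  # placeholder
            'TableWildcard': {}
        }
    },
    Permissions=['SELECT'],
)
print(response)
```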

List all “Active” EMR clusters using Boto3

Submitted by 孤人 on 2021-01-27 18:52:16
Question: I'm trying to list all active clusters on EMR using boto3, but my code doesn't seem to be working; it just returns null. I'm trying to do with boto3 what these CLI commands do:

1) List all active EMR clusters:
aws emr list-clusters --active

2) List only the cluster IDs and names of the active ones:

Cluster names:
aws emr list-clusters --active --query "Clusters[*].{Name:Name}" --output text

Cluster IDs:
aws emr list-clusters --active --query "Clusters[*].{ClusterId:Id}" --output text

But I'm blocked at the starting stage
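A minimal sketch of the boto3 equivalent: list_clusters with ClusterStates (the API-side counterpart of the CLI's --active flag), paginated so long cluster lists are not cut off.

```python
import boto3

emr = boto3.client('emr')

# These states are what the CLI's --active flag filters on.
ACTIVE_STATES = ['STARTING', 'BOOTSTRAPPING', 'RUNNING', 'WAITING', 'TERMINATING']

paginator = emr.get_paginator('list_clusters')
for page in paginator.paginate(ClusterStates=ACTIVE_STATES):
    for cluster in page['Clusters']:
        print(cluster['Id'], cluster['Name'])
```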

Use boto3 to download from a public bucket

Submitted by …衆ロ難τιáo~ on 2021-01-27 18:30:46
Question: I'm trying to list files from a public bucket on AWS, but the best I managed was to list my own buckets and my own files. I'm assuming that boto3 is using the credentials configured on my system to list my things. How can I force it to list a specific bucket rather than my own?

#http://sentinel-s2-l1c.s3-website.eu-central-1.amazonaws.com/
g_bucket = "sentinel-s2-l1c"
g_zone = "eu-central-1"

Thank you for helping me out.

Answer 1: Pass the region_name when creating the client s3client = boto3
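A sketch along the lines of the answer: create the client with the bucket's region and name the bucket explicitly in the list call. The prefix below is purely illustrative, and the RequestPayer flag reflects the assumption that this particular Sentinel-2 bucket is requester-pays; drop it for a truly anonymous-readable bucket.

```python
import boto3

g_bucket = "sentinel-s2-l1c"
g_zone = "eu-central-1"

# The bucket to list is always named explicitly; your credentials only decide
# whether the request is authorized, not which bucket gets listed.
s3client = boto3.client('s3', region_name=g_zone)

resp = s3client.list_objects_v2(
    Bucket=g_bucket,
    Prefix='tiles/',          # assumed prefix, for illustration only
    RequestPayer='requester'  # assumption: this bucket is requester-pays
)
for obj in resp.get('Contents', []):
    print(obj['Key'])
```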

Boto intermittent “unable to load credentials” with EC2 IAM roles

Submitted by £可爱£侵袭症+ on 2021-01-27 14:18:26
Question: I use an Elastic Beanstalk environment to deploy a web application, and I've set up an IAM role for the instances the application runs on. Everything works flawlessly 99.99% of the time; however, intermittently I see request failures in our logs with botocore errors like the following:

File "/opt/python/run/venv/local/lib/python3.6/site-packages/boto3/resources/factory.py", line 339, in property_loader
    self.load()
File "/opt/python/run/venv/local/lib/python3.6/site
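One commonly suggested mitigation, sketched under the assumption that the sporadic failures come from instance-metadata (IMDS) timeouts rather than a missing role: raise botocore's metadata-fetch timeout and retry count via its environment variables, and reuse a single long-lived session so credentials are not refetched on every request.

```python
import os
import boto3

# botocore reads these when fetching instance-role credentials from the
# metadata service; raising them is a common mitigation for intermittent
# "unable to load credentials" errors (assumption: IMDS timeouts are the cause).
os.environ.setdefault('AWS_METADATA_SERVICE_TIMEOUT', '5')       # seconds per attempt
os.environ.setdefault('AWS_METADATA_SERVICE_NUM_ATTEMPTS', '5')  # retries

# Build one session/resource per process instead of per request, so the role
# credentials are fetched (and refreshed) once rather than on every call.
_session = boto3.session.Session()
s3 = _session.resource('s3')
```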

Python - creating an AWS Lambda deployment package

Submitted by 五迷三道 on 2021-01-27 10:42:04
Question: I want to script updating the code of my AWS Lambda using a Fabric task. The boto3 API expects a byte array of a base64-encoded zip file. What would be the simplest way to create it, assuming I have the source code files as the input?

Answer 1: With the current boto3, don't unzip it and don't base64-encode it. You can do it with an open and a read, like this:

import boto3
c = boto3.client('lambda')
c.create_function({
    'FunctionName': 'your_function',
    'Handler': 'your_handler',
    'Runtime': 'python3.6',
    'Code': {
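A minimal sketch of the update path, for comparison: build the zip in memory with the standard zipfile module and pass the raw bytes to update_function_code (boto3/botocore performs the base64 encoding on the wire itself). The function name and source file are placeholders.

```python
import io
import zipfile
import boto3

def make_zip_bytes(paths):
    """Zip the given source files in memory and return the raw zip bytes."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as zf:
        for path in paths:
            zf.write(path)
    return buf.getvalue()

# Placeholders: adjust the function name and source files to your project.
zip_bytes = make_zip_bytes(['handler.py'])

client = boto3.client('lambda')
# ZipFile takes raw zip bytes; no manual base64 step is needed.
client.update_function_code(FunctionName='your_function', ZipFile=zip_bytes)
```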

Read an h5 file from AWS S3 using s3fs/boto3

Submitted by *爱你&永不变心* on 2021-01-27 07:06:49
Question: I am trying to read an h5 file from AWS S3 and am getting the following errors using s3fs/boto3. Can you help? Thanks!

import s3fs
fs = s3fs.S3FileSystem(anon=False, key='key', secret='secret')
with fs.open('file', mode='rb') as f:
    h5 = pd.read_hdf(f)

TypeError: expected str, bytes or os.PathLike object, not S3File

fs = s3fs.S3FileSystem(anon=False, key='key', secret='secret')
with fs.open('file', mode='rb') as f:
    hf = h5py.File(f)

TypeError: expected str, bytes or os.PathLike object, not S3File
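One way around the TypeError, sketched with boto3: pull the object fully into memory and hand a BytesIO to h5py, which accepts file-like objects in reasonably recent releases (roughly 2.9 and later). The bucket and key names are placeholders for the real ones in the question.

```python
import io
import boto3
import h5py

s3 = boto3.client('s3')

# Placeholders for the real bucket and key from the question.
obj = s3.get_object(Bucket='my-bucket', Key='path/to/file.h5')
buf = io.BytesIO(obj['Body'].read())

# Recent h5py versions accept a file-like object directly.
with h5py.File(buf, 'r') as hf:
    print(list(hf.keys()))
```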

How to get results from a HIT on the sandbox via the MTurk API

Submitted by 烂漫一生 on 2021-01-27 06:51:37
Question: I have created an XML file to publish a question to MTurk, and the HIT is visible in the worker sandbox. A couple of my friends even submitted responses to the HIT, but I'm unable to view the results of this HIT. Here's the code I used to publish the HIT:

import boto3
MTURK_SANDBOX = 'https://mturk-requester-sandbox.us-east-1.amazonaws.com'
MTURK_PROD = 'https://mturk-requester.us-east-1.amazonaws.com'
mturk = boto3.client('mturk',
                     aws_access_key_id = "blah",
                     aws_secret_access_key = "blah",
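A minimal sketch of reading the submissions back: point the client at the same sandbox endpoint the HIT was created on and call list_assignments_for_hit with the HIT ID. The HIT ID below is a placeholder for the one returned by create_hit.

```python
import boto3

MTURK_SANDBOX = 'https://mturk-requester-sandbox.us-east-1.amazonaws.com'

# Results are only visible on the endpoint the HIT was created on, so the
# reading client must also target the sandbox.
mturk = boto3.client('mturk',
                     region_name='us-east-1',
                     endpoint_url=MTURK_SANDBOX)

hit_id = 'YOUR_HIT_ID'  # placeholder: the HITId returned by create_hit
resp = mturk.list_assignments_for_hit(
    HITId=hit_id,
    AssignmentStatuses=['Submitted', 'Approved'],
)
for a in resp['Assignments']:
    print(a['WorkerId'], a['Answer'])  # Answer holds the QuestionFormAnswers XML
```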
