boto3

s3fs custom endpoint url

Submitted by 。_饼干妹妹 on 2020-07-09 04:04:54
Question: How do I pass a custom endpoint URL to s3fs.S3FileSystem? I've tried:

kwargs = {'endpoint_url': 'https://s3.wasabisys.com', 'region_name': 'us-east-1'}
self.client = s3fs.S3FileSystem(key=AWS_ACCESS_KEY_ID, secret=AWS_SECRET_ACCESS_KEY, use_ssl=True, **kwargs)

However, I get the error:

File "s3fs/core.py", line 215, in connect
    **self.kwargs)
TypeError: __init__() got an unexpected keyword argument 'endpoint_url'

I've also tried passing kwargs as the parameter config_kwargs and s3_additional
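
A hedged sketch of one common fix (not part of the original question): s3fs forwards client_kwargs to the underlying botocore client, and that is where endpoint_url and region_name belong. The credentials and bucket name below are placeholders.

import s3fs

AWS_ACCESS_KEY_ID = '...'        # placeholder credentials
AWS_SECRET_ACCESS_KEY = '...'

fs = s3fs.S3FileSystem(
    key=AWS_ACCESS_KEY_ID,
    secret=AWS_SECRET_ACCESS_KEY,
    use_ssl=True,
    client_kwargs={
        'endpoint_url': 'https://s3.wasabisys.com',
        'region_name': 'us-east-1',
    },
)
print(fs.ls('my-bucket'))        # hypothetical bucket name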

Transfer file from AWS S3 to SFTP using Boto 3

Submitted by 不打扰是莪最后的温柔 on 2020-07-06 11:29:08
Question: I am a beginner with Boto3, and I would like to transfer a file from an S3 bucket to an SFTP server directly. My final goal is to write a Python script for AWS Glue. I have found an article that shows how to transfer a file from an SFTP server to an S3 bucket: https://medium.com/better-programming/transfer-file-from-ftp-server-to-a-s3-bucket-using-python-7f9e51f44e35 Unfortunately, I can't find anything that does the opposite. Do you have any suggestions or ideas? My first wrong attempt is
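
A hedged sketch of one way to do the reverse transfer (not the asker's code): stream the object from S3 with boto3 and push it to the SFTP server with paramiko. The host, credentials, bucket, and paths are hypothetical placeholders.

import boto3
import paramiko

# Open the S3 object as a file-like stream.
s3 = boto3.resource('s3')
body = s3.Object('my-bucket', 'path/to/file.csv').get()['Body']

# Connect to the SFTP server and upload the stream without touching local disk.
transport = paramiko.Transport(('sftp.example.com', 22))
transport.connect(username='user', password='password')
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.putfo(body, '/remote/path/file.csv')   # putfo accepts any file-like object
sftp.close()
transport.close()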

aws boto - how to create instance and return instance_id

Submitted by 这一生的挚爱 on 2020-06-29 11:28:20
Question: I want to create a Python script where I can pass arguments/inputs to specify the instance type and later attach an extra EBS volume (if needed).

ec2 = boto3.resource('ec2', 'us-east-1')
hddSize = input('Enter HDD Size if you want extra space ')
instType = input('Enter the instance type ')

def createInstance():
    ec2.create_instances(
        ImageId=AMI,
        InstanceType=instType,
        SubnetId='subnet-31d3ad3',
        DisableApiTermination=True,
        SecurityGroupIds=['sg-sa4q36fc'],
        KeyName='key'
    )
    return instanceID;  ## I know
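
A minimal sketch of how the instance id can be returned (the AMI, subnet, security group, and key name are placeholders): create_instances returns a list of Instance resources, so the id of the new instance can be read back directly.

import boto3

ec2 = boto3.resource('ec2', 'us-east-1')

def create_instance(ami, inst_type):
    instances = ec2.create_instances(
        ImageId=ami,
        InstanceType=inst_type,
        MinCount=1,
        MaxCount=1,
        SubnetId='subnet-31d3ad3',
        DisableApiTermination=True,
        SecurityGroupIds=['sg-sa4q36fc'],
        KeyName='key',
    )
    instance = instances[0]
    instance.wait_until_running()     # optional: block until the instance is up
    return instance.id                # e.g. 'i-0abcd1234ef567890'

instance_id = create_instance('ami-0abcdef1234567890', 't2.micro')  # hypothetical values
print(instance_id)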

Is there a way to iterate through s3 object content using a SQL expression?

Submitted by ﹥>﹥吖頭↗ on 2020-06-28 14:07:51
Question: I would like to iterate through each object in an S3 bucket and use a SQL expression to find all the content that matches the SQL. I was able to create a Python script that lists all the objects inside my bucket:

import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('bucketname')
startAfter = 'bucketname/directory'
for obj in bucket.objects.all():
    print(obj.key)

I was also able to create a Python script that uses a SQL expression to look through the object content:

import boto3

S3_BUCKET =
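
A hedged sketch that combines the two scripts (not from the question): list the objects with the resource API, then run an S3 Select query against each one with the client API. The bucket name, prefix, SQL expression, and CSV serialization settings are assumptions.

import boto3

s3 = boto3.resource('s3')
client = boto3.client('s3')
bucket = s3.Bucket('bucketname')

for obj in bucket.objects.filter(Prefix='directory/'):
    response = client.select_object_content(
        Bucket='bucketname',
        Key=obj.key,
        ExpressionType='SQL',
        Expression="SELECT * FROM s3object s WHERE s._1 = 'value'",
        InputSerialization={'CSV': {'FileHeaderInfo': 'NONE'}},
        OutputSerialization={'CSV': {}},
    )
    # The response payload is an event stream; 'Records' events carry the matching rows.
    for event in response['Payload']:
        if 'Records' in event:
            print(obj.key, event['Records']['Payload'].decode('utf-8'))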

Running Python Script in an existing EC2 instance on AWS

Submitted by 半城伤御伤魂 on 2020-06-28 07:09:14
Question: I have an API (in Python) that has to alter files inside an EC2 instance that is already running. I've searched the boto3 documentation, but could only find functions that start new EC2 instances, not ones that connect to an already existing instance. I am currently thinking of replicating the API's file-altering functions in a script inside the EC2 instance, and having the API simply start that script on the EC2 instance by accessing it through some sort of SSH library. Would that be the correct
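
One option, shown only as a hedged sketch: if the instance runs the SSM agent and has an instance profile that allows Systems Manager, boto3 can run a shell command on it without an SSH connection. The region, instance id, and command are placeholders.

import time
import boto3

ssm = boto3.client('ssm', region_name='us-east-1')
response = ssm.send_command(
    InstanceIds=['i-0abcd1234ef567890'],
    DocumentName='AWS-RunShellScript',
    Parameters={'commands': ['python3 /home/ubuntu/alter_files.py']},
)
command_id = response['Command']['CommandId']

time.sleep(2)   # give the command a moment to register before polling
result = ssm.get_command_invocation(
    CommandId=command_id,
    InstanceId='i-0abcd1234ef567890',
)
print(result['Status'], result.get('StandardOutputContent', ''))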

NoCredentialsError : Unable to locate credentials - python module boto3

Submitted by £可爱£侵袭症+ on 2020-06-27 09:05:38
Question: I am running Django in a Python virtual environment (virtualenv). The Django website is served by apache2 from an Amazon EC2 instance (Ubuntu 16.04). I use the boto3 module to write to Amazon S3. I installed awscli, ran aws configure, and set up my AWS access keys correctly. (I know I configured it correctly, because $ aws s3 ls returns the correct list of my S3 buckets.) However, when I try to write some objects to S3 from the Django application, it fails, producing the error described in the
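
A hedged sketch of one workaround: code running under Apache (the www-data user) does not see the ~/.aws/credentials file that aws configure wrote for another user, so the credentials can be passed to boto3 explicitly (or, alternatively, an IAM instance role can be attached to the EC2 instance). The environment variable names, bucket, and key are placeholders, e.g. values read from Django settings.

import os
import boto3

session = boto3.Session(
    aws_access_key_id=os.environ['AWS_ACCESS_KEY_ID'],
    aws_secret_access_key=os.environ['AWS_SECRET_ACCESS_KEY'],
    region_name='us-east-1',
)
s3 = session.resource('s3')
s3.Object('my-bucket', 'uploads/report.txt').put(Body=b'hello')  # hypothetical bucket/key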

How to get authenticated identity response from AWS Cognito using boto3

Submitted by China☆狼群 on 2020-06-27 08:58:29
Question: I would like to use boto3 to get temporary credentials for accessing AWS services. The use case is this: a user in my Cognito User Pool logs in to my server, and I want the server code to provide that user with temporary credentials to access other AWS services. I have a Cognito User Pool where my users are stored. I have a Cognito Identity Pool that does NOT allow unauthorized access, only access by users from the Cognito User Pool. So here is the code I am starting with:

import boto3
client =
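
A hedged sketch of the usual flow (not the asker's finished code): authenticate against the User Pool, then exchange the resulting id token with the Identity Pool for temporary AWS credentials. The pool ids, app client id, region, username, and password are placeholders, and USER_PASSWORD_AUTH assumes that auth flow is enabled on the app client.

import boto3

REGION = 'us-east-1'
USER_POOL_ID = 'us-east-1_XXXXXXXXX'
APP_CLIENT_ID = 'xxxxxxxxxxxxxxxxxxxxxxxxxx'
IDENTITY_POOL_ID = 'us-east-1:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'

# Step 1: authenticate the user against the User Pool and get an id token.
idp = boto3.client('cognito-idp', region_name=REGION)
auth = idp.initiate_auth(
    ClientId=APP_CLIENT_ID,
    AuthFlow='USER_PASSWORD_AUTH',
    AuthParameters={'USERNAME': 'alice', 'PASSWORD': 'secret'},
)
id_token = auth['AuthenticationResult']['IdToken']

# Step 2: trade the id token for temporary credentials from the Identity Pool.
identity = boto3.client('cognito-identity', region_name=REGION)
logins = {f'cognito-idp.{REGION}.amazonaws.com/{USER_POOL_ID}': id_token}
identity_id = identity.get_id(IdentityPoolId=IDENTITY_POOL_ID, Logins=logins)['IdentityId']
creds = identity.get_credentials_for_identity(IdentityId=identity_id, Logins=logins)['Credentials']
print(creds['AccessKeyId'], creds['SecretKey'], creds['SessionToken'])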