Question
I am a beginner in using Boto3 and I would like to transfer a file from an S3 bucket to an SFTP server directly.
My final goal is to write a Python script for AWS Glue.
I have found an article which shows how to transfer a file from an SFTP server to an S3 bucket:
https://medium.com/better-programming/transfer-file-from-ftp-server-to-a-s3-bucket-using-python-7f9e51f44e35
Unfortunately I can't find anything which does the opposite action. Do you have any suggestions/ideas?
My first (wrong) attempt is below, but I would like to avoid downloading the whole file into local memory just to then move it to the SFTP server.
import pysftp
import boto3
# get clients
s3_gl = boto3.client('s3', aws_access_key_id='', aws_secret_access_key='')
# parameters
bucket_gl = ''
gl_data = ''
gl_script = ''
source_response = s3_gl.get_object(Bucket=bucket_gl,Key=gl_script+'file.csv')
print(source_response['Body'].read().decode('utf-8'))
#---------------------------------
srv = pysftp.Connection(host="", username="", password="")
with srv.cd('relevant folder in sftp'):
    srv.put(source_response['Body'].read().decode('utf-8'))
# Closes the connection
srv.close()
Answer 1:
"transfer ... directly" can mean number of different things.
Let's assume that you want to transfer the file via the local machine (where the Python code runs), without actually storing a temporary copy of the file to the local file system.
For SFTP upload, you can use Paramiko library.
Assuming you already have your Paramiko SFTPClient (sftp) and Boto 3 client (s3) instances ready (which is covered in the article you have linked in your question), you can simply "glue" them together using file-like objects:
with sftp.open('/sftp/path/filename', 'wb') as f:
    s3.download_fileobj('mybucket', 'mykey', f)
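If it helps to see both pieces together, here is a minimal sketch of that approach; the host name, credentials, bucket, key, and remote path are placeholders you would replace with your own values:

import boto3
import paramiko

# S3 client (credentials come from the usual boto3 credential chain,
# e.g. the Glue job's IAM role or environment variables)
s3 = boto3.client('s3')

# Paramiko SFTP client (host, username and password are placeholders)
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('sftp.example.com', username='user', password='password')
sftp = ssh.open_sftp()

# Stream the S3 object straight into the remote SFTP file
with sftp.open('/sftp/path/filename', 'wb') as f:
    s3.download_fileobj('mybucket', 'mykey', f)

sftp.close()
ssh.close()

This keeps everything as a stream between the two connections, so no temporary copy of the file is written to the local file system.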
Source: https://stackoverflow.com/questions/58719309/transfer-file-from-aws-s3-to-sftp-using-boto-3