Azure Function (Python) w/ Storage Upload Trigger Fails with Large File Uploads

Submitted by 人走茶凉 on 2021-01-02 00:37:14

Question


I have an Azure Function (Python) triggered by file uploads to Azure Storage. The function works fine for files up to ~120MB, but when I load tested it with a 2GB file, the function produced the error Stream was too long.

  • Where is this limitation documented?
  • How would I overcome it using Python?

I'm using the boto3 library to PUT files to AWS S3:

import base64
import hashlib
import logging
import pathlib

import boto3
import azure.functions as func

bucketName = "my-bucket"  # placeholder bucket name


def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")

    # Read the entire blob into memory -- this is what breaks for large files
    myblobBytes = myblob.read()

    fileName = pathlib.Path(myblob.name).name

    # S3 expects ContentMD5 as a base64-encoded MD5 digest of the body
    md5Checksum = base64.b64encode(hashlib.md5(myblobBytes).digest()).decode()

    s3 = boto3.resource(
        's3',
        aws_access_key_id="youguessedit",
        aws_secret_access_key="noyoudidnt",
    )

    response = s3.Bucket(bucketName).put_object(Key="folder/" + fileName,
                                                Body=myblobBytes,
                                                ContentMD5=md5Checksum)

    response.wait_until_exists()

Answer 1:


Changed the boto3 call from put_object to upload_fileobj and set up a TransferConfig with multipart_threshold=1024*25, max_concurrency=10, multipart_chunksize=1024*25, use_threads=True.

Rips now!

Able to transfer 2GB in 89 seconds! Not bad.
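For reference, a minimal sketch of what that change could look like, assuming the same trigger function as the question (the bucket name and key prefix are placeholders). Since func.InputStream is file-like, it can be passed straight to upload_fileobj, which streams the blob in multipart chunks instead of buffering the whole thing in memory:

import pathlib

import boto3
from boto3.s3.transfer import TransferConfig
import azure.functions as func

bucketName = "my-bucket"  # placeholder bucket name

# Multipart settings from the answer above: 25KB threshold/chunks,
# 10 concurrent upload threads
config = TransferConfig(multipart_threshold=1024 * 25,
                        max_concurrency=10,
                        multipart_chunksize=1024 * 25,
                        use_threads=True)


def main(myblob: func.InputStream):
    fileName = pathlib.Path(myblob.name).name

    s3 = boto3.resource(
        's3',
        aws_access_key_id="youguessedit",
        aws_secret_access_key="noyoudidnt",
    )

    # upload_fileobj accepts any file-like object with read(), so the
    # trigger's InputStream can be streamed directly -- no myblob.read()
    s3.Bucket(bucketName).upload_fileobj(myblob,
                                         "folder/" + fileName,
                                         Config=config)

Note that 1024*25 is only 25KB; boto3's default multipart threshold is 8MB, so a larger chunk size may reduce per-request overhead on big transfers.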



Source: https://stackoverflow.com/questions/63849052/azure-function-python-w-storage-upload-trigger-fails-with-large-file-uploads
