Complete a multipart_upload with boto3?

上瘾入骨i · 2020-12-28 21:31

Tried this:

import boto3
from boto3.s3.transfer import TransferConfig, S3Transfer
path = \"/temp/\"
fileName = \"bigFile.gz\" # this happens to be a 5.9 Gig         


        
6 Answers
  •  陌清茗 · 2020-12-28 21:37

    Your code was already correct. Indeed, a minimal example of a multipart upload just looks like this:

    import boto3
    s3 = boto3.client('s3')
    s3.upload_file('my_big_local_file.txt', 'some_bucket', 'some_key')
    

    You don't need to explicitly ask for a multipart upload, or use any of boto3's lower-level multipart functions. Just call upload_file, and boto3 will automatically use a multipart upload when the file size exceeds a certain threshold (8 MB by default).
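
    If you do want control over when multipart kicks in (which is what the TransferConfig import in your snippet is for), you can pass a TransferConfig to upload_file. A minimal sketch, assuming a placeholder bucket name and the file from the question; the specific sizes below are only illustrative, not required:

    import boto3
    from boto3.s3.transfer import TransferConfig

    # Illustrative values; the defaults (8 MB threshold and chunk size) are usually fine.
    config = TransferConfig(
        multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MB
        multipart_chunksize=64 * 1024 * 1024,  # upload in 64 MB parts
        max_concurrency=10,                    # upload parts in parallel threads
        use_threads=True,
    )

    s3 = boto3.client('s3')
    s3.upload_file('/temp/bigFile.gz', 'some_bucket', 'bigFile.gz', Config=config)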

    You seem to have been confused by the fact that the end result in S3 wasn't visibly made up of multiple parts:

    Result: 5.9 gig file on s3. Doesn't seem to contain multiple parts.

    ... but this is the expected outcome. The whole point of the multipart upload API is to let you upload a single file over multiple HTTP requests and end up with a single object in S3.
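
    For reference, the low-level API that upload_file drives for you follows a create_multipart_upload / upload_part / complete_multipart_upload cycle. You don't need it for this use case, but a rough sketch (bucket, key, and part size are placeholder assumptions) looks like:

    import boto3

    s3 = boto3.client('s3')
    bucket, key = 'some_bucket', 'bigFile.gz'  # placeholders
    part_size = 100 * 1024 * 1024              # every part except the last must be at least 5 MB

    mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)
    parts = []
    with open('/temp/bigFile.gz', 'rb') as f:
        part_number = 1
        while True:
            data = f.read(part_size)
            if not data:
                break
            resp = s3.upload_part(Bucket=bucket, Key=key, UploadId=mpu['UploadId'],
                                  PartNumber=part_number, Body=data)
            parts.append({'PartNumber': part_number, 'ETag': resp['ETag']})
            part_number += 1

    # Completing the upload is what stitches the parts into a single S3 object.
    s3.complete_multipart_upload(Bucket=bucket, Key=key, UploadId=mpu['UploadId'],
                                 MultipartUpload={'Parts': parts})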
