Django “chunked uploads” to Amazon S3

Submitted by 妖精的绣舞 on 2021-01-29 16:48:38

Question


We're using S3Boto3Storage to upload media files to our S3 bucket on Amazon, and this works well. However, since we're on Cloudflare's free plan, we're limited to a maximum of 100 MB per request, which is a big problem. Even the Enterprise plan is limited to 500 MB.

Is there a way to use some kind of "chunked upload" to bypass the 100 MB-per-request limit?

model.py

from django.db import models

class Media(models.Model):
    name = models.CharField(max_length=100, null=True)
    # get_path is the project's upload_to callable, defined elsewhere
    file = models.FileField(upload_to=get_path)

storage.py

from storages.backends.s3boto3 import S3Boto3Storage

class MediaStorage(S3Boto3Storage):
    location = 'media'
    file_overwrite = False
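
For context, a minimal sketch of the settings that typically wire this backend up, assuming the standard django-storages configuration keys; the bucket name and module path are illustrative assumptions, not part of the original question.

settings.py (sketch)

AWS_ACCESS_KEY_ID = '...'                    # credentials for the target bucket
AWS_SECRET_ACCESS_KEY = '...'
AWS_STORAGE_BUCKET_NAME = 'my-media-bucket'  # hypothetical bucket name

# Route all FileField saves through MediaStorage by default.
DEFAULT_FILE_STORAGE = 'myapp.storage.MediaStorage'  # 'myapp' is a placeholder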

views.py

from rest_framework import status
from rest_framework.decorators import api_view
from rest_framework.response import Response

from .serializers import MediaSerializer  # assumed module path for the serializer

@api_view(['POST'])
def upload_media(request):
    if request.method == 'POST':  # always true under @api_view(['POST'])
        serializer = MediaSerializer(data=request.data)
        if serializer.is_valid():
            serializer.save()
            return Response(serializer.data)
        return Response(serializer.errors, status=status.HTTP_400_BAD_REQUEST)

Answer 1:


In order to bypass that limit, you'll have to use something like resumable.js on the client side to split the upload into chunks and send them to the server via REST calls. On the server side, you then reassemble the chunks into the complete file before pushing it to S3.
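
For illustration, a minimal sketch of what such a server-side endpoint could look like, assuming resumable.js posts each chunk with its default form fields (resumableChunkNumber, resumableTotalChunks, resumableIdentifier, resumableFilename, and the chunk itself under file). The endpoint name, the temporary directory, and the lack of cleanup and concurrency handling are simplifications, not part of the original answer.

import os
import tempfile

from django.core.files import File
from rest_framework.decorators import api_view
from rest_framework.response import Response

from .models import Media  # the Media model from the question

# Directory for holding chunks until the upload completes (illustrative).
CHUNK_DIR = os.path.join(tempfile.gettempdir(), 'chunked_uploads')

@api_view(['POST'])
def upload_chunk(request):
    # resumable.js posts one chunk per request, together with form fields
    # describing the chunk's position within the whole file.
    chunk = request.FILES['file']
    identifier = request.data['resumableIdentifier']
    chunk_number = int(request.data['resumableChunkNumber'])
    total_chunks = int(request.data['resumableTotalChunks'])
    filename = request.data['resumableFilename']

    # Persist the chunk under a per-upload directory.
    upload_dir = os.path.join(CHUNK_DIR, identifier)
    os.makedirs(upload_dir, exist_ok=True)
    with open(os.path.join(upload_dir, '%06d' % chunk_number), 'wb') as f:
        for part in chunk.chunks():
            f.write(part)

    # Not all chunks have arrived yet: acknowledge and wait for the rest.
    # (Counting files is not safe under concurrent chunk uploads.)
    if len(os.listdir(upload_dir)) < total_chunks:
        return Response({'status': 'chunk received'})

    # All chunks are present: reassemble them in order, then save the result
    # through the Media model so the storage backend pushes it to S3.
    assembled_path = os.path.join(tempfile.gettempdir(), filename)
    with open(assembled_path, 'wb') as assembled:
        for i in range(1, total_chunks + 1):
            with open(os.path.join(upload_dir, '%06d' % i), 'rb') as part:
                assembled.write(part.read())

    with open(assembled_path, 'rb') as f:
        media = Media(name=filename)
        media.file.save(filename, File(f))  # uploads via the S3 storage backend

    return Response({'status': 'complete'})

Once the reassembled file is saved through the Media model, the existing S3Boto3Storage backend pushes it to the bucket, and each individual browser request stays well under Cloudflare's 100 MB limit. In production you'd also want to deduplicate retried chunks and clean up the temporary files, or stream the chunks straight into an S3 multipart upload with boto3 instead of assembling them on disk first.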



Source: https://stackoverflow.com/questions/53449148/django-chunked-uploads-to-amazon-s3
