Uploading to Azure File Storage fails with large files

Submitted by 丶灬走出姿态 on 2020-12-13 03:09:46

Question


Attempting to upload a file larger than 4MB results in a RequestBodyTooLarge exception being thrown with the following message:

The request body is too large and exceeds the maximum permissible limit.

While this limit is documented in the REST API reference (https://docs.microsoft.com/en-us/rest/api/storageservices/put-range), it is not documented for the SDK Upload* methods (https://docs.microsoft.com/en-us/dotnet/api/azure.storage.files.shares.sharefileclient.uploadasync?view=azure-dotnet), and there are no examples of working around it.

So how can large files be uploaded?


Answer 1:


After much trial and error I was able to create the following method to work around the file upload limit. In the code below, _dirClient is an already-initialized ShareDirectoryClient pointing to the folder I'm uploading to.

If the incoming stream is larger than 4 MB, the code reads 4 MB chunks from it and uploads them until done. The HttpRange specifies where the bytes will be written within the file already created in Azure, so the index has to be advanced past the bytes already written in order for each new chunk to be appended to the end of the file.

public async Task WriteFileAsync(string filename, Stream stream) {

    //  Azure allows for 4MB max uploads  (4 x 1024 x 1024 = 4194304)
    const int uploadLimit = 4194304;

    stream.Seek(0, SeekOrigin.Begin);   // ensure stream is at the beginning
    var fileClient = await _dirClient.CreateFileAsync(filename, stream.Length);

    // If stream is below the limit upload directly
    if (stream.Length <= uploadLimit) {
        await fileClient.Value.UploadRangeAsync(new HttpRange(0, stream.Length), stream);
        return;
    }

    int bytesRead;
    long index = 0;
    byte[] buffer = new byte[uploadLimit];

    // Stream is larger than the limit so we need to upload in chunks
    while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0) {
        // Create a memory stream for the buffer to upload
        using MemoryStream ms = new MemoryStream(buffer, 0, bytesRead);
        await fileClient.Value.UploadRangeAsync(new HttpRange(index, ms.Length), ms);
        index += ms.Length; // increment the index to account for bytes already written
    }
}
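
For example, the method above could be called like this (the local path is just a placeholder; any readable, seekable stream works):

// Open a local file and hand it to the method above
using FileStream fs = File.OpenRead(@"C:\temp\large-video.mp4");
await WriteFileAsync("large-video.mp4", fs);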



Answer 2:


If you want to upload larger files to a file share or to blob storage, there is also the Azure Storage Data Movement Library.

It provides high-performance uploading and downloading of large files, so consider using it for larger transfers.
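
For reference, here is a rough sketch of uploading a large local file to a file share with the Data Movement Library (Microsoft.Azure.Storage.DataMovement, which works with the older Microsoft.Azure.Storage.File client types rather than Azure.Storage.Files.Shares). The connection string, share name, and file paths below are placeholders.

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.File;
using Microsoft.Azure.Storage.DataMovement;

public static async Task UploadLargeFileAsync() {

    // Placeholder connection string, share, and file names
    CloudStorageAccount account = CloudStorageAccount.Parse("<your-connection-string>");
    CloudFileShare share = account.CreateCloudFileClient().GetShareReference("myshare");
    CloudFile destFile = share.GetRootDirectoryReference().GetFileReference("large-video.mp4");

    // Tune parallelism and report progress while the transfer runs
    TransferManager.Configurations.ParallelOperations = 16;
    SingleTransferContext context = new SingleTransferContext {
        ProgressHandler = new Progress<TransferStatus>(p =>
            Console.WriteLine($"Bytes uploaded: {p.BytesTransferred}"))
    };

    // The library splits the local file into ranges and uploads them in parallel
    await TransferManager.UploadAsync(@"C:\temp\large-video.mp4", destFile,
        new UploadOptions(), context, CancellationToken.None);
}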



Source: https://stackoverflow.com/questions/64031612/uploading-to-azure-file-storage-fails-with-large-files
