How to upload large files to an Azure container using .NET?

Submitted by 蹲街弑〆低调 on 2020-01-06 08:58:19

Question


I was trying to upload large files to an Azure container from a Windows Forms application.

Since the files were large, I couldn't upload each one as a single block.

I figured out a method to upload large files as a set of blocks, and I am posting the code here in the hope that it helps someone with a similar requirement.


Answer 1:


We can upload large files to an Azure container by using block blobs.

Block blobs are composed of blocks, each of which is identified by a block ID.

When we upload a block to a blob, it is associated with the specified block blob, but it does not become part of the blob until we commit a list of blocks that includes the new block's ID.

Block IDs are strings of equal length within a blob.

Block client code usually uses base-64 encoding to normalize strings into equal lengths. When using base-64 encoding, the pre-encoded string must be 64 bytes or less.
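
As an illustrative sketch (not part of the original answer), one common way to get equal-length block IDs is to base-64 encode a zero-padded counter; the helper below is hypothetical:

using System;
using System.Text;

static class BlockIdHelper // hypothetical helper, for illustration only
{
    // Zero-padding the counter keeps every pre-encoded string the same length
    // (and well under the 64-byte limit) before base-64 encoding.
    public static string GetBlockId(int blockNumber)
    {
        string paddedId = blockNumber.ToString("D6"); // e.g. "000042"
        return Convert.ToBase64String(Encoding.UTF8.GetBytes(paddedId));
    }
}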

For more information, read the Azure Blob storage documentation on block blobs.

The following code splits the source file into multiple byte arrays of 10 MB each. Each byte array is uploaded as a block using the Put Block operation, and these blocks are associated with the specified block blob.

The block IDs are then committed using the Put Block List operation, which creates the blob from the uploaded blocks in the order the IDs are listed.

// Requires the WindowsAzure.Storage NuGet package
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public string UploadFile(string sourceFilePath)
{
    try
    {
        string storageAccountConnectionString = "AZURE_CONNECTION_STRING";
        CloudStorageAccount storageAccount = CloudStorageAccount.Parse(storageAccountConnectionString);
        CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
        CloudBlobContainer container = blobClient.GetContainerReference("container-name");
        container.CreateIfNotExists();
        CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(sourceFilePath));

        // Block IDs must be committed in the order the blocks should appear in the blob,
        // so use a List (a HashSet does not guarantee ordering).
        List<string> blockList = new List<string>();

        byte[] fileContent = File.ReadAllBytes(sourceFilePath);
        const int blockSizeInBytes = 10 * 1024 * 1024; // 10 MB per block
        long prevLastByte = 0;
        long bytesRemain = fileContent.Length;

        do
        {
            // Copy the next (up to) 10 MB slice of the file into its own buffer.
            long bytesToCopy = Math.Min(bytesRemain, blockSizeInBytes);
            byte[] bytesToSend = new byte[bytesToCopy];
            Array.Copy(fileContent, prevLastByte, bytesToSend, 0, bytesToCopy);
            prevLastByte += bytesToCopy;
            bytesRemain -= bytesToCopy;

            // Create a block ID. GUID strings are all the same length, so their
            // base-64 encodings have equal lengths, as the service requires.
            string blockId = Guid.NewGuid().ToString();
            string base64BlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(blockId));

            // Put Block: upload this slice as an uncommitted block.
            using (MemoryStream blockData = new MemoryStream(bytesToSend))
            {
                blob.PutBlock(base64BlockId, blockData, null);
            }

            blockList.Add(base64BlockId);

        } while (bytesRemain > 0);

        // Put Block List: commit the uploaded blocks, in order, to form the blob.
        blob.PutBlockList(blockList);

        return "Success";
    }
    catch (Exception ex)
    {
        return ex.Message;
    }
}
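
A minimal usage sketch (the file path is a placeholder, and it assumes UploadFile is a member of the calling Windows Forms class):

// Hypothetical caller; the path below is a placeholder.
string result = UploadFile(@"C:\data\large-file.zip");
MessageBox.Show(result); // shows "Success" or the exception message

Note that the method returns the exception message instead of throwing, so callers should check the returned string rather than relying on an exception.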


Source: https://stackoverflow.com/questions/54942970/how-to-upload-large-files-to-azure-container-using-net
