Uploading DataTable to Azure blob storage

Submitted by 喜夏-厌秋 on 2019-12-10 11:17:20

Question


I am trying to serialize a DataTable to XML and then upload it to Azure blob storage.

The code below works, but it seems clunky and memory-hungry. Is there a better way to do this? In particular, I am dumping a memory stream to a byte array and then creating a new memory stream from it.

        var container = blobClient.GetContainerReference("container");
        var blockBlob = container.GetBlockBlobReference("blob");

        byte[] blobBytes;
        using (var writeStream = new MemoryStream())
        {
            using (var writer = new StreamWriter(writeStream))
            {
                // Serialize the DataTable (including its schema) to XML in memory.
                table.WriteXml(writer, XmlWriteMode.WriteSchema);
            }
            // Copy the buffered XML into a byte array.
            blobBytes = writeStream.ToArray();
        }
        // Wrap the byte array in a second stream and upload it.
        using (var readStream = new MemoryStream(blobBytes))
        {
            blockBlob.UploadFromStream(readStream);
        }

Answer 1:


New answer:

I've learned of a better approach, which is to open a write stream directly to the blob. For example:

        // Open a write stream directly on the blob, so the XML streams to
        // storage without buffering the whole table in memory first.
        using (var writeStream = blockBlob.OpenWrite())
        {
            using (var writer = new StreamWriter(writeStream))
            {
                table.WriteXml(writer, XmlWriteMode.WriteSchema);
            }
        }

Per our developer, this approach does not require the entire table to be buffered in memory, and will likely involve less copying of data.

Original answer:

You can use the CloudBlockBlob.UploadFromByteArray method to upload the byte array directly, instead of creating a second stream.

See https://msdn.microsoft.com/en-us/library/microsoft.windowsazure.storage.blob.cloudblockblob.uploadfrombytearray.aspx for the method syntax.
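As a minimal sketch, the second using block in the question's code could be replaced with a direct upload, reusing the blobBytes array built there:

        // Upload the serialized bytes directly, starting at index 0.
        blockBlob.UploadFromByteArray(blobBytes, 0, blobBytes.Length);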



Source: https://stackoverflow.com/questions/37305800/uploading-datatable-to-azure-blob-storage
