Question
Can someone please suggest the best way to upload/download a video blob that is multiple GBs in size to/from Azure Storage in the fastest possible time?
Answer 1:
I'm a Microsoft Technical Evangelist, and I have developed a free sample tool (no support, no guarantee) to help in these scenarios.
The binaries and source-code are available here: https://blobtransferutility.codeplex.com/
The Blob Transfer Utility is a GUI tool to upload and download thousands of small/large files to/from Windows Azure Blob Storage.
Features:
- Create batches to upload/download
- Set the Content-Type
- Transfer files in parallel
- Split large files into smaller parts that are transferred in parallel
The 4th feature is the answer to your problem.
You can learn from the sample code how I did it, or you can simply run the tool and do what you need to do.
Answer 2:
The best way to upload/download large blobs to/from Windows Azure Storage is to chunk the transfer and make proper use of multi-threading. There are a few things you need to consider:
- Chunk size should depend on your Internet connection. On a really slow connection, uploading large individual chunks will almost invariably result in request timeouts.
- The number of concurrent threads should depend on the number of processor cores on the machine running your application code. In my experience, on an 8-core machine you get the best performance by spawning 8 threads, each uploading/downloading one part of the data. It is tempting to run hundreds of threads and leave thread management to the OS, but in my observation most requests then simply time out.
- Upload/download operations should be asynchronous, so that your application doesn't block or hog resources on your computer.
For uploading a large file, you could pick a chunk size (say 1 MB) and a number of concurrent threads (say 8), read 8 MB from the file into an array with 8 elements, and upload those 8 elements in parallel using the Put Block functionality. Once those 8 elements are uploaded, read the next 8 MB and repeat until all bytes are uploaded. Finally, call the Put Block List functionality to commit the blob in blob storage.
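Here is a minimal sketch of that flow in Python using the azure-storage-blob v12 SDK, whose stage_block/commit_block_list methods correspond to the Put Block / Put Block List operations described above; the connection string, container, and file names are placeholders:

import uuid
from concurrent.futures import ThreadPoolExecutor
from azure.storage.blob import BlobClient, BlobBlock

CHUNK_SIZE = 1 * 1024 * 1024   # 1 MB chunks, as in the example above
MAX_WORKERS = 8                # e.g. one thread per core on an 8-core machine

blob = BlobClient.from_connection_string(
    "<connection-string>", container_name="mycontainer", blob_name="video.mp4")

def stage(block_id, data):
    blob.stage_block(block_id=block_id, data=data)

block_ids = []                 # kept in file order for the final commit
with open("video.mp4", "rb") as f, ThreadPoolExecutor(MAX_WORKERS) as pool:
    while True:
        # Read up to 8 chunks (8 MB) and stage them in parallel.
        futures = []
        for _ in range(MAX_WORKERS):
            data = f.read(CHUNK_SIZE)
            if not data:
                break
            block_id = str(uuid.uuid4())
            block_ids.append(block_id)
            futures.append(pool.submit(stage, block_id, data))
        for fut in futures:
            fut.result()       # propagate any upload error
        if len(futures) < MAX_WORKERS:
            break              # end of file

# Commit the staged blocks in file order to finalize the blob.
blob.commit_block_list([BlobBlock(block_id=b) for b in block_ids])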
Similarly, for downloading a large file you would again pick a chunk size and a number of concurrent threads, then read parts of the blob in parallel by specifying the "Range" header in Get Blob calls. As chunks complete, you need to write them at their actual positions in the file (the 3-4 MB chunk may well finish downloading before the 0-1 MB chunk) and repeat the process until all bytes are downloaded.
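Again a minimal sketch in Python under the same assumptions (the offset/length parameters of download_blob translate to the Range header described above); writing each range at its own offset with seek() makes out-of-order completion harmless:

from concurrent.futures import ThreadPoolExecutor
from azure.storage.blob import BlobClient

CHUNK_SIZE = 4 * 1024 * 1024   # 4 MB ranges
MAX_WORKERS = 8

blob = BlobClient.from_connection_string(
    "<connection-string>", container_name="mycontainer", blob_name="video.mp4")

size = blob.get_blob_properties().size

# Pre-size the destination file so every worker can seek to its own offset.
with open("video.mp4.download", "wb") as f:
    f.truncate(size)

def fetch(offset):
    length = min(CHUNK_SIZE, size - offset)
    data = blob.download_blob(offset=offset, length=length).readall()
    # Each thread opens its own handle; the byte ranges are disjoint.
    with open("video.mp4.download", "r+b") as f:
        f.seek(offset)
        f.write(data)

with ThreadPoolExecutor(MAX_WORKERS) as pool:
    list(pool.map(fetch, range(0, size, CHUNK_SIZE)))   # list() surfaces errors

Note that the v12 SDK's own upload_blob and download_blob already implement this chunking and parallelism internally (see their max_concurrency parameter); the sketches above just make the mechanics explicit.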
Answer 3:
You can use Microsoft's AzCopy command-line utility if you are on a Windows OS. For Linux/Mac you can use the Azure CLI.
AzCopy is a Windows command-line utility designed for copying data to and from Microsoft Azure Blob, File, and Table storage using simple commands with optimal performance.
Download a blob:
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer /Dest:C:\myfolder /SourceKey:my_key_here /Pattern:"abc.txt"
Upload single file:
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:my_key_here /Pattern:"abc.txt"
More examples and information are available in the AzCopy documentation.
Answer 4:
You can use Cloud Combine for reliable and quick file upload to Azure blob storage. It supports multi-threaded processing, so the file gets uploaded at maximum speed.
Answer 5:
You can use Windows Azure PowerShell to upload/download huge files to/from Azure.
Set-AzureStorageBlobContent is for uploading.
Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname
http://msdn.microsoft.com/en-us/library/dn408487.aspx
Get-AzureStorageBlobContent is for downloading.
Get-AzureStorageBlobContent -Container containername -Blob blob -Destination C:\test\
http://msdn.microsoft.com/en-us/library/dn408562.aspx
Answer 6:
You can use the Azure Import/Export service, which lets you ship disks containing your data to an Azure datacenter.
Check this link: https://azure.microsoft.com/en-us/documentation/articles/storage-import-export-service/
Answer 7:
I am relatively new to the whole data-migration effort, and I'm trying to maximize the benefits of 'cold' storage vs. 'performing' storage using a set of business rules and various beta-testing scenarios.
I have no connection to the product, but I have found that, for the money, GoodSync gives the best bang for the buck. It allows scheduling, file-change triggers, and tons of filtering options, and so far every SMB and/or cloud-based storage option I have tried can be analyzed and synchronized. It also supports multi-threading. None of these tools are 'super fast', but GoodSync at least makes it quite manageable to run a bunch of jobs in separate tabs.
Check it out...
Source: https://stackoverflow.com/questions/14892957/best-way-to-upload-a-blob-with-a-huge-size-in-gbs-to-azure-in-the-fastest-time