azure-storage-blobs

Stream uploaded file to Azure blob storage with Node

Submitted by 吃可爱长大的小学妹 on 2019-11-30 00:37:55
Using Express with Node, I can upload a file successfully and pass it to Azure storage in the following block of code:

```javascript
app.get('/upload', function (req, res) {
    res.send(
        '<form action="/upload" method="post" enctype="multipart/form-data">' +
        '<input type="file" name="snapshot" />' +
        '<input type="submit" value="Upload" />' +
        '</form>'
    );
});

app.post('/upload', function (req, res) {
    var path = req.files.snapshot.path;
    var bs = azure.createBlobService();
    bs.createBlockBlobFromFile('c', 'test.png', path, function (error) { });
    res.send("OK");
});
```

This works just fine, but Express creates a …

Using Parallel.Foreach in a small azure instance

Submitted by 江枫思渺然 on 2019-11-29 23:27:33
Question: I have a WebRole running on a small instance. This WebRole has a method that uploads a large number of files to BLOB storage. According to the Azure instance specs, a small instance has only 1 core. So when uploading those blobs, will Parallel.Foreach give me any benefit over a regular foreach?

Answer 1: You would be much better served by focusing on using the async versions of the blob storage APIs and/or Stream APIs, so that you are I/O bound rather than CPU bound. Anywhere there is a …
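The answer's point — that on one core throughput comes from overlapping I/O, not parallel CPU work — can be illustrated with an async pool that keeps a fixed number of uploads in flight. This is a language-neutral sketch in Node (in C# the equivalent would be awaiting `Task.WhenAll` over async upload calls); `uploadOne` is a hypothetical stand-in for an async blob upload:

```javascript
// Run async tasks with at most `limit` in flight at once. On a single
// core this still overlaps network I/O, which is where the time goes
// when uploading many blobs.
async function asyncPool(limit, items, worker) {
  const results = [];
  let next = 0;
  async function run() {
    while (next < items.length) {
      const i = next++;          // claim the next item (single-threaded, safe)
      results[i] = await worker(items[i], i);
    }
  }
  // Start `limit` runners that keep pulling items until none are left.
  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, run));
  return results;               // results stay in input order
}

// Usage: a fake "upload" that just resolves after a short delay.
const uploadOne = (name) =>
  new Promise((resolve) => setTimeout(() => resolve('uploaded ' + name), 10));

asyncPool(4, ['a.png', 'b.png', 'c.png'], uploadOne)
  .then((r) => console.log(r));
```

The limit matters because a blob service will throttle a client that opens too many simultaneous connections; a small fixed pool gets most of the benefit.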

StorageException when downloading a large file over a slow network

Submitted by 心已入冬 on 2019-11-29 23:12:52
Question: I'm using the NuGet package WindowsAzure.Storage version 4.2.1. The following code tries to download a blob from a storage container that is in a distant datacenter:

```csharp
try
{
    var blobRequestOptions = new BlobRequestOptions
    {
        RetryPolicy = new ExponentialRetry(TimeSpan.FromSeconds(5), 3),
        MaximumExecutionTime = TimeSpan.FromMinutes(60),
        ServerTimeout = TimeSpan.FromMinutes(60)
    };

    using (var fileStream = File.Create(localPath))
    {
        blockBlob.DownloadToStream(fileStream, null, blobRequestOptions);
    }
    // …
```
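The options in the snippet combine a retry policy with generous timeouts. The retry half of that idea can be sketched independently of the storage SDK: retry a flaky operation with exponentially growing delays and give up after a fixed number of attempts. The shape mirrors the `ExponentialRetry(5 s, 3)` in the question, with delays scaled down so the example runs quickly:

```javascript
// Retry `fn` up to `maxAttempts` times, doubling the delay after each
// failure — the same shape as the SDK's ExponentialRetry policy.
async function withExponentialRetry(fn, maxAttempts = 3, baseDelayMs = 10) {
  let lastError;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn(attempt);
    } catch (err) {
      lastError = err;
      const delay = baseDelayMs * 2 ** attempt; // 10 ms, 20 ms, 40 ms, ...
      await new Promise((r) => setTimeout(r, delay));
    }
  }
  throw lastError; // all attempts exhausted
}

// Usage: an operation that fails twice, then succeeds on the third try.
let calls = 0;
const flakyDownload = async () => {
  calls += 1;
  if (calls < 3) throw new Error('transient network error');
  return 'blob contents';
};

withExponentialRetry(flakyDownload)
  .then((v) => console.log(v, 'after', calls, 'calls'));
```

Note that a retry policy alone does not help the question's failure mode if the timeout fires mid-transfer: on a slow link the execution-time budget has to cover the whole download, which is why the snippet also raises `MaximumExecutionTime`.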

Azure, best way to store and deploy static content (e.g. images/css)?

Submitted by 試著忘記壹切 on 2019-11-29 21:07:35
We're about to deploy our .NET web application to an Azure Web Role. I'm just wondering how others have handled their static content, specifically images and CSS. At the moment our application package is about 25 MB, but 18 MB of that comes purely from images: things like navigation buttons, icons, and template components that rarely ever get updated. Would it be wise to partition this out of the deployment package and move it to blob storage? I have a few doubts about this approach that I'm wondering are valid... 80% of our site runs in an HTTPS environment. Will accessing images in a blob …

Handling FileContentResult when file is not found

Submitted by 爷,独闯天下 on 2019-11-29 18:28:41
Question: I have a controller action that downloads a file from an Azure blob based on the container reference name (i.e. the full path name of the file in the blob). The code looks something like this:

```csharp
public FileContentResult GetDocument(String pathName)
{
    try
    {
        Byte[] buffer = BlobStorage.DownloadFile(pathName);
        FileContentResult result = new FileContentResult(buffer, "PDF");
        String[] folders = pathName.Split(new char[] { '\\' }, StringSplitOptions.RemoveEmptyEntries);
        // get the last one as actual …
```
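The snippet splits the blob path on `'\\'` and takes the last segment as the download name, and the usual fix for the not-found case is to catch the storage exception and return a 404 result instead of letting the action throw. The path-splitting and the success/404 decision are pure and can be sketched; the helper names and the fake downloader below are illustrative, not the question's `BlobStorage` API:

```javascript
// Extract the display file name from a backslash-separated blob path,
// mirroring the Split('\\') + take-the-last-segment logic in the question.
function fileNameFromPath(pathName) {
  const parts = pathName.split('\\').filter((p) => p.length > 0);
  return parts.length > 0 ? parts[parts.length - 1] : '';
}

// Decide what the action should return: a file result on success, or a
// 404-style result when the download throws (blob not found).
function documentResult(downloadFn, pathName) {
  try {
    const buffer = downloadFn(pathName);
    return { status: 200, fileName: fileNameFromPath(pathName), buffer };
  } catch (err) {
    return { status: 404, error: 'Document not found: ' + fileNameFromPath(pathName) };
  }
}

// Usage with a fake downloader that only knows one path.
const fakeDownload = (p) => {
  if (p !== 'docs\\reports\\q1.pdf') throw new Error('blob does not exist');
  return Buffer.from('%PDF-1.4');
};
console.log(documentResult(fakeDownload, 'docs\\reports\\q1.pdf').status); // 200
console.log(documentResult(fakeDownload, 'docs\\missing.pdf').status);     // 404
```

In MVC terms the 404 branch would map to `HttpNotFound()` rather than a `FileContentResult`, which is why the action's return type usually widens to `ActionResult`.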

Uploading blockblob and setting contenttype

Submitted by 荒凉一梦 on 2019-11-29 16:37:22
Question: I'm using the Microsoft.WindowsAzure.Storage.* library from C#. This is how I'm uploading things to storage:

```csharp
// Store in storage
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("...connection string...");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("pictures");

// Create container if it doesn't exist
container.CreateIfNotExists();

// Make available to everyone
container.SetPermissions(new …
```
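In that era's SDK the content type is set via `blob.Properties.ContentType` before uploading (or `SetProperties()` afterwards); what usually feeds it is an extension-to-MIME-type lookup, which is pure and easy to sketch. The small map below is illustrative, not exhaustive:

```javascript
// Map a blob name's extension to a MIME type, so the blob is later
// served with a sensible Content-Type instead of application/octet-stream.
const MIME_TYPES = {
  '.png': 'image/png',
  '.jpg': 'image/jpeg',
  '.jpeg': 'image/jpeg',
  '.gif': 'image/gif',
  '.pdf': 'application/pdf',
  '.txt': 'text/plain',
};

function contentTypeFor(blobName) {
  const dot = blobName.lastIndexOf('.');
  const ext = dot >= 0 ? blobName.slice(dot).toLowerCase() : '';
  return MIME_TYPES[ext] || 'application/octet-stream';
}

console.log(contentTypeFor('photo.PNG'));   // image/png
console.log(contentTypeFor('archive.bin')); // application/octet-stream
```

Without this, everything in the container defaults to `application/octet-stream`, and browsers download pictures instead of displaying them.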

How to delete a folder within an Azure blob container

Submitted by 为君一笑 on 2019-11-29 16:27:20
Question: I have a blob container in Azure called pictures that has various folders within it (see snapshot below). I'm trying to delete the folders titled users and uploads shown in the snapshot, but I keep getting the error: "Failed to delete blob pictures/uploads/. Error: The specified blob does not exist." Could anyone shed light on how I can delete those two folders? I haven't been able to uncover anything meaningful by Googling this issue. Note: ask me for more information if you need it.

Answer 1: Windows …
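The error happens because blob "folders" are virtual: pictures/uploads/ is not a blob itself, just a shared name prefix, so there is nothing to delete directly. The standard approach is to list every blob whose name starts with the prefix and delete each one. The prefix selection can be sketched locally (the actual listing and deleting would go through the storage SDK's list-by-prefix and delete calls):

```javascript
// A blob "folder" is just a name prefix. To "delete the folder",
// select every blob under the prefix, then delete those blobs.
function blobsUnderPrefix(blobNames, folder) {
  const prefix = folder.endsWith('/') ? folder : folder + '/';
  return blobNames.filter((name) => name.startsWith(prefix));
}

// Usage: which blobs would be deleted to remove the "uploads" folder?
const names = [
  'uploads/a.png',
  'uploads/2019/b.png',
  'users/avatar.png',
  'banner.png',
];
console.log(blobsUnderPrefix(names, 'uploads'));
// [ 'uploads/a.png', 'uploads/2019/b.png' ]
```

Once the last blob under a prefix is deleted, the "folder" disappears from listings on its own.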

Azure rest api put blob

Submitted by 两盒软妹~` on 2019-11-29 13:15:29
I am trying to put a blob with the Azure REST API. I made a "GET" request successfully, but I had issues with the "PUT" request. When I try to make a "PUT" request I get a 404 error (I have seen the same post on Stack Overflow but it didn't help me). I am not sure if the MessageSignature that I use is correct (I have tried MessageSignaturePut but it didn't work). Any suggestions?

```csharp
public void UploadBlobWithRestAPI(string uri, DateTime now)
{
    string blobName = "test.txt";
    string method = "PUT";
    string sampleContent = "This is sample text.";
    int contentLength = Encoding.UTF8.GetByteCount(sampleContent);
    string …
```

How to set content disposition on individual azure blob requests?

Submitted by 丶灬走出姿态 on 2019-11-29 12:44:57
I have an application that hosts videos, and we recently migrated to Azure. In our old application we gave users the ability to either play or download a video. On Azure, however, it seems I have to pick between the two functions, as the content disposition has to be set on the file and not on the request. So far I have come up with two very poor solutions. The first solution is streaming the download through my MVC server:

```csharp
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["StorageConnectionString"]);
CloudBlobClient blobClient = …
```
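Since the 2013-08-15 service version, a shared access signature can override Content-Disposition per request via the `rscd` (response-content-disposition) query parameter, so one blob can serve both an inline "play" link and an attachment "download" link. The override is part of the signed string, so in practice it is set when the SAS is generated (in the C# SDK via `SharedAccessBlobHeaders`); the sketch below only shows the query-string shape, with signing elided and placeholder values throughout:

```javascript
// Per-request Content-Disposition via SAS: one blob, two URLs.
// The rscd override must be included when the SAS is signed; this sketch
// only builds the query-string shape with a placeholder token.
function withContentDisposition(blobUrl, sasQuery, disposition) {
  const rscd = encodeURIComponent(disposition);
  return blobUrl + '?' + sasQuery + '&rscd=' + rscd;
}

const blobUrl = 'https://myaccount.blob.core.windows.net/videos/clip.mp4';
const sasQuery = 'sv=2019-02-02&sig=FAKESIG'; // placeholder SAS token

const playUrl = withContentDisposition(blobUrl, sasQuery, 'inline');
const downloadUrl = withContentDisposition(
  blobUrl, sasQuery, 'attachment; filename=clip.mp4'
);
console.log(playUrl.endsWith('rscd=inline')); // true
```

This avoids both of the question's workarounds: nothing streams through the MVC server, and the blob's stored properties are never touched.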

Copying storage data from one Azure account to another

Submitted by 僤鯓⒐⒋嵵緔 on 2019-11-29 11:05:17
Question: I would like to copy a very large storage container from one Azure storage account into another (which also happens to be in another subscription). I would like an opinion on the following options: Write a tool that would connect to both storage accounts and copy blobs one at a time using CloudBlob's DownloadToStream() and UploadFromStream(). This seems to be the worst option, because it will incur costs when transferring the data and will also be quite slow, because the data will have to come down to …
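The usual alternative to down-and-up streaming is the server-side Copy Blob operation (`StartCopyFromBlob` in that era's SDK): the destination account pulls directly from the source URL, so the data never passes through the client machine. For a private source blob in another account or subscription, the source URL carries a SAS token. The URL construction itself is simple; a sketch with placeholder account, container, and token values:

```javascript
// Server-side copy needs a source URL the destination account can read.
// For a private blob in another account, append a SAS token to the URL.
function sourceBlobUrl(account, container, blobName, sasToken) {
  const base = 'https://' + account + '.blob.core.windows.net/' +
               container + '/' + encodeURIComponent(blobName);
  return sasToken ? base + '?' + sasToken : base;
}

// Usage: the URL you would hand to a server-side copy call.
console.log(sourceBlobUrl('srcaccount', 'backups', 'db.bak', 'sv=2019-02-02&sig=FAKE'));
// https://srcaccount.blob.core.windows.net/backups/db.bak?sv=2019-02-02&sig=FAKE
```

Because the copy runs inside the storage service, the client only issues one request per blob and polls the copy status, which is dramatically faster than streaming each blob through a VM.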