azure-storage-blobs

Friendly filename when public download Azure blob

Submitted by 情到浓时终转凉″ on 2019-11-27 19:43:52
Is it possible to save a blob under a GUID name (or anything else) but, when a user requests the file's URI http://me.blob.core.windows.net/mycontainer/9BB34783-8F06-466D-AC20-37A03E504E3F, have the download come down with a friendly name, e.g. MyText.txt?

Enabling users to download files (blobs) in Windows Azure can be done in four ways. Direct Download – set the access level of the container to Public Read Access or Full Public Read Access and expose the URL to the end user. The drawback of this method is obviously security – you have no way of controlling access once the URL is exposed. There is also no
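As a rough illustration of the blob-property route (not part of the original answer), the sketch below uses the classic Microsoft.WindowsAzure.Storage .NET SDK to keep the GUID blob name while advertising a friendly download name via Content-Disposition. The connection string is a placeholder, and whether anonymous downloads actually return the header can also depend on the account's default service version (see the ContentDisposition entry further down).

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class FriendlyNameExample
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<connection-string>");   // placeholder
        var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");

        // Blob stored under a GUID name...
        var blob = container.GetBlockBlobReference("9BB34783-8F06-466D-AC20-37A03E504E3F");

        // ...but offered to the browser as MyText.txt.
        blob.FetchAttributes();                       // keep the existing properties intact
        blob.Properties.ContentDisposition = "attachment; filename=\"MyText.txt\"";
        blob.SetProperties();
    }
}
```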

Copy file from URL to Azure BLOB

Submitted by 淺唱寂寞╮ on 2019-11-27 18:56:29
Question: I have a file at a remote URL such as http://www.site.com/docs/doc1.xls and I would like to copy that file into my blob storage account. I know how to upload files to blob storage, but wasn't sure how this can be done for a file from a remote URL.

Answer 1: Try looking at CloudBlockBlob.StartCopyFromBlob, which takes a URI, if you are using the .NET client library. string accountName = "accountname"; string accountKey = "key"; string newFileName = "newfile2.png"; string
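A minimal sketch of that approach (assumed details: the account credentials and container name are placeholders, and on newer SDK versions the call is named StartCopy rather than StartCopyFromBlob):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class CopyFromUrlExample
{
    static void Main()
    {
        string accountName = "accountname";   // placeholder
        string accountKey  = "key";           // placeholder
        string newFileName = "newfile2.png";

        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=" + accountName + ";AccountKey=" + accountKey);
        var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");
        container.CreateIfNotExists();

        var destination = container.GetBlockBlobReference(newFileName);

        // The source must be publicly reachable (or carry a SAS token).
        // The copy runs server-side and is asynchronous; poll destination.CopyState if needed.
        destination.StartCopy(new Uri("http://www.site.com/docs/doc1.xls"));
    }
}
```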

Is it better to have many small Azure storage blob containers (each with some blobs) or one really large container with tons of blobs?

Submitted by 六眼飞鱼酱① on 2019-11-27 18:39:11
So the scenario is the following: I have multiple instances of a web service that write a blob of data to Azure Storage. I need to be able to group blobs into a container (or a virtual directory) depending on when they were received. Once in a while (every day at worst) older blobs will get processed and then deleted. I have two options:

Option 1: I make one container called "blobs" (for example) and store all the blobs in that container. Each blob will use a directory-style name, with the directory name being the time it was received (e.g. "hr0min0/data.bin", "hr0min0/data2.bin",
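For what it's worth, the Option 1 layout can be processed and cleaned up purely by prefix listing. The sketch below is illustrative only (the connection string, container name and "hr0min0/" prefix are assumptions taken from the example names above):

```csharp
using System;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class PrefixCleanupExample
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<connection-string>");   // placeholder
        var container = account.CreateCloudBlobClient().GetContainerReference("blobs");

        // One container, directory-style names such as "hr0min0/data.bin".
        // Listing with a prefix returns only the blobs "inside" that virtual directory.
        foreach (var blob in container.ListBlobs("hr0min0/", useFlatBlobListing: true)
                                      .OfType<CloudBlockBlob>())
        {
            Console.WriteLine("Processing and deleting {0}", blob.Name);
            blob.Delete();
        }
    }
}
```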

Add Cache-Control and Expires headers to Azure Storage Blobs

Submitted by 戏子无情 on 2019-11-27 18:31:18
I'm using Azure Storage to serve up static file blobs, but I'd like to add Cache-Control and Expires headers to the files/blobs when they are served, to reduce bandwidth costs. Applications like CloudXplorer and Cerebrata's Cloud Storage Studio give options to set metadata properties on containers and blobs, but get upset when trying to add Cache-Control. Does anyone know if it's possible to set these headers for files?

I had to run a batch job on about 600k blobs and found two things that really helped: running the operation from a worker role in the same data center. The speed between Azure services is
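Setting the header on a single blob looks roughly like the sketch below (the connection string, container and blob names are placeholders; note that the blob service exposes CacheControl as a settable property, while an Expires value is not a separate blob property in this SDK):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class CacheControlExample
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<connection-string>");   // placeholder
        var container = account.CreateCloudBlobClient().GetContainerReference("static");
        var blob = container.GetBlockBlobReference("css/site.css");       // hypothetical blob

        blob.FetchAttributes();                                    // preserve existing properties
        blob.Properties.CacheControl = "public, max-age=31536000"; // cache for one year
        blob.SetProperties();
    }
}
```

For a 600k-blob batch job the same calls would simply be issued in a loop (ideally from a worker role in the same data center, as the answer above suggests).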

Azure Storage API ContentDisposition

Submitted by 白昼怎懂夜的黑 on 2019-11-27 16:03:55
I see that Azure has released the ContentDisposition property of a blob (http://msdn.microsoft.com/en-us/library/windowsazure/microsoft.windowsazure.storage.blob.blobproperties.contentdisposition(v=azure.10).aspx) in version 3.0 of the API. I've set the property on my existing blobs, but when they are downloaded the Content-Disposition header is not included in the response. I've verified via FetchAttributes that the ContentDisposition property for that blob is in fact populated in Azure. It does work when using SAS, but not when downloading the file without SAS.
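One commonly cited explanation for this behaviour is that anonymous (non-SAS) requests are answered using the storage account's default service version, which may predate Content-Disposition support. A hedged sketch of raising that default with the .NET client library (the connection string is a placeholder):

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.Shared.Protocol;

class DefaultServiceVersionExample
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<connection-string>");   // placeholder
        var client = account.CreateCloudBlobClient();

        // Anonymous downloads use the account's DefaultServiceVersion;
        // 2013-08-15 or later is required for Content-Disposition to be returned.
        ServiceProperties properties = client.GetServiceProperties();
        properties.DefaultServiceVersion = "2013-08-15";
        client.SetServiceProperties(properties);
    }
}
```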

Getting the latest file modified from Azure Blob

Submitted by 北战南征 on 2019-11-27 16:00:39
Say I am generating a couple of JSON files each day in my blob storage. What I want to do is get the latest modified file in any of my directories. So I'd have something like this in my blob storage: 2016/01/02/test.json, 2016/01/02/test2.json, 2016/02/03/test.json. I want to get 2016/02/03/test.json. One way is to take the full path of each file and do a regex check to find the latest directory created, but this doesn't work if I have more than one JSON file in each directory. Is there anything like File.GetLastWriteTime to get the latest modified file? I am using this code to get all the files, by the way
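A minimal sketch of one way to do this with the classic .NET SDK (the connection string and container name are placeholders): list everything flat, then order by the LastModified property rather than parsing the path.

```csharp
using System;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class LatestBlobExample
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<connection-string>");   // placeholder
        var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");

        // A flat listing walks every virtual directory, so nested names such as
        // 2016/02/03/test.json come back as ordinary blobs.
        var newest = container.ListBlobs(useFlatBlobListing: true)
                              .OfType<CloudBlockBlob>()
                              .OrderByDescending(b => b.Properties.LastModified)
                              .FirstOrDefault();

        if (newest != null)
            Console.WriteLine("{0} ({1})", newest.Name, newest.Properties.LastModified);
    }
}
```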

Getting list of names of Azure blob files in a container?

Submitted by 喜欢而已 on 2019-11-27 15:32:20
Question: I need to list the names of Azure blob files. Currently I am able to list all files with their URLs, but I just need a list of names and want to avoid parsing them out. Can you please look at my code below and advise: CloudStorageAccount backupStorageAccount = CloudStorageAccount.Parse(blobConectionString); var backupBlobClient = backupStorageAccount.CreateCloudBlobClient(); var backupContainer = backupBlobClient.GetContainerReference(container); var list = backupContainer.ListBlobs();

Answer 1: If you're using
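A sketch of the usual pattern with this client library (the connection string and container name are placeholders): ListBlobs returns IListBlobItem, so cast to the concrete blob type and read its Name property instead of parsing the URI.

```csharp
using System;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class ListBlobNamesExample
{
    static void Main()
    {
        var backupStorageAccount = CloudStorageAccount.Parse("<blob-connection-string>"); // placeholder
        var backupBlobClient = backupStorageAccount.CreateCloudBlobClient();
        var backupContainer = backupBlobClient.GetContainerReference("mycontainer");

        // Name is exposed on the concrete blob types, not on IListBlobItem.
        var names = backupContainer.ListBlobs(useFlatBlobListing: true)
                                   .OfType<CloudBlockBlob>()
                                   .Select(b => b.Name)
                                   .ToList();

        names.ForEach(Console.WriteLine);
    }
}
```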

Azure Blob 400 Bad request on Creation of container

Submitted by 心不动则不痛 on 2019-11-27 15:31:31
Question: I'm developing an ASP.NET MVC 4 app and I'm using Azure Blob storage to store the images that my users are going to upload. I have the following code: var storageAccount = CloudStorageAccount.Parse(ConfigurationManager.ConnectionStrings["StorageConnection"].ConnectionString); var blobStorage = storageAccount.CreateCloudBlobClient(); //merchantKey is just a GUID that is associated with the merchant var containerName = ("ImageAds-" + merchant.merchantKey.ToString()).ToLower(); CloudBlobContainer
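A 400 on container creation is often a naming problem (container names must be 3-63 characters of lowercase letters, digits and dashes), but the quickest way to find out is to read the error the service actually returned. A hedged sketch of that, keeping the shape of the code above (the merchant key is replaced with a hypothetical GUID):

```csharp
using System;
using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class CreateContainerExample
{
    static void Main()
    {
        var storageAccount = CloudStorageAccount.Parse(
            ConfigurationManager.ConnectionStrings["StorageConnection"].ConnectionString);
        var blobStorage = storageAccount.CreateCloudBlobClient();

        // Hypothetical merchant key standing in for merchant.merchantKey.
        var containerName = ("ImageAds-" + Guid.NewGuid()).ToLower();
        var container = blobStorage.GetContainerReference(containerName);

        try
        {
            container.CreateIfNotExists();
        }
        catch (StorageException ex)
        {
            // The service's own message usually pinpoints the cause of the 400.
            Console.WriteLine(ex.RequestInformation.HttpStatusCode);
            if (ex.RequestInformation.ExtendedErrorInformation != null)
                Console.WriteLine(ex.RequestInformation.ExtendedErrorInformation.ErrorMessage);
        }
    }
}
```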

Reading data from Azure Blob with Spark

Submitted by 心不动则不痛 on 2019-11-27 14:11:39
I am having an issue reading data from Azure blobs via Spark Streaming: JavaDStream<String> lines = ssc.textFileStream("hdfs://ip:8020/directory"); Code like the above works for HDFS, but it is unable to read a file from an Azure blob: https://blobstorage.blob.core.windows.net/containerid/folder1/ The above is the path shown in the Azure UI, but it doesn't work. Am I missing something, and how can we access it? I know Event Hubs are the ideal choice for streaming data, but my current situation demands using storage rather than queues.

In order to read data from blob storage, there are two things that need to be

How to create a sub container in azure storage location

Submitted by 前提是你 on 2019-11-27 11:11:19
How do I create a sub-container in an Azure storage location? Please let us know.

tobint: Windows Azure doesn't provide the concept of hierarchical containers, but it does provide a mechanism to traverse hierarchy by convention and API. All containers are stored at the same level. You can gain similar functionality by using naming conventions for your blob names. For instance, you may create a container named "content" and create blobs with the following names in that container: content/blue/images/logo.jpg content/blue/images/icon-start.jpg content/blue/images/icon-stop.jpg content/red/images
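A small sketch of that convention with the .NET client library (the connection string is a placeholder and the names follow the example above): the slashes are just part of the blob name, and GetDirectoryReference lets you traverse them as if they were folders.

```csharp
using System;
using System.Linq;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class VirtualDirectoryExample
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<connection-string>");   // placeholder
        var container = account.CreateCloudBlobClient().GetContainerReference("content");
        container.CreateIfNotExists();

        // No sub-container is created; the "directories" live inside the blob name.
        var blob = container.GetBlockBlobReference("content/blue/images/logo.jpg");
        blob.UploadText("placeholder content");

        // Traverse the convention as if it were a hierarchy.
        var directory = container.GetDirectoryReference("content/blue/images");
        foreach (var item in directory.ListBlobs().OfType<CloudBlockBlob>())
            Console.WriteLine(item.Name);
    }
}
```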