azure-storage-blobs

Retention Policy for Azure Containers?

若如初见 · Submitted on 2019-12-05 12:37:57
I'm looking to set up a policy for one of my containers so that it only retains data for x days. If x is 30, the container should only contain files that are less than 30 days old; files sitting in the container for more than 30 days should be discarded. Is there any way I can configure that? Currently this kind of policy is not supported natively by Azure Blob Storage. You would need to write something of your own that runs periodically, checks blob ages, and deletes old blobs. On a side note, this feature request has been pending for a long time (since 2011): https://feedback.azure.com/forums
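A minimal sketch of what such a periodic cleanup job's core logic could look like. The age check itself is pure logic; the SDK calls in the comment are a rough outline of the azure-storage-blob v12 API, and the container name is illustrative:

```python
from datetime import datetime, timedelta, timezone

RETENTION_DAYS = 30

def is_expired(last_modified, now=None, retention_days=RETENTION_DAYS):
    """Return True if a blob's last-modified time is older than the retention window."""
    now = now or datetime.now(timezone.utc)
    return now - last_modified > timedelta(days=retention_days)

# With the azure-storage-blob SDK the periodic job would look roughly like:
#
#   container = BlobServiceClient.from_connection_string(conn_str) \
#       .get_container_client("mycontainer")
#   for blob in container.list_blobs():
#       if is_expired(blob.last_modified):
#           container.delete_blob(blob.name)

now = datetime(2019, 12, 5, tzinfo=timezone.utc)
print(is_expired(datetime(2019, 10, 1, tzinfo=timezone.utc), now=now))   # True: 65 days old
print(is_expired(datetime(2019, 11, 20, tzinfo=timezone.utc), now=now))  # False: 15 days old
```

Scheduling could be anything that runs on an interval (a cron job, an Azure Function on a timer trigger, a WebJob).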

Python Azure SDK: Using list_blobs to get more than 5.000 Results

爷，独闯天下 · Submitted on 2019-12-05 09:32:09
I'm having trouble with the Python Azure SDK and haven't found anything on Stack Overflow or in the MSDN forums. I want to use the Azure SDK's list_blobs() to get a list of blobs - there are more than 5,000 (which is the maximum number of results per call). If I take a look at the code in the SDK itself, I see the following: def list_blobs(self, container_name, prefix=None, marker=None, maxresults=None, include=None, delimiter=None): The description for 'marker' is: marker: Optional. A string value that identifies the portion of the list to be returned with the next list operation. The operation returns a
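The marker is a continuation token: each call returns up to maxresults items plus a marker for the next page, and you loop until the marker comes back empty. A sketch of that loop against a stand-in pager (the `(items, next_marker)` contract mirrors the legacy SDK, where the result of `list_blobs()` carries a `next_marker` attribute you pass back in; newer SDK versions auto-page for you):

```python
def list_all_blobs(list_page):
    """Drain a marker-paginated listing API.

    `list_page(marker)` must return (items, next_marker), where next_marker is
    None/empty once there are no more pages.
    """
    results, marker = [], None
    while True:
        items, marker = list_page(marker)
        results.extend(items)
        if not marker:
            return results

# Stand-in for the service: 12 blobs served 5 at a time.
BLOBS = [f"blob{i}" for i in range(12)]

def fake_page(marker):
    start = int(marker or 0)
    page = BLOBS[start:start + 5]
    next_marker = str(start + 5) if start + 5 < len(BLOBS) else None
    return page, next_marker

print(len(list_all_blobs(fake_page)))  # 12
```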

Configure CORS by using Azure Resource Manager template

落爺英雄遲暮 · Submitted on 2019-12-05 08:14:01
I'm trying to set a CORS rule for my storage account as suggested under "Configure CORS by using Azure Resource Manager tools" here: https://docs.microsoft.com/en-us/azure/app-service-api/app-service-api-cors-consume-javascript by adding a cors property: "resources": [ { "type": "Microsoft.Storage/storageAccounts", "sku": { "name": "Standard_RAGRS", "tier": "Standard" }, "kind": "Storage", "name": "[parameters('storageAccounts_teststoragejkjk_name')]", "apiVersion": "2016-01-01", "location": "westus", "tags": {}, "properties": { "cors": {"allowedOrigins": ["*"]} }, "resources": [], "dependsOn": [] }
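For reference, later Storage Resource Provider API versions expose CORS as a `blobServices` child resource rather than a property on the storage account itself. A hedged sketch of that shape (the parameter name and rule values are illustrative, not taken from the question's template):

```json
{
  "type": "Microsoft.Storage/storageAccounts/blobServices",
  "apiVersion": "2018-07-01",
  "name": "[concat(parameters('storageAccountName'), '/default')]",
  "dependsOn": [
    "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName'))]"
  ],
  "properties": {
    "cors": {
      "corsRules": [
        {
          "allowedOrigins": ["*"],
          "allowedMethods": ["GET", "OPTIONS"],
          "allowedHeaders": ["*"],
          "exposedHeaders": ["*"],
          "maxAgeInSeconds": 3600
        }
      ]
    }
  }
}
```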

How to upload images using postman to azure blob storage

大憨熊 · Submitted on 2019-12-05 07:20:42
Question: I have been trying to upload an image to a folder in my blob container using Postman. Below is the screenshot. Here is the link to "Authorization of Azure Storage service REST API" that I am using to generate the signature, and I am attaching the file in a form field in the request body. var key = "[Storage account key]"; var strTime = (new Date()).toUTCString(); var strToSign = 'PUT\n\nimage/jpeg; charset=UTF-8\n\nx-ms-date:' + strTime + '\nx-ms-meta-m1:v1\nx-ms-meta-m2:v2\n/colony7/folder-customer-profilepic/Home -
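The signing step the snippet performs is Shared Key authorization: HMAC-SHA256 the string-to-sign with the base64-decoded account key, then base64-encode the digest. A runnable sketch of just that step (the key and blob path here are illustrative placeholders mirroring the question's string-to-sign shape, not real values):

```python
import base64
import hashlib
import hmac

def sign(string_to_sign, account_key_b64):
    """Shared Key signature: HMAC-SHA256 of the string-to-sign, keyed with the
    base64-decoded account key, with the digest base64-encoded again."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode()

# Illustrative values only (not a real key or path).
account_key = base64.b64encode(b"not-a-real-key").decode()
string_to_sign = (
    "PUT\n\nimage/jpeg; charset=UTF-8\n\n"
    "x-ms-date:Thu, 05 Dec 2019 07:20:42 GMT\n"
    "x-ms-meta-m1:v1\nx-ms-meta-m2:v2\n"
    "/colony7/folder-customer-profilepic/myimage.jpg"
)
signature = sign(string_to_sign, account_key)
# The request's Authorization header is then built as:
#   "SharedKey <account name>:" + signature
```

A common failure mode with Postman is a mismatch between the headers actually sent and the ones included in the string-to-sign; the two must agree byte for byte.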

Delete folder(s) inside Azure Blob storage container

雨燕双飞 · Submitted on 2019-12-05 05:32:58
I have a container named "pictures", with some folders named "Folder1" and "Folder2" inside it. Files in my blob storage are therefore addressed like this: http://optimus.blob.core.windows.net/pictures/Folder1/IMG123.png . I am using the C# code below to delete the files inside a folder: CloudStorageAccount storageAccount = CloudStorageAccount.Parse(*AzureConnectionString*); CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient(); CloudBlobContainer container = blobClient.GetContainerReference("pictures"); var blobs = container.ListBlobs("Folder1", useFlatBlobListing: true); foreach (var blob in blobs) { ((CloudBlockBlob)blob).DeleteIfExists(); }
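The key point is that Blob Storage has no real folders: "Folder1" is just a name prefix, so deleting a "folder" means deleting every blob whose name starts with that prefix. A small sketch of the prefix selection (the SDK call in the comment is a rough outline of the azure-storage-blob v12 API):

```python
def blobs_under_folder(blob_names, folder):
    """Blob 'folders' are just name prefixes: select every blob whose
    name starts with '<folder>/'."""
    prefix = folder.rstrip("/") + "/"
    return [name for name in blob_names if name.startswith(prefix)]

names = ["Folder1/IMG123.png", "Folder1/IMG124.png", "Folder2/IMG200.png", "root.png"]
print(blobs_under_folder(names, "Folder1"))
# ['Folder1/IMG123.png', 'Folder1/IMG124.png']

# With the azure-storage-blob SDK the delete loop would be roughly:
#   for blob in container_client.list_blobs(name_starts_with="Folder1/"):
#       container_client.delete_blob(blob.name)
```

Once the last blob under a prefix is gone, the "folder" disappears too, since it never existed as a separate object.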

Facading Azure storage blob with sFTP service

会有一股神秘感。 · Submitted on 2019-12-05 05:28:11
We have a requirement to create large (1 GB to 16 GB) raw data reports, then compress and encrypt them. Our customers will consume those reports over sFTP. We are replacing an existing implementation, so the change should be transparent to our customers. Azure Blob service does not expose an sFTP endpoint, so we will need some way to front it with an sFTP service, similar to the "FTP to Azure Blob Storage Bridge" sample based on a worker role. The worker role will expose an sFTP endpoint to the outside world. We will set up a container per customer and limit access to the worker roles only, so the containers will be protected

Azure IoT Hub - Save telemetry best practice

瘦欲@ · Submitted on 2019-12-05 03:27:09
I am working on an IoT solution that will save weather data. I have been googling for some days now on how to set up the backend. I am going to use Azure IoT Hub for handling communication, but the next step is the problem: I want to store the telemetry in a database, and this is where I get confused. Some examples say I should use Azure Blob storage, others Azure Table storage or Azure SQL. After some years of data collection I want to start creating reports from the data, so the storage needs to work well with big data. The next problem I am stuck on is the worker that will receive the D2C and
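Whatever store is chosen, the worker's core job is shaping each device-to-cloud (D2C) message into a queryable record. A hedged sketch of that step for a Table-storage-style entity, assuming a JSON message format and field names that are purely illustrative; partitioning by device plus day keeps partitions bounded and makes date-range queries cheap:

```python
import json
from datetime import datetime

def to_entity(message_json):
    """Shape a D2C weather message into a Table-storage-style entity.

    PartitionKey = device + day bounds partition size; RowKey = time-of-day
    keeps rows within a partition in chronological order.
    """
    msg = json.loads(message_json)
    ts = datetime.fromisoformat(msg["timestamp"])
    return {
        "PartitionKey": f"{msg['deviceId']}-{ts:%Y%m%d}",
        "RowKey": ts.strftime("%H%M%S%f"),
        "Temperature": msg["temperature"],
        "Humidity": msg["humidity"],
    }

entity = to_entity(
    '{"deviceId": "station-01", "timestamp": "2019-12-05T03:27:09+00:00", '
    '"temperature": 4.2, "humidity": 81}'
)
print(entity["PartitionKey"])  # station-01-20191205
```

The same shaping logic would apply whether the sink is Table storage, SQL, or blobs of batched records; only the write call differs.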

Azure Blob Storage: Replicating Geographically

岁酱吖の · Submitted on 2019-12-05 03:01:38
Question: I have a primary blob storage in the West Europe region that contains user-uploaded files, accessed through a web application within the same region. This suffers from high latency if you're in, say, the East US region. So I add another instance of the application in the East US region and use Traffic Manager to route between the two instances based on performance. Users in the East US region now talk to the application instance in their own region. The problem is, now the East US instance is talking
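One built-in option for cross-region reads is RA-GRS, which exposes a read-only secondary endpoint at a predictable hostname (the account name with a `-secondary` suffix). The helper below just derives that hostname; the account name is illustrative. Note the caveat: the secondary lives in the primary's fixed paired region, not a region of your choosing, so for arbitrary region pairs a CDN or app-level replication may fit better:

```python
def secondary_blob_endpoint(account_name):
    """With RA-GRS, Azure exposes a read-only secondary blob endpoint whose
    hostname is the account name with a '-secondary' suffix."""
    return f"https://{account_name}-secondary.blob.core.windows.net"

print(secondary_blob_endpoint("mystorageacct"))
# https://mystorageacct-secondary.blob.core.windows.net
```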

How do I save byte arrays i.e. byte[] to Azure Blob Storage?

限于喜欢 · Submitted on 2019-12-05 00:23:16
I know how to save Streams, but I want to take that stream and create thumbnails and other resized images, and I don't know how to save a byte[] to Azure Blob Storage. This is what I'm doing now to save the Stream: // Retrieve reference to a blob named "myblob". CloudBlockBlob blockBlob = container.GetBlockBlobReference("SampleImage.jpg"); // upload from a Stream object during file upload blockBlob.UploadFromStream(stream); // But what about pushing a byte[] array? I want to create thumbnails and do some image manipulation. This used to be in the Storage Client library (version 1.7 for sure) - but
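The later .NET storage client does expose an UploadFromByteArray overload, but the language-agnostic fallback works everywhere: wrap the byte array in an in-memory stream and hand that to any stream-based upload API. A sketch of that wrapping (the SDK call in the comment is a rough outline of the azure-storage-blob v12 Python API; the thumbnail bytes are fake):

```python
import io

def bytes_to_stream(data):
    """Wrap an in-memory byte array in a file-like stream, so any API that
    uploads from a stream can also upload raw bytes (e.g. generated thumbnails)."""
    return io.BytesIO(data)

thumbnail = b"\x89PNG...fake thumbnail bytes"
stream = bytes_to_stream(thumbnail)
print(stream.read() == thumbnail)  # True

# With the azure-storage-blob SDK v12 this would be roughly:
#   blob_client.upload_blob(bytes_to_stream(thumbnail), overwrite=True)
# (upload_blob also accepts a bytes object directly.)
```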

Windows Azure Storage Emulator Connection String for ASP.NET MVC?

落爺英雄遲暮 · Submitted on 2019-12-05 00:22:49
Question: I am searching for the connection string that needs to be defined to use the Windows Azure storage emulator. So far, all the sources I have found say these connection strings should go in the ServiceDefinition and ServiceConfiguration files of a Windows Azure project. However, I am not using an Azure project but ASP.NET MVC 3. For an ASP.NET MVC project, it should presumably go in the web.config file, but I have no idea what it should look like. I have an Azure account if that is needed for
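The emulator uses a fixed, well-known shortcut connection string, `UseDevelopmentStorage=true`, which can live in web.config like any other setting. A sketch (the key name `StorageConnectionString` is illustrative; read it back with `ConfigurationManager.AppSettings` or CloudConfigurationManager):

```xml
<configuration>
  <appSettings>
    <!-- Shortcut connection string recognized by the storage client
         for the local storage emulator. -->
    <add key="StorageConnectionString" value="UseDevelopmentStorage=true" />
  </appSettings>
</configuration>
```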