azure-storage-blobs

If using ImageResizer with Azure blobs do I need the AzureReader2 plugin?

Submitted by 你。 on 2019-12-01 06:24:00
Question: I'm working on a personal project to manage users of my club; it's hosted on the free Azure package (for now at least), partly as an experiment to try out Azure. Part of creating their records is to add a photo, so I've got a Contact Card view that lets me see who they are, when they came and a photo. I have installed ImageResizer and it's really easy to resize the 10MP photos from my camera and save them to the file system locally, but it seems that for Azure I need to use their Blobs to…
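One note that bears on the question: AzureReader2 is the plugin that lets ImageResizer serve and resize images directly from blob storage URLs at request time; resizing at upload time needs only the core ImageResizer library plus the storage SDK. A minimal sketch, assuming the classic Microsoft.WindowsAzure.Storage SDK — the "photos" container, the size limits, and the method shape are illustrative, not the poster's code:

```csharp
// Sketch only: resize an uploaded photo in memory with ImageResizer and
// store the result in blob storage. No local file system is involved,
// which matters on Azure's free tier.
using System.IO;
using ImageResizer;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class PhotoUploader
{
    public static void ResizeAndUpload(Stream uploadedPhoto, string blobName, string connectionString)
    {
        using (var resized = new MemoryStream())
        {
            // Resize entirely in memory; settings use ImageResizer's querystring syntax.
            ImageBuilder.Current.Build(uploadedPhoto, resized,
                new ResizeSettings("maxwidth=400&maxheight=400&format=jpg"));
            resized.Position = 0;

            var account = CloudStorageAccount.Parse(connectionString);
            var container = account.CreateCloudBlobClient().GetContainerReference("photos");
            container.CreateIfNotExists();
            container.GetBlockBlobReference(blobName).UploadFromStream(resized);
        }
    }
}
```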

Transfer file from Azure Blob Storage to Google Cloud Storage programmatically

Submitted by 心不动则不痛 on 2019-12-01 05:47:12
Question: I have a number of files that I transferred into Azure Blob Storage via the Azure Data Factory. Unfortunately, this tool doesn't appear to set the Content-MD5 value for any of the blobs, so when I pull that value from the Blob Storage API, it's empty. I'm aiming to transfer these files out of Azure Blob Storage and into Google Storage. The documentation I'm seeing for Google's Storagetransfer service at https://cloud.google.com/storage/transfer/reference/rest/v1/TransferSpec#HttpData…
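Google's Storage Transfer HttpData source requires an MD5 for each object it fetches over HTTP, so one workaround is to backfill the missing Content-MD5 property on the blobs before exporting the URL list. A hedged sketch with the classic .NET SDK — the container name and connection string are placeholders, and each blob is read back once to hash it:

```csharp
// Sketch only: compute MD5 for blobs that lack it and persist it as the
// Content-MD5 property, without re-uploading the blob content.
using System;
using System.Security.Cryptography;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class Md5Backfill
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true"); // placeholder
        var container = account.CreateCloudBlobClient().GetContainerReference("mycontainer");

        foreach (var item in container.ListBlobs(null, useFlatBlobListing: true))
        {
            if (!(item is CloudBlockBlob blob)) continue;
            if (!string.IsNullOrEmpty(blob.Properties.ContentMD5)) continue;

            using (var md5 = MD5.Create())
            using (var stream = blob.OpenRead())
            {
                // Content-MD5 is stored base64-encoded.
                blob.Properties.ContentMD5 = Convert.ToBase64String(md5.ComputeHash(stream));
            }
            blob.SetProperties(); // persists the property on the service
        }
    }
}
```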

PowerShell script error: the string is missing the terminator:

Submitted by 戏子无情 on 2019-12-01 04:58:07
Question: Incredibly simple PowerShell script...

```powershell
#Server side storage copy
$SourceStorageAccount = "myStorageAccount"
$SourceStorageKey = "myKey"
$SourceStorageContext = New-AzureStorageContext –StorageAccountName $SourceStorageAccount -StorageAccountKey $SourceStorageKey
```

fails with the error:

```
At E:\DeploymentScripts\Storage\Test.ps1:6 char:51
+ ... geContext –StorageAccountName $SourceStorageAccount -StorageAccount ...
+               ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The string is missing…
```
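A likely diagnosis rather than a verified fix: in the snippet above, the dash before `StorageAccountName` is a typographic en-dash (`–`, U+2013) rather than the ASCII hyphen (`-`) used elsewhere on the same line. This commonly happens when a command is copied from a web page or Word document, and PowerShell's parser can report the resulting token as a missing string terminator. Retyping the parameter by hand as `-StorageAccountName` should resolve the error.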

Unable to copy blobs from one container to another

Submitted by 纵饮孤独 on 2019-12-01 04:47:01
Question: I am creating a console app that copies all blobs in all containers from an account we use for production to another we use for development. I have the following method to do this. The 'productionStorage' and 'developmentStorage' objects are in another assembly where Azure storage client methods are housed.

```csharp
static void CopyBlobsToDevelopment()
{
    // Get a list of containers in production
    List<CloudBlobContainer> productionBlobContainers = productionStorage.GetContainerList();

    // For each…
```
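Whatever sits inside the truncated loop, a frequent cause of copy failures between two accounts is authorization: a server-side copy is executed by the destination service, which can only read a private source blob if a SAS is appended to the source URI. A minimal sketch with the classic SDK; the helper name is illustrative:

```csharp
// Sketch only: server-side copy of one blob between accounts. The source
// must be readable by the destination service, so we attach a read-only SAS.
using System;
using Microsoft.WindowsAzure.Storage.Blob;

static class BlobCopier
{
    public static void CopyAcrossAccounts(CloudBlockBlob source, CloudBlobContainer destContainer)
    {
        // Short-lived read-only SAS on the source blob.
        string sas = source.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1)
        });

        var dest = destContainer.GetBlockBlobReference(source.Name);
        // StartCopy is asynchronous on the service side; poll dest.CopyState
        // if completion must be confirmed before moving on.
        dest.StartCopy(new Uri(source.Uri + sas));
    }
}
```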

ImportError: cannot import name 'BlobService' when using Azure Backend

Submitted by 为君一笑 on 2019-12-01 03:34:21
Question: I followed these instructions to set up Azure as my backend service: http://django-storages.readthedocs.io/en/latest/backends/azure.html Also added additional packages per this document: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-python-how-to-use-blob-storage Getting this error:

```
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/storages/backends/azure_storage.py", line 23, in <module>
    from azure.storage.blob.blobservice import BlobService…
```
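A likely explanation, offered as a hint rather than a verified fix: `BlobService` only exists in old releases of the `azure-storage` package; from version 0.30 the client was split into `BlockBlobService` and related classes. A django-storages build that does `from azure.storage.blob.blobservice import BlobService` therefore needs the legacy package (for example `pip install azure-storage==0.20.0`), while newer django-storages releases import the newer client instead. Matching the versions of the two packages is the usual resolution.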

The MAC signature found in the HTTP request '…' is not the same as any computed signature

Submitted by 谁说胖子不能爱 on 2019-12-01 03:18:39
I'm sending the following request in Postman to retrieve a simple .jpg from Azure Blob storage at this URL: https://steamo.blob.core.windows.net/testcontainer/dog.jpg

```
GET /testcontainer/dog.jpg HTTP/1.1
Host: steamo.blob.core.windows.net
Authorization: SharedKey steamo:<my access key>
x-ms-date: Tue, 26 May 2015 17:35:00 GMT
x-ms-version: 2014-02-14
Cache-Control: no-cache
Postman-Token: b1134f8a-1a03-152c-2810-9cb351efb9ce
```

If you're unfamiliar with Postman, it is just a REST client; the Postman-Token header can probably be ignored. My access key is copied from my Azure Management Portal. I get…
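A hedged observation: the Authorization header above appears to carry the account key itself, but SharedKey auth expects `SharedKey <account>:<signature>`, where the signature is an HMAC-SHA256 over a canonical string-to-sign, keyed with the base64-decoded account key — which would explain why the service's computed signature never matches. A sketch of the computation for a simple GET under x-ms-version 2014-02-14 (not Postman-specific; header edge cases vary by service version):

```csharp
// Sketch only: build the SharedKey Authorization value for a bodiless GET.
using System;
using System.Security.Cryptography;
using System.Text;

class SharedKeySigner
{
    public static string Sign(string account, string base64Key, string xmsDate, string canonicalizedResource)
    {
        // String-to-sign: VERB, then eleven standard headers (all empty for a
        // plain GET), then the canonicalized x-ms-* headers, then the resource.
        string stringToSign =
            "GET\n" +
            "\n\n\n\n\n\n\n\n\n\n\n" +
            $"x-ms-date:{xmsDate}\nx-ms-version:2014-02-14\n" +
            canonicalizedResource; // e.g. "/steamo/testcontainer/dog.jpg"

        using (var hmac = new HMACSHA256(Convert.FromBase64String(base64Key)))
        {
            string signature = Convert.ToBase64String(
                hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign)));
            return $"SharedKey {account}:{signature}";
        }
    }
}
```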

How to use SharedAccessSignature to access blobs

Submitted by 守給你的承諾、 on 2019-12-01 02:04:48
Question: I am trying to access a blob stored in a private container in Windows Azure. The container has a Shared Access Signature, but when I try to access the blob I get a StorageClientException: "Server failed to authenticate the request. Make sure the Authorization header is formed correctly including the signature". The code that created the container and uploaded the blob looks like this:

```csharp
// create the container, set a Shared Access Signature, and share it
// first thing to do is to create the…
```
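A common source of that 403 is a mismatch between how the SAS is created and how it is consumed. As a reference point, a minimal sketch of creating a container-level read SAS and then using it, assuming the classic Microsoft.WindowsAzure.Storage SDK; the names and four-hour expiry are illustrative:

```csharp
// Sketch only: issue a read SAS on a container, then read a blob with it.
using System;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;

class SasDemo
{
    public static string CreateReadSas(CloudBlobContainer container)
    {
        var policy = new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(4)
        };
        return container.GetSharedAccessSignature(policy); // begins with "?"
    }

    public static void ReadWithSas(Uri blobUri, string sas)
    {
        // Build the client from the SAS alone; no account key involved.
        var blob = new CloudBlockBlob(blobUri, new StorageCredentials(sas));
        Console.WriteLine(blob.DownloadText());
    }
}
```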

C# retrieving a list of blobs from Azure

Submitted by 时光总嘲笑我的痴心妄想 on 2019-12-01 00:42:44
I need to have some archive cleanup code to remove old Azure logs after a certain retention period has passed. I am aware that I can do this:

```csharp
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("ctr");
var blobList = container.ListBlobs();
foreach (var blob in blobList)
{
    logger.Info($"Blob Name: {blob.Uri}");
}
```

However, within my container the structure is /year/month/day/hour/files, so right now there is /2017/5/11/14/files, /2017/5/11/17/files…
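Since the listing above walks the whole container, it may help that `ListBlobs` also accepts a prefix, and a flat listing returns the actual blobs under it rather than virtual directories. A sketch of deleting one hour's worth of logs; the unpadded year/month/day/hour prefix format is assumed from the layout described in the question:

```csharp
// Sketch only: list blobs under a date-based prefix and delete them.
using System;
using Microsoft.WindowsAzure.Storage.Blob;

class LogCleanup
{
    public static void DeleteHour(CloudBlobContainer container, DateTime hour)
    {
        // e.g. "2017/5/11/14/" — no leading slash, no zero padding.
        string prefix = $"{hour.Year}/{hour.Month}/{hour.Day}/{hour.Hour}/";

        foreach (var item in container.ListBlobs(prefix, useFlatBlobListing: true))
        {
            if (item is CloudBlockBlob blob)
            {
                blob.Delete();
            }
        }
    }
}
```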

How best to convert a CSV stored in an Azure blob to a pandas dataframe while running a notebook in Azure ML

Submitted by 家住魔仙堡 on 2019-11-30 20:48:27
I have a number of large csv (tab-delimited) files stored as Azure blobs, and I want to create pandas dataframes from them. I can do this locally as follows:

```python
from azure.storage.blob import BlobService
import pandas as pd
import os.path

STORAGEACCOUNTNAME = 'account_name'
STORAGEACCOUNTKEY = "key"
LOCALFILENAME = 'path/to.csv'
CONTAINERNAME = 'container_name'
BLOBNAME = 'bloby_data/000000_0'

blob_service = BlobService(account_name=STORAGEACCOUNTNAME, account_key=STORAGEACCOUNTKEY)

# Only get a local copy if haven't already got it
if not os.path.isfile(LOCALFILENAME):
    blob_service.get_blob_to_path…
```
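A hedged pointer rather than a full answer: the legacy BlobService also exposes in-memory downloads, so the local file can be skipped entirely by reading the blob into bytes and handing them to pandas — for example wrapping the result of `get_blob_to_bytes(CONTAINERNAME, BLOBNAME)` in `io.BytesIO` and passing that to `pd.read_csv(..., sep='\t')`. Exact return types vary between azure-storage releases (the newer BlockBlobService returns a Blob object whose payload is in `.content`), so check the installed version.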

Does Windows Azure Blob Storage support serving compressed files similar to Amazon S3?

Submitted by Deadly on 2019-11-30 20:29:38
For example, at Amazon S3 there is a convention: if you have both 'bundle.js' and 'bundle.js.gz' uploaded to the server, and a client requests the 'bundle.js' file with an 'Accept-Encoding: gzip' header, Amazon S3 will serve the compressed version of the file ('bundle.js.gz' instead of 'bundle.js'). Does Windows Azure Storage support this? If not, what are workarounds? Azure Storage allows you to define a Content-Encoding property on a blob. For compressed content, you could set this property to GZIP, and when this content is served to a browser, the browser automatically decompresses the content and…
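To make the answer fragment above concrete, a hedged sketch: upload the pre-compressed bytes once and stamp the blob with Content-Encoding: gzip. One caveat worth knowing: unlike the S3 convention described in the question, Azure Storage serves the blob bytes as-is regardless of the client's Accept-Encoding header, so clients that cannot handle gzip will receive compressed bytes. File names and the content type below are illustrative:

```csharp
// Sketch only: gzip a file in memory and upload it with
// Content-Encoding: gzip so browsers transparently decompress it.
using System.IO;
using System.IO.Compression;
using Microsoft.WindowsAzure.Storage.Blob;

class GzipUpload
{
    public static void UploadGzipped(CloudBlobContainer container, string path)
    {
        var blob = container.GetBlockBlobReference(Path.GetFileName(path)); // e.g. "bundle.js"
        using (var compressed = new MemoryStream())
        {
            using (var source = File.OpenRead(path))
            using (var gzip = new GZipStream(compressed, CompressionMode.Compress, leaveOpen: true))
            {
                source.CopyTo(gzip);
            } // disposing the GZipStream flushes the compressed data

            compressed.Position = 0;
            blob.Properties.ContentEncoding = "gzip";
            blob.Properties.ContentType = "application/javascript";
            blob.UploadFromStream(compressed); // properties are sent with the upload
        }
    }
}
```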