azure-storage-blobs

Converting Azure “classic” storage accounts

我的未来我决定 submitted on 2019-12-06 09:13:45
I've created some Azure Machine Learning Workspaces and associated them with "classic" storage accounts, but would like to have them associated with "not-classic" (or whatever the term is) storage accounts. Is there a way to convert the storage accounts from "classic", or to change the storage account associated with a Machine Learning Workspace? As of today, there's no automatic way of converting a "Classic" storage account into an "Azure Resource Manager (ARM)" storage account; you would need to copy the data from the classic storage account to a new storage account. Having said that, there's
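For the copy step itself, a minimal Python sketch using the azure-storage-blob package could look like the following; the connection strings and container name are placeholders, not details from the question:

from azure.storage.blob import BlobServiceClient

# Copy every blob in one container from the classic account to the new ARM account.
classic = BlobServiceClient.from_connection_string("<classic-account-connection-string>")
arm = BlobServiceClient.from_connection_string("<arm-account-connection-string>")

src = classic.get_container_client("mycontainer")
dst = arm.get_container_client("mycontainer")
dst.create_container()  # raises if the container already exists

for blob in src.list_blobs():
    # For a private source container, append a SAS token to the source URL
    # so the destination service is authorised to read from it.
    source_url = src.get_blob_client(blob.name).url
    dst.get_blob_client(blob.name).start_copy_from_url(source_url)

start_copy_from_url schedules an asynchronous, server-side copy, so the data never has to pass through your own machine; tools like AzCopy do essentially the same thing.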

Azure PHP SDK: download all of a container's blobs in a single zip file

和自甴很熟 submitted on 2019-12-06 08:12:52
I want to download all blobs from a specified container as a zip file. Is there any way to download them as a zip directly from Azure, without needing to process them on my server? Currently I'm thinking of something like this: file_put_contents("file_name", file_get_contents($blob_url)); I would store all the files on my server, then create a zip file of them, and then force the download. Azure has no such facility to generate a zip file for a bundle of blobs for you. Azure Storage is just... storage. You'll need to download each of your blobs via the PHP SDK (or directly via the API if you so choose). And if you want the content
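For reference, the "download each blob, then zip them server-side" pattern the answer describes looks roughly like this; the question uses PHP, but the flow is the same, shown here as a Python sketch with placeholder connection-string and container names:

import zipfile
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    "<connection-string>", container_name="mycontainer")

# Stream every blob into one zip archive on the server, then serve that file.
with zipfile.ZipFile("blobs.zip", "w", zipfile.ZIP_DEFLATED) as archive:
    for blob in container.list_blobs():
        data = container.download_blob(blob.name).readall()
        archive.writestr(blob.name, data)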

Azure Blob Storage blobs to Index

心不动则不痛 submitted on 2019-12-06 07:36:38
Is it possible to upload a document to blob storage and do the following: (1) grab the contents of the document and add them to an index, and (2) grab key phrases from the contents in point 1 and add them to the index as well? I want the key phrases to then be searchable. I have code that can upload documents to blob storage, which works perfectly, but the only way to get this indexed (that I know of) is by using "Import Data" within the Azure Search service, which creates an index with predefined fields. This works great when only these fields are needed, and the index gets updated automatically every 5 minutes. But it becomes a problem
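One alternative to the Import Data wizard is to push documents into the index yourself, which lets you control the fields. A rough Python sketch, where the index name and its fields (id, content, keyPhrases) are assumptions rather than anything from the question:

from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient
from azure.search.documents import SearchClient

text_client = TextAnalyticsClient("<text-analytics-endpoint>", AzureKeyCredential("<text-key>"))
search_client = SearchClient("<search-endpoint>", "documents-index", AzureKeyCredential("<search-key>"))

# Text extracted from the uploaded blob (the extraction itself is not shown here).
content = "Text extracted from the uploaded document..."
phrases = text_client.extract_key_phrases([content])[0].key_phrases

# Push the content and its key phrases into the index; mark keyPhrases as
# searchable in the index definition so the phrases can be queried.
search_client.upload_documents(documents=[{
    "id": "doc-1",
    "content": content,
    "keyPhrases": phrases,
}])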

Using an ASP.NET WebService & Android to upload an image to Azure Blob Storage?

旧时模样 submitted on 2019-12-06 06:21:17
I'm trying to upload a selected image to an Azure blob from my Android device through an ASP.NET WebService I've made, but I get an error in Android: "W/System.err(454): SoapFault - faultcode: 'soap:Server' faultstring: 'Server was unable to process request. ---> Object reference not set to an instance of an object.' faultactor: 'null' detail: org.kxml2.kdom.Node@4205f358". I'm not sure whether it's my Java code or the WebService that is wrong. Here is the code for both. WebService: [WebMethod] public string UploadFile(string myBase64String, string fileName) { byte[] f = Convert.FromBase64String

ImportError: No module named azure.storage.blob (when doing syncdb)

…衆ロ難τιáo~ submitted on 2019-12-06 05:42:05
Question: I recently cloned a Django project of mine onto a brand new machine and went about setting up its dependencies. One such dependency was Azure storage, for which I followed the advice here and simply did sudo pip install azure. However, upon `python manage.py syncdb`, I keep getting the error: ImportError: No module named azure.storage.blob. I've also tried sudo pip install azure-storage, but this doesn't alleviate my problem either. This shouldn't have been this problematic.
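As a side note (not necessarily the fix for this particular setup), the importable names differ between the legacy azure-storage package and the newer azure-storage-blob package, so checking which one is actually installed can narrow things down; a small Python sketch:

def detect_blob_sdk():
    """Report which Azure blob SDK flavour is importable, if any."""
    try:
        from azure.storage.blob import BlobServiceClient  # azure-storage-blob (v12+)
        return "azure-storage-blob (v12) is available"
    except ImportError:
        pass
    try:
        from azure.storage.blob import BlockBlobService   # legacy azure-storage package
        return "legacy azure-storage SDK is available"
    except ImportError:
        return "no Azure blob storage package is importable"

print(detect_blob_sdk())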

Connecting to an Azure storage account through a proxy server with the Microsoft Azure Storage SDK for Java

為{幸葍}努か submitted on 2019-12-06 05:22:30
In our project we need to access Blob Storage through a proxy server (Squid). We are planning to use the Microsoft Azure Storage SDK for Java, version 2.2.0, but it looks like setting a proxy is not provided by the API. The only way I could make it go through the proxy is by setting the system properties System.setProperty("http.proxyHost", "127.0.0.1"); System.setProperty("http.proxyPort", "3128"); But this affects all services running on my JVM, which harms other services that are not supposed to go via the proxy. Looking at the Java code it looks like com.microsoft.azure.storage

PartitionKey was not specified in Azure table storage

旧巷老猫 submitted on 2019-12-06 04:42:16
Question: I am trying to load/import data into table storage from a CSV file via Azure Storage Explorer, but I am getting the following error: "An error occurred while opening the file 'D//sample.csv'. The required property 'PartitionKey' was not specified." Kindly clarify the importance of PartitionKey and RowKey in Azure table storage. Answer 1: The Azure Storage partition key has been discussed here: Azure Table Storage Partition Key. In order to understand this, you will need to know what partitions are. Whenever
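To illustrate why both keys matter, here is a minimal Python sketch with the azure-data-tables package; the table name, entity values, and the assumption that the table already exists are all placeholders:

from azure.data.tables import TableClient

table = TableClient.from_connection_string("<connection-string>", table_name="people")

# PartitionKey groups related entities into one partition; RowKey must be
# unique within that partition. Together they uniquely identify the entity,
# which is why an import without them is rejected.
table.create_entity(entity={
    "PartitionKey": "sales-dept",
    "RowKey": "employee-001",
    "Name": "Alice",
    "Age": 30,
})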

How do I upload a file to Azure blob storage from an MVC view

放肆的年华 submitted on 2019-12-06 03:58:33
Question: I am coding an MVC5 internet application and would like some help uploading a file from my own filesystem to an Azure blob. Here is my Azure upload function: public void UploadFileToBlobStorage(string containerName, string blockBlobName, string fileName) { // Retrieve storage account from connection string. CloudStorageAccount storageAccount = CloudStorageAccount.Parse( CloudConfigurationManager.GetSetting("StorageConnectionString")); // Create the blob client. CloudBlobClient blobClient
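For comparison, the upload step on its own (ignoring the MVC controller and view wiring) can be sketched like this in Python with the azure-storage-blob package; the container and file names are placeholders:

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<StorageConnectionString>")
blob = service.get_blob_client(container="uploads", blob="photo.jpg")

# Upload the local file as a block blob, overwriting any existing blob
# with the same name.
with open("photo.jpg", "rb") as fh:
    blob.upload_blob(fh, overwrite=True)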

Copy Data From Azure Blob Storage to AWS S3

霸气de小男生 submitted on 2019-12-06 02:25:38
I am new to Azure Data Factory and have an interesting requirement. I need to move files from Azure Blob storage to Amazon S3, ideally using Azure Data Factory. However, S3 isn't supported as a sink: https://docs.microsoft.com/en-us/azure/data-factory/copy-activity-overview I also understand, from a variety of comments I've read on here, that you cannot directly copy from Blob Storage to S3 - you would need to download the file locally and then upload it to S3. Does anyone know of any examples, in Data Factory, SSIS, or an Azure Runbook, that can do such a thing? I suppose an option would be to write
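Outside Data Factory, the download-then-upload approach can be scripted directly; a minimal Python sketch using azure-storage-blob and boto3, with placeholder container, bucket, and credential values:

import boto3
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    "<azure-connection-string>", container_name="source-container")
s3 = boto3.client("s3")  # AWS credentials resolved from the usual config/env

# Pull each blob down and push it straight to S3 under the same key.
for blob in container.list_blobs():
    data = container.download_blob(blob.name).readall()
    s3.put_object(Bucket="target-bucket", Key=blob.name, Body=data)

For large files this buffers each object in memory; a chunked or multipart variant would be needed at scale.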

Data Lake Analytics U-SQL EXTRACT speed (Local vs Azure)

人盡茶涼 submitted on 2019-12-06 02:08:06
I've been looking into using the Azure Data Lake Analytics functionality to try and manipulate some gzipped XML data I have stored in Azure's Blob Storage, but I'm running into an interesting issue. Essentially, when using U-SQL locally to process 500 of these XML files, the processing time is extremely quick: roughly 40 seconds using 1 AU locally (which appears to be the limit). However, when we run this same functionality from within Azure using 5 AUs, the processing takes 17+ minutes. We eventually want to scale this up to ~20,000 files and more, but have reduced the set to try and measure