azure-storage-blobs

Monitoring using Azure Linux Diagnostics

Submitted by 不羁岁月 on 2019-12-12 04:58:23

Question: I am trying to enable Linux Diagnostics for the individual disks attached to a VM, following this guide: https://docs.microsoft.com/en-us/azure/virtual-machines/linux/diagnostic-extension. I am using this CLI command:

    azure vm extension set vmturbo DiagnosticTest LinuxDiagnostic Microsoft.Azure.Diagnostics '3.0' --private-config-path PrivateConfig.json --public-config-path PublicConfig.json -v

and this is what PrivateConfig.json looks like:

    { "storageAccountName": "XXXXXXXXXX", …
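For reference, a minimal PrivateConfig.json sketch for the LinuxDiagnostic 3.0 extension, which, per the linked doc, authenticates with an account-level SAS token rather than an account key (both values below are placeholders, not values from the question):

    {
        "storageAccountName": "yourdiagstorageaccount",
        "storageAccountSasToken": "<account SAS token, without the leading '?'>"
    }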

Getting a java.lang.NoClassDefFoundError on Linux but not on Windows

Submitted by 让人想犯罪 __ on 2019-12-12 04:29:41

Question: I have a Java program that uses Microsoft Azure Storage. When I run it on Windows I get no error, but when I run the same program on Linux I get the following:

    Caused by: java.lang.NoClassDefFoundError: Could not initialize class com.microsoft.azure.storage.core.Utility

Would anybody know of any possible explanations?

Answer 1: This seems to be a common issue when using Maven to run a Java program on Linux. When you run mvn -v or other operations, you will get an issue like …
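One way to surface the real root cause (a sketch, not from the original answer): "Could not initialize class" means a static initializer already failed once, so forcing the class to load and printing the full exception chain usually reveals the underlying problem:

    // Hypothetical diagnostic: force the static initializer of the failing
    // class to run and print the underlying cause, which the bare
    // NoClassDefFoundError hides after the first failure.
    try {
        Class.forName("com.microsoft.azure.storage.core.Utility");
    } catch (Throwable t) {
        // Usually an ExceptionInInitializerError wrapping the real cause.
        t.printStackTrace();
    }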

How to copy azure blob files to azure data lake analytics

Submitted by 孤街醉人 on 2019-12-12 04:27:02

Question: Is there a way to create a job or Azure service that moves (cuts) Azure blob files to Azure Data Lake Store?

Answer 1: Azure Data Factory would be a good fit for this. It supports scheduling (https://docs.microsoft.com/en-us/azure/data-factory/data-factory-scheduling-and-execution) and it supports transferring data from Blob storage to Azure Data Lake Store. See this example: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-azure-datalake-connector. From the website: Data …
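As a rough illustration (pipeline and dataset names below are placeholders, following the Data Factory v1 JSON format from the linked connector doc), a copy activity from Blob storage to Data Lake Store looks like this. Note that Copy duplicates data rather than moving it, so the source blobs would still have to be deleted separately:

    {
        "name": "CopyBlobToDataLakePipeline",
        "properties": {
            "activities": [
                {
                    "name": "BlobToDataLakeCopy",
                    "type": "Copy",
                    "inputs": [ { "name": "AzureBlobInputDataset" } ],
                    "outputs": [ { "name": "AzureDataLakeStoreOutputDataset" } ],
                    "typeProperties": {
                        "source": { "type": "BlobSource" },
                        "sink": { "type": "AzureDataLakeStoreSink" }
                    }
                }
            ]
        }
    }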

Importing images in Azure Machine Learning Studio

Submitted by 删除回忆录丶 on 2019-12-12 03:57:30

Question: Is it possible to import images from an Azure storage account from within a Python script module, as opposed to using the Import Images module that Azure ML Studio provides? Ideally I would like to use cv2.imread(). I only want to read grayscale data, but the Import Images module reads RGB. Can I use the BlockBlobService library as if I were calling it from an external Python script?

Answer 1: Yes, you should be able to do that using Python. At the very least, straight REST calls should …
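A minimal sketch of that approach, assuming the classic azure-storage SDK (the BlockBlobService class) is importable from the script environment; the account name, key, container, and blob path below are placeholders:

    import cv2
    import numpy as np
    from azure.storage.blob import BlockBlobService

    service = BlockBlobService(account_name="myaccount", account_key="<key>")
    blob = service.get_blob_to_bytes("mycontainer", "images/sample.png")

    # cv2.imread() only accepts file paths, so decode the in-memory
    # bytes instead, asking OpenCV for grayscale directly.
    buf = np.frombuffer(blob.content, dtype=np.uint8)
    gray = cv2.imdecode(buf, cv2.IMREAD_GRAYSCALE)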

Azure UploadDirectoryAsync doesn't overwrite existing files if changed

Submitted by 拥有回忆 on 2019-12-12 03:42:45

Question: I am uploading a directory using the Microsoft.WindowsAzure.Storage.DataMovement library as below:

    TransferManager.Configurations.ParallelOperations = 64;
    UploadDirectoryOptions options = new UploadDirectoryOptions()
    {
        ContentType = "image/jpeg",
        Recursive = true,
    };
    context.FileTransferred += FileTransferredCallback;
    context.FileFailed += FileFailedCallback;
    context.FileSkipped += FileSkippedCallback;
    await TransferManager.UploadDirectoryAsync(sourceDir, destDir, options: options, context: …
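By default the DataMovement library skips destination files that already exist, which is what the FileSkipped callback reports. A sketch of forcing overwrites instead, assuming a DirectoryTransferContext as in the question:

    // Sketch: overwrite existing destination files instead of skipping them.
    // Returning true unconditionally forces the overwrite; a real check could
    // compare timestamps or MD5s of source and destination here instead.
    DirectoryTransferContext context = new DirectoryTransferContext();
    context.ShouldOverwriteCallback = (source, destination) => true;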

Store a List<string> in a SQL database table?

Submitted by ♀尐吖头ヾ on 2019-12-12 03:37:26

Question: I want to look up a storage account in Microsoft Azure, list the blobs in a particular container of that storage account, and then store the listed blob names in a database table. Could anyone suggest C# code for storing the list of blob names in a database table?

    namespace ListStorageAccntFiles
    {
        class Program
        {
            static void Main(string[] args)
            {
                Console.Clear();
                // Code to list the blob names in the console
                CloudStorageAccount StorageAccount = CloudStorageAccount.Parse(CloudConfigurationManager.GetSetting(…
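A minimal sketch of the missing half, continuing from the question's StorageAccount variable; the container name, SQL connection string, and table layout are placeholders, and it assumes the classic WindowsAzure.Storage SDK plus System.Data.SqlClient:

    using Microsoft.WindowsAzure.Storage.Blob;
    using System.Data.SqlClient;

    // List all blob names in the container, then insert each into a table
    // with a single NVARCHAR column, using a parameterized command.
    CloudBlobContainer container = StorageAccount
        .CreateCloudBlobClient()
        .GetContainerReference("mycontainer");

    using (var conn = new SqlConnection("<sql-connection-string>"))
    {
        conn.Open();
        foreach (IListBlobItem item in container.ListBlobs(useFlatBlobListing: true))
        {
            string name = ((ICloudBlob)item).Name;
            using (var cmd = new SqlCommand(
                "INSERT INTO BlobNames (Name) VALUES (@name)", conn))
            {
                cmd.Parameters.AddWithValue("@name", name);
                cmd.ExecuteNonQuery();
            }
        }
    }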

How to parse octet-stream files using Apache Tika?

Submitted by 我的梦境 on 2019-12-12 03:36:28

Question: I have stored files of many different types (txt, doc, pdf, etc.) in Azure Blob storage. However, they are all stored there as 'octet-stream', and when I open the files to extract their text using Tika, Tika cannot detect the character encoding. How can I get around this problem?

    FileSystem fs = FileSystem.get(new Configuration());
    Path pt = new Path(Configs.BLOBSTORAGEPREFIX + fileAdd);
    InputStream stream = fs.open(pt);
    AutoDetectParser parser = new AutoDetectParser(); …
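One common workaround (a sketch, not from the original thread) is to give Tika the original file name as a hint, so its detector can fall back on the extension instead of the useless octet-stream Content-Type. This continues the question's snippet and assumes fileAdd still carries the extension:

    import org.apache.tika.metadata.Metadata;
    import org.apache.tika.parser.AutoDetectParser;
    import org.apache.tika.sax.BodyContentHandler;

    // Pass the file name to Tika so its detector can use the extension.
    Metadata metadata = new Metadata();
    metadata.set(Metadata.RESOURCE_NAME_KEY, fileAdd);

    BodyContentHandler handler = new BodyContentHandler(-1); // no length limit
    AutoDetectParser parser = new AutoDetectParser();
    parser.parse(stream, handler, metadata);
    String text = handler.toString();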

What are the implications of serving different file types all as application/octet-stream in a web application?

Submitted by 家住魔仙堡 on 2019-12-12 03:33:24

Question: My well-answered question here on SO has led to another question. The Azure account I mention in that original question is not managed by us. Here is an example of the headers received when requesting one of its blob files:

    HTTP/1.1 200 OK
    Content-MD5: R57initOyxxq6dVKtoAx3w==
    Content-Type: application/octet-stream
    Date: Wed, 02 Mar 2016 14:32:35 GMT
    Etag: 0x8D3180DA8EBF063
    Last-Modified: Fri, 08 Jan 2016 09:25:33 GMT
    Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0
    x-ms-blob-type: BlockBlob
    x-…
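For completeness, the fix itself sits with whoever owns the account: blobs keep whatever Content-Type they were uploaded with, and application/octet-stream is the default. A sketch of what the owner would run per blob, assuming the classic WindowsAzure.Storage SDK and an existing container reference (the blob name and type are placeholders):

    // Set a correct Content-Type on an existing blob; until this is done,
    // every download is served as application/octet-stream.
    CloudBlockBlob blob = container.GetBlockBlobReference("photo.jpg");
    blob.FetchAttributes();                       // load current properties
    blob.Properties.ContentType = "image/jpeg";
    blob.SetProperties();                         // persist the change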

"The MAC signature found in the HTTP request is not the same as any computed signature" in an Azure integration using PHP

Submitted by 大憨熊 on 2019-12-12 03:29:02

Question: I am trying to list blobs using the Azure REST API, with the PHP and curl code below. It looks like the generated auth signature is wrong; could anyone help me resolve the authorization issue?

    $date = gmdate('D, d M Y H:i:s \G\M\T');
    $account_name = "xyz";
    $containername = "abc";
    $account_key = "asdf";
    $stringtosign = "GET\n\n\n$date\n/$account_name/$containername()";
    $signature = 'SharedKey'.' '.$account_name.':'.base64_encode(hash_hmac('sha256', $stringtosign, …
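A sketch of a signing routine that works for List Blobs (service version 2015-04-05; the account, key, and container values are placeholders). The two usual culprits are the string-to-sign, which must contain the empty standard headers, the canonicalized x-ms-* headers, and the comp/restype query parameters, and the HMAC itself: the key must be base64-decoded and hash_hmac must return raw bytes:

    $account_name = "xyz";
    $account_key  = "asdf";   // the base64-encoded key from the portal
    $container    = "abc";
    $date         = gmdate('D, d M Y H:i:s \G\M\T');
    $version      = "2015-04-05";

    $canonicalized_headers  = "x-ms-date:$date\nx-ms-version:$version\n";
    $canonicalized_resource = "/$account_name/$container\ncomp:list\nrestype:container";

    // VERB, then eleven empty standard header fields, then canonicalized parts.
    $string_to_sign = "GET\n\n\n\n\n\n\n\n\n\n\n\n"
                    . $canonicalized_headers . $canonicalized_resource;

    $signature = base64_encode(hash_hmac(
        'sha256', $string_to_sign, base64_decode($account_key), true));

    $ch = curl_init("https://$account_name.blob.core.windows.net/$container?restype=container&comp=list");
    curl_setopt($ch, CURLOPT_HTTPHEADER, array(
        "x-ms-date: $date",
        "x-ms-version: $version",
        "Authorization: SharedKey $account_name:$signature",
    ));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    echo curl_exec($ch);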

Can stream reading produce and send null to blob storage?

Submitted by 蹲街弑〆低调 on 2019-12-12 03:14:53

Question: I read a file through a stream (a MemoryStream, StreamReader, or FileStream), for example:

    byte[] buff = System.IO.File.ReadAllBytes(open.FileName);
    System.IO.MemoryStream ms = new System.IO.MemoryStream(buff);

I then send it to blob storage, and at that point my blob is empty. Is this caused by reading the file through a stream, or does it point to some other problem, such as a misconfiguration of the blob or of the CloudStorageAccount connection string?

Answer 1: Just use the code below. There is no need to convert to a memory stream; you can …
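A sketch along those lines, assuming the classic WindowsAzure.Storage SDK and an existing container reference (the blob name is a placeholder). If a stream must be used, rewind it first: a stream whose Position is at the end uploads zero bytes, which is the usual cause of an empty blob:

    CloudBlockBlob blob = container.GetBlockBlobReference("upload.dat");

    // Simplest: let the SDK read the file itself.
    blob.UploadFromFile(open.FileName);

    // Stream variant: make sure the position is at the start before uploading.
    using (var ms = new MemoryStream(System.IO.File.ReadAllBytes(open.FileName)))
    {
        ms.Position = 0;
        blob.UploadFromStream(ms);
    }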