azure-blob-storage

Is there a way to create a new blob as a folder using logic apps?

霸气de小男生 submitted on 2020-01-03 17:29:07
Question: I've set up a logic app to move new files on my FTP server to my Azure storage container, which holds blobs for my files. I found a way to create new folders using Storage Explorer, but is there a way I can automate this using Logic Apps? For example, if a new folder is created on my FTP and files are added to it, I want to create a blob folder and move those files into it. Answer 1: First of all, Azure Blob storage doesn't support folders. There is only your storage account and a …
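As the answer notes, Blob storage has a flat namespace: a "folder" is just a `/`-separated prefix in a blob name, and it exists only while at least one blob carries that prefix. So in a Logic App it is enough to write the blob under a path like `foldername/filename` — no separate "create folder" step is needed. A stdlib-only sketch of how a prefix + delimiter listing (the mechanism behind the SDK's blob-listing calls) surfaces virtual folders; the blob names here are made up:

```python
# Blob storage is flat: "folders" are just "/" separators inside blob names.
# This sketch mimics how a prefix + delimiter listing yields virtual folders.
blob_names = [
    "incoming/2020/report.csv",
    "incoming/2020/summary.csv",
    "incoming/2021/report.csv",
    "archive/old.csv",
]

def list_virtual(blobs, prefix="", delimiter="/"):
    """Return (subfolders, files) directly under `prefix`."""
    folders, files = set(), []
    for name in blobs:
        if not name.startswith(prefix):
            continue
        rest = name[len(prefix):]
        if delimiter in rest:
            # Everything up to the next delimiter is a virtual subfolder.
            folders.add(prefix + rest.split(delimiter, 1)[0] + delimiter)
        else:
            files.append(name)
    return sorted(folders), files

print(list_virtual(blob_names, "incoming/"))
# → (['incoming/2020/', 'incoming/2021/'], [])
```

Deleting the last blob under a prefix makes the "folder" disappear again, which is why Storage Explorer's "create folder" only takes effect once a blob is uploaded into it.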

How to create a shared access signature with a stored access policy for an Azure Blob container in Azure Portal?

淺唱寂寞╮ submitted on 2020-01-01 02:31:06
Question: I read about shared access signatures generated with stored access policies for Azure Storage here. I also read how to create such a shared access signature with a stored access policy using PowerShell here. However, I want to do the above using the Azure Portal. I know how to generate an ad-hoc shared access signature, and I also know how to create a stored access policy for a container in my Azure Blob storage. How do I create a shared access signature with a stored access policy for …
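Whichever tool generates it, a SAS is an HMAC-SHA256 signature (keyed with the account key) over a newline-separated string-to-sign; with a stored access policy, the token carries the policy's identifier (the `si=` query parameter) and inherits permissions and expiry from the policy instead of embedding them. A stdlib sketch of the signing step — the string-to-sign below is deliberately simplified (the real service-SAS string-to-sign has many more fields; see the REST API reference), and the key is fake:

```python
import base64
import hashlib
import hmac
from urllib.parse import urlencode

def sign(string_to_sign: str, account_key_b64: str) -> str:
    # HMAC-SHA256 over the string-to-sign, keyed with the decoded account key.
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Hypothetical key and simplified string-to-sign for illustration only.
demo_key = base64.b64encode(b"not-a-real-account-key").decode()
string_to_sign = "\n".join(["", "", "", "/blob/myaccount/mycontainer", "mypolicy"])
token = urlencode({"si": "mypolicy", "sig": sign(string_to_sign, demo_key)})
print(token)
```

Because permissions and expiry live in the policy on the service side, revoking or editing the stored access policy invalidates every SAS issued against it — the main advantage over ad-hoc SAS tokens.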

Azure blob metadata property EncryptionData is not set after setting encryption policy

那年仲夏 submitted on 2019-12-25 03:45:27
Question: I am encrypting a blob using a secret from a Key Vault. Unfortunately, the EncryptionData metadata property is not being set for the blob. It was working previously, but for some reason the property is not being set now. Can anyone help, please? Please find below the code I am using to set the encryption policy: private void SetEncryptionPolicy(string containerName) { IKey cloudKey1; var secret = string.Format(ConfigurationManager.AppSettings["SecretUri"], containerName); //// Create key instances …

Azure Batch NodeFiles to Blob Storage

陌路散爱 submitted on 2019-12-24 16:19:46
Question: I'm running tasks on Microsoft Azure Batch services, where each task creates a set of files on the node. I have to copy these files to Blob storage. The tasks are created and managed from a VM which is not part of the Batch pool. I'm able to access the node files, and I can write the content to Blob storage; however, this means I get the file as a string on my driving VM and upload it to Blob storage. var container = BlobClient.GetContainerReference(containerName); container.CreateIfNotExists …
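Rather than materializing each node file as one in-memory string on the driving VM, the data can be streamed through in fixed-size chunks. A stdlib sketch of the chunked-copy loop (in a real pipeline, `src` would be the node-file download stream and `dst` a block-blob upload stream; the chunk size is an arbitrary illustrative choice):

```python
import io

CHUNK = 4 * 1024 * 1024  # 4 MiB per read; keeps memory use flat

def stream_copy(src, dst, chunk_size=CHUNK):
    """Copy src to dst in fixed-size chunks; returns total bytes copied."""
    total = 0
    while True:
        buf = src.read(chunk_size)
        if not buf:
            break
        dst.write(buf)
        total += len(buf)
    return total

# Demo with in-memory streams standing in for the download/upload streams.
src = io.BytesIO(b"x" * 10_000_000)
dst = io.BytesIO()
copied = stream_copy(src, dst, chunk_size=1_000_000)
```

A further option worth considering: Batch's OutputFiles feature lets the task itself upload its output files to a container URL with a SAS when it completes, so the data never transits the controlling VM at all.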

Error when trying to add a dynamic file name (linked service) in Azure Data Factory V2

依然范特西╮ submitted on 2019-12-24 15:53:05
Question: I am new to Azure Data Factory V2 and Blob storage. When trying to add the file connection (linked service) dynamically in a copy activity from Blob storage, the following error is encountered while trying to map the columns by importing the schema from the file: "Failed to convert the value in 'container' property to 'System.String' type. Please make sure the payload structure and value are correct." I tried: used static parameters and assigned the static parameters to the linked connection. Answer 1: Please …
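The error quoted above ("Failed to convert the value in 'container' property to 'System.String' type") typically means an expression object was supplied where the dataset expected a plain string, or vice versa. The usual pattern is to declare parameters on the dataset and reference them via expressions in the type properties. A sketch of such a parameterized blob dataset — the dataset, linked-service, and parameter names here are hypothetical, and the property shape follows the ADF v2 AzureBlob dataset schema as best I recall, so verify it against your factory:

```json
{
  "name": "BlobDataset",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": {
      "referenceName": "AzureBlobLS",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "container": { "type": "String" },
      "fileName": { "type": "String" }
    },
    "typeProperties": {
      "folderPath": { "value": "@dataset().container", "type": "Expression" },
      "fileName": { "value": "@dataset().fileName", "type": "Expression" }
    }
  }
}
```

The pipeline then passes concrete string values for `container` and `fileName` when it references the dataset; note that schema import in the UI needs resolvable values, so supply defaults or test values while mapping columns.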

FineUploader getting the wrong Azure Blob Storage URI

允我心安 submitted on 2019-12-24 08:06:32
Question: I'm trying to use the FineUploader React library in my React app to upload files to my Azure Blob storage. For some reason, FineUploader seems to be getting the blob storage URI wrong. This is how I instantiate FineUploader in my test component: import React, { Component } from 'react'; import FineUploaderAzure from 'fine-uploader-wrappers/azure' import Gallery from './gallery/index'; const uploader = new FineUploaderAzure({ options: { cors: { expected: true, sendCredentials: true }, …

JavaScript: Azure function blob binding handling exceptions

邮差的信 submitted on 2019-12-24 07:06:00
Question: Hi, I have created an Azure Function HTTP trigger to read a blob from Blob storage with a blob input binding. Below is the function.json: { "disabled": false, "bindings": [ { "authLevel": "anonymous", "type": "httpTrigger", "direction": "in", "name": "req" }, { "name": "blobContent", "type": "blob", "direction": "in", "path": "containerName/{id}.{extn}", "connection": "AzureWebJobsStorage" }, { "name": "$return", "type": "http", "direction": "out" } ] } and the index.js file is: module.exports …
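For readability, the function.json flattened in the excerpt above, reformatted (content unchanged):

```json
{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req"
    },
    {
      "name": "blobContent",
      "type": "blob",
      "direction": "in",
      "path": "containerName/{id}.{extn}",
      "connection": "AzureWebJobsStorage"
    },
    {
      "name": "$return",
      "type": "http",
      "direction": "out"
    }
  ]
}
```

On the exception-handling question itself: with a declarative input binding, a missing blob is resolved by the runtime before index.js runs, so (depending on runtime version) the function either fails outright or receives a null/undefined binding value. Checking `context.bindings.blobContent` before use, or reading the blob imperatively with the Storage SDK inside a try/catch, gives the function control over the error response.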

Kafka Connector for Azure Blob Storage

孤街浪徒 submitted on 2019-12-24 01:18:32
Question: I need to store the messages pushed to Kafka in deep storage. We are using Azure cloud services, so I suppose Azure Blob storage could be a good option. I want to use Kafka Connect's sink connector API to push data to Azure Blob storage. The Kafka documentation mostly suggests HDFS for exporting data; however, in that case I would need a Linux VM running Hadoop, which I guess would be costly. My question: is Azure Blob storage an appropriate choice for storing JSON objects, and is building a custom sink connector a …
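Blob storage is a reasonable sink for JSON records, and a custom connector may not even be needed: Confluent publishes an Azure Blob Storage sink connector for Kafka Connect. A sketch of a worker config for it — the connector class and property names are written from memory of Confluent's documentation and must be verified against the connector version you install; the account, key, container, and topic values are placeholders:

```properties
name=azure-blob-sink
connector.class=io.confluent.connect.azure.blob.AzureBlobStorageSinkConnector
tasks.max=1
topics=my-topic
azblob.account.name=myaccount
azblob.account.key=<account-key>
azblob.container.name=kafka-sink
format.class=io.confluent.connect.azure.blob.format.json.JsonFormat
flush.size=1000
```

`flush.size` controls how many records are batched into each blob object; larger values mean fewer, bigger blobs, which is usually cheaper and faster to scan in deep storage.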

When was a block blob created in Azure?

◇◆丶佛笑我妖孽 submitted on 2019-12-23 16:01:54
Question: The blob reference contains a Properties property that has a LastModified of type DateTimeOffset?. However, I can't find the creation date/time of the blob. Is there a standard API I can use, or do I need to store that in the metadata? public async Task<IBlobMeta> GetBlobMetaAsync(string blobId) { if (IsNullOrWhiteSpace(blobId)) throw new ArgumentException("Value cannot be null or whitespace.", nameof(blobId)); var blob = await EnsureGetBlobById(blobId); await blob.FetchAttributesAsync(); …
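Answer context: service versions from around 2017-11-09 onward return a creation time on Get Blob Properties (the `x-ms-creation-time` header, surfaced by newer SDKs as a creation-time property), so check whether your SDK exposes it. On older SDKs the workaround is exactly what the asker suspects: stamp the creation time into blob metadata at first upload and never overwrite it, since metadata persists unless explicitly replaced. A stdlib sketch of that idempotent stamping, with a plain dict standing in for the blob's metadata collection:

```python
from datetime import datetime, timezone

def upload_with_created_stamp(metadata: dict) -> dict:
    """Stamp a 'created' timestamp into blob metadata on first upload.
    setdefault makes the stamp idempotent: re-uploads keep the original value.
    (A dict stands in for the blob's metadata collection here.)"""
    metadata.setdefault("created", datetime.now(timezone.utc).isoformat())
    return metadata

meta = upload_with_created_stamp({})
first = meta["created"]
# A later re-stamp (e.g. on overwrite) must not change the original value:
upload_with_created_stamp(meta)
assert meta["created"] == first
```

Storing ISO-8601 UTC strings keeps the values sortable and unambiguous when listed across blobs.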