azure-storage-blobs

Azure Functions with blob storage. How to create a BlobTrigger for a sub-folder?

大城市里の小女人 submitted on 2019-12-23 17:19:01
Question: I need to react to a blob that's added into a sub-folder. I know that blob storage doesn't recognize folders (they are just virtual), but I still can't figure out how to create a blob trigger for files added to sub-folders. Example excerpt from function.json: { "name": "myblob", "type": "blobTrigger", "direction": "in", "path": "rootContainer/{name}" } OK, a function is triggered and I receive the blob. Second excerpt from function.json: { "name": "subfolder/myblob", "type": "blobTrigger",
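
The blob trigger path can include the virtual folder as a fixed prefix, so the function only fires for blobs under that prefix. A minimal C# sketch under that assumption (the function name and the rootContainer/subfolder path are illustrative, not taken from the question's full config; the same path syntax applies to "path" in function.json):

    using System.IO;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Extensions.Logging;

    public static class SubfolderTrigger
    {
        // Fires only for blobs whose names start with "subfolder/" inside rootContainer.
        // The equivalent function.json setting would be "path": "rootContainer/subfolder/{name}".
        [FunctionName("SubfolderTrigger")]
        public static void Run(
            [BlobTrigger("rootContainer/subfolder/{name}")] Stream myBlob,
            string name,
            ILogger log)
        {
            log.LogInformation($"Blob added under subfolder: {name}, size: {myBlob.Length} bytes");
        }
    }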

List directories in a container

99封情书 submitted on 2019-12-23 17:05:26
Question: How can I get a list of directories in my container? I can use Get-AzureStorageBlob to get all the blobs and filter by distinct prefix /name/, but it might be slow with millions of blobs. Is there a proper way of achieving this in PowerShell? Answer 1: There's no concept of directories, only containers and blobs. A blob name may have delimiters which look like directories, and may be filtered. If you choose to store millions of blobs in a container, then you'll be searching through millions of
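
If PowerShell is not a hard requirement, the hierarchical listing API avoids enumerating every blob: given a delimiter, the service returns one entry per virtual "directory" prefix at that level. A sketch with the Azure.Storage.Blobs C# SDK (connection string and container name are placeholders); the same prefix/delimiter listing underlies any SDK or REST call:

    using System;
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;

    class ListVirtualDirectories
    {
        static void Main()
        {
            var container = new BlobContainerClient(
                "<connection-string>", "mycontainer"); // placeholders

            // With a delimiter, the service returns BlobHierarchyItems: blobs at this
            // level plus one entry per virtual directory prefix, instead of every blob.
            foreach (BlobHierarchyItem item in container.GetBlobsByHierarchy(delimiter: "/"))
            {
                if (item.IsPrefix)
                    Console.WriteLine($"Directory: {item.Prefix}");
            }
        }
    }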

How do you process many files from blob storage with long paths in Databricks?

房东的猫 submitted on 2019-12-23 12:47:44
Question: I've enabled logging for an API Management service and the logs are being stored in a storage account. Now I'm trying to process them in an Azure Databricks workspace, but I'm struggling with accessing the files. The issue seems to be that the automatically generated virtual folder structure looks like this: /insights-logs-gatewaylogs/resourceId=/SUBSCRIPTIONS/<subscription>/RESOURCEGROUPS/<resource group>/PROVIDERS/MICROSOFT.APIMANAGEMENT/SERVICE/<api service>/y=*/m=*/d=*/h=*/m=00/PT1H.json I

Connection pooling on Azure Storage

此生再无相见时 submitted on 2019-12-23 09:31:53
Question: I'm starting to use Azure Storage to save files to blobs in my application. Since my application could be accessing different containers on different storage accounts, I would like to know how to implement a connection pool that will optimize resources. I want to keep the connection open to the different containers instead of opening a connection each time I try to download a blob. Can anyone provide me with the best approach to achieve this? Thanks. Answer 1: The simple answer to your question is that you can't
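
There is no user-managed connection pool in the storage SDK; the underlying HTTP stack pools and reuses connections, so the usual approach is to reuse one client per storage account and, on .NET Framework, raise the per-host connection limit. A sketch under those assumptions (keying by connection string and the limit of 100 are illustrative choices, not a prescribed configuration):

    using System.Collections.Concurrent;
    using System.Net;
    using Azure.Storage.Blobs;

    static class BlobClients
    {
        // One BlobServiceClient per storage account; the clients are thread-safe
        // and HTTP connections are pooled and reused underneath them.
        private static readonly ConcurrentDictionary<string, BlobServiceClient> Clients =
            new ConcurrentDictionary<string, BlobServiceClient>();

        static BlobClients()
        {
            // On .NET Framework, raise the default per-host connection limit (2)
            // so parallel downloads are not throttled. Illustrative value.
            ServicePointManager.DefaultConnectionLimit = 100;
        }

        public static BlobContainerClient GetContainer(string connectionString, string containerName)
        {
            var service = Clients.GetOrAdd(connectionString, cs => new BlobServiceClient(cs));
            // Container clients are lightweight wrappers; creating one does not open a connection.
            return service.GetBlobContainerClient(containerName);
        }
    }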

What is the maximum file size I can upload to an Azure blob with the UploadFile method?

亡梦爱人 submitted on 2019-12-23 07:46:37
Question: Please can I know the maximum file size that can be uploaded to an Azure storage blob using the UploadFile API. Answer 1: Block blobs originally had a maximum size of 200GB (with a 4MB block size), and now may be up to 4.77TB (with the new 100MB block size). Maximum number of blocks per blob: 50,000. Take a look at the operations on block blobs for more information about the REST API calls (including Put Block and Put Block List). Answer 2: Update: as of December 2016 the maximum size of a Block Blob has increased to
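
The limits quoted in the answers come from the block model: a block blob is committed from up to 50,000 blocks, and the 4.77TB figure is 50,000 blocks of 100MB each. A hedged C# sketch of the Put Block / Put Block List flow with Azure.Storage.Blobs.Specialized (connection string, file path, and the 4MB block size are placeholders):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using Azure.Storage.Blobs.Specialized;

    class ChunkedUpload
    {
        static void Main()
        {
            var blob = new BlockBlobClient("<connection-string>", "mycontainer", "large.bin"); // placeholders
            const int blockSize = 4 * 1024 * 1024; // 4MB here; up to 100MB per block is allowed

            var blockIds = new List<string>();
            using (var file = File.OpenRead("large.bin"))
            {
                var buffer = new byte[blockSize];
                int read, index = 0;
                while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Put Block: stage one chunk under a base64-encoded, fixed-length id.
                    string blockId = Convert.ToBase64String(BitConverter.GetBytes(index++));
                    using (var chunk = new MemoryStream(buffer, 0, read))
                        blob.StageBlock(blockId, chunk);
                    blockIds.Add(blockId);
                }
            }

            // Put Block List: commit the staged blocks, in order, as the blob's content.
            blob.CommitBlockList(blockIds);
        }
    }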

Azure blob storage and security best practices

时光毁灭记忆、已成空白 submitted on 2019-12-23 06:39:48
Question: When exploring Azure storage I've noticed that access to a storage container is done through a shared key. There is concern where I work that if a developer uses this key for an application they're building and then leaves the company, they could still log in to the storage account and delete anything they want. The workaround for this would be to regenerate the secondary key for the account, but then we'd have to change the keys in every application that uses them. Is it
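
One commonly recommended mitigation is to keep the account key on the server side only and hand applications (and developers) short-lived, scoped SAS tokens, so regenerating the key does not ripple through every app. A sketch with Azure.Storage.Sas (account name, key, container, blob, and the one-hour lifetime are placeholders):

    using System;
    using Azure.Storage;
    using Azure.Storage.Blobs;
    using Azure.Storage.Sas;

    class IssueSas
    {
        // Returns a read-only URL for one blob, valid for one hour; the account key never
        // leaves this code, so apps built against the SAS cannot delete anything.
        static Uri GetReadOnlyBlobUri(string accountName, string accountKey,
                                      string containerName, string blobName)
        {
            var credential = new StorageSharedKeyCredential(accountName, accountKey);

            var sasBuilder = new BlobSasBuilder
            {
                BlobContainerName = containerName,
                BlobName = blobName,
                Resource = "b",                              // "b" = blob-level SAS
                ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
            };
            sasBuilder.SetPermissions(BlobSasPermissions.Read);

            var blobUri = new Uri($"https://{accountName}.blob.core.windows.net/{containerName}/{blobName}");
            return new BlobUriBuilder(blobUri)
            {
                Sas = sasBuilder.ToSasQueryParameters(credential)
            }.ToUri();
        }
    }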

Blob SAS, WCF, and performance

丶灬走出姿态 submitted on 2019-12-23 06:30:10
Question: This link talks about performance and bypassing the portal. To me, a WCF service that authenticates is similar to a portal. A lightweight service authenticates the client as needed and then generates a SAS. Once the client receives the SAS, they can access storage account resources directly with the permissions defined by the SAS and for the interval allowed by the SAS. The SAS mitigates the need for routing all data through the front-end proxy service. The application is a thick .NET WPF client
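
The pattern the quoted text describes is exactly that: the thick client calls the lightweight WCF service once to obtain a SAS URI, then uploads or downloads against blob storage directly, so the bulk data never flows through the service. A client-side sketch under that assumption (GetSasUriFromService is a hypothetical stand-in for whatever WCF operation returns the SAS; the blob and file names are placeholders):

    using System;
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;

    class DirectBlobAccess
    {
        // Hypothetical call into the WCF service that authenticates the user
        // and returns a time-limited SAS URI for one blob.
        static Task<Uri> GetSasUriFromService(string blobName) =>
            throw new NotImplementedException("placeholder for the WCF call");

        static async Task UploadDirectlyAsync(string blobName, string localPath)
        {
            Uri sasUri = await GetSasUriFromService(blobName);

            // The SAS URI already carries the permissions and expiry, so no other
            // credentials are needed; the upload goes straight to the storage account.
            var blob = new BlobClient(sasUri);
            await blob.UploadAsync(localPath, overwrite: true);
        }
    }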

(Not Found) Error in Azure Mobile Services .NET Backend

假装没事ソ submitted on 2019-12-23 06:07:51
Question: I've been stuck with that error to the point of madness ... please help. I have created an Azure Mobile Service .NET backend, and am now trying to call its Post function from a Xamarin Android client. I initialize and call the Insert async function (these are just snippets from my code):

    private static IMobileServiceTable<Todo> _todoMobileServiceTable;

    public static bool? InitializeAms()
    {
        try
        {
            CurrentPlatform.Init();
            _mobileServiceClient = new MobileServiceClient(applicationUrl, applicationKey);
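
For reference, a hedged sketch of how the initialization and insert typically look with the old Azure Mobile Services client (Microsoft.WindowsAzure.MobileServices). The names mirror the snippet above, but applicationUrl, applicationKey, and the Todo shape are illustrative, not taken from the question:

    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.MobileServices;

    public class Todo
    {
        public string Id { get; set; }
        public string Text { get; set; }
    }

    public static class AmsClient
    {
        private static MobileServiceClient _mobileServiceClient;
        private static IMobileServiceTable<Todo> _todoMobileServiceTable;

        public static void InitializeAms(string applicationUrl, string applicationKey)
        {
            // CurrentPlatform.Init() is required on Xamarin.Android before first use.
            CurrentPlatform.Init();
            _mobileServiceClient = new MobileServiceClient(applicationUrl, applicationKey);

            // The table name must match the controller name on the .NET backend
            // (TodoController => "Todo"); a mismatch is a common cause of 404 Not Found.
            _todoMobileServiceTable = _mobileServiceClient.GetTable<Todo>();
        }

        public static Task AddTodoAsync(Todo item) =>
            _todoMobileServiceTable.InsertAsync(item);
    }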