azure-storage-blobs

Design help for parallel processing of Azure blobs and bulk copy to a SQL database (C#)

我与影子孤独终老i submitted on 2021-02-10 05:28:05
Question: I have a requirement to fetch blob files from Azure storage, read through them, extract the data, process it, and store it in a database. The volume of data fetched from each blob is high: around 40K records per file, and there are 70 such files in a folder. This is how I designed it: I use Parallel.ForEach on the list of blob files with a maximum parallelism of 4. In each loop, I open a stream over the blob (the OpenRead method), read through it, and fill a DataTable. If the DataTable size reaches 10,000, I will call…
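A minimal sketch of the design the question describes, assuming the legacy Microsoft.Azure.Storage.Blob SDK and System.Data.SqlClient; the single-column schema, the dbo.Records destination table, and the one-record-per-line parsing are illustrative stand-ins, not the asker's actual code:

using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Storage.Blob; // legacy SDK assumed

class BlobToSqlLoader
{
    // Processes up to 4 blobs at a time, flushing to SQL every 10,000 rows.
    public void Load(IEnumerable<CloudBlockBlob> blobs, string connectionString)
    {
        var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };
        Parallel.ForEach(blobs, options, blob =>
        {
            DataTable table = NewTable();
            using (var reader = new StreamReader(blob.OpenRead()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    table.Rows.Add(line); // stand-in for the real parsing logic
                    if (table.Rows.Count >= 10000)
                    {
                        BulkInsert(table, connectionString);
                        table = NewTable();
                    }
                }
            }
            if (table.Rows.Count > 0) BulkInsert(table, connectionString); // final partial batch
        });
    }

    static DataTable NewTable()
    {
        var table = new DataTable();
        table.Columns.Add("Value", typeof(string)); // illustrative single-column schema
        return table;
    }

    static void BulkInsert(DataTable table, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.Records" }) // hypothetical table
        {
            conn.Open();
            bulk.WriteToServer(table);
        }
    }
}

One design note: flushing every 10,000 rows keeps per-worker memory bounded, and each flush opens its own SqlConnection, so the four parallel workers never share connection state.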

How to Append a Text File in an Azure Blob with an Azure Function

谁说胖子不能爱 submitted on 2021-02-09 20:33:58
Question: I've got a text file I need to append data to daily with a timer Azure Function. The text file is a comma-separated file. I've created my CloudBlobClient and know how to create my Shared Access Policy and token; I just don't know how to use these to upload. I only know how to get an access URI from the tutorial I'm working with. Answer 1: You can try an append blob, which is optimized for append operations. According…
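A hedged sketch of the append-blob approach the answer suggests, staying with the legacy SDK that the question's CloudBlobClient implies; the container and blob names are illustrative:

using System;
using Microsoft.Azure.Storage.Blob; // legacy SDK, matching the question's CloudBlobClient

class DailyAppender
{
    // Appends one CSV line to an append blob, creating the blob on first use.
    public static void AppendLine(CloudBlobClient client, string csvLine)
    {
        CloudBlobContainer container = client.GetContainerReference("daily-data"); // illustrative name
        container.CreateIfNotExists();

        CloudAppendBlob blob = container.GetAppendBlobReference("log.csv"); // illustrative name
        if (!blob.Exists())
            blob.CreateOrReplace();

        blob.AppendText(csvLine + Environment.NewLine);
    }
}

If only a SAS URI is available, the same type can be constructed directly from it, e.g. new CloudAppendBlob(new Uri(blobSasUri)), and AppendText works the same way provided the SAS grants add/write permissions.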

How to create a folder in blob storage

有些话、适合烂在心里 submitted on 2021-02-08 10:27:18
Question: I have a file such as Parent.zip which, when unzipped, yields these files: child1.jpg, child2.txt, child3.pdf. When running Parent.zip through the function below, the files are correctly unzipped to some-container/child1.jpg, some-container/child2.txt, and some-container/child3.pdf. How do I unzip the files to their parent folder? The desired result would be some-container/Parent/child1.jpg, some-container/Parent/child2.txt, and some-container/Parent/child3.pdf. As you can see above, the folder…
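The original function isn't shown in this preview, so here is a minimal sketch assuming the Azure.Storage.Blobs client; the key change is prefixing each blob name with the zip's base name, since blob storage "folders" are just name prefixes:

using System.IO;
using System.IO.Compression;
using Azure.Storage.Blobs; // assumed client library

class ZipExtractor
{
    // Unzips an archive into a virtual "folder" by prefixing each blob name
    // with the zip's base name, producing e.g. "Parent/child1.jpg".
    public static void ExtractToFolder(Stream zipStream, string zipFileName, BlobContainerClient container)
    {
        string folder = Path.GetFileNameWithoutExtension(zipFileName); // "Parent"

        using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Read))
        {
            foreach (ZipArchiveEntry entry in archive.Entries)
            {
                if (entry.Length == 0) continue; // skip directory placeholder entries
                using (Stream content = entry.Open())
                using (var buffer = new MemoryStream())
                {
                    content.CopyTo(buffer); // buffer first: zip entry streams are not seekable
                    buffer.Position = 0;
                    container.GetBlobClient(folder + "/" + entry.FullName).Upload(buffer, overwrite: true);
                }
            }
        }
    }
}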

Azure Blob Storage successful requests show as failed requests in Application Insights

本小妞迷上赌 submitted on 2021-02-08 07:59:24
Question: The following container exists, so CreateIfNotExists returns failed-request code 409: var container = blobClient.GetContainerReference("my-container"); container.CreateIfNotExists(); I also check that a blob reference doesn't exist before creating it, which returns a 404 response code along with a bool: if (container.GetBlockBlobReference("this-file-could-exist").Exists()) { … In the first example I expect the container to exist; in the second I expect the file not to exist. But in both cases I do a check to…
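Those 409 and 404 responses are expected: CreateIfNotExists and Exists probe storage and interpret the "failure" status codes themselves, but Application Insights still records the underlying HTTP calls as failed dependencies. One common mitigation is a telemetry processor that reclassifies them; a hedged sketch assuming the Microsoft.ApplicationInsights SDK (the "Azure blob" dependency type string is an assumption and may vary by SDK version):

using Microsoft.ApplicationInsights.Channel;
using Microsoft.ApplicationInsights.DataContracts;
using Microsoft.ApplicationInsights.Extensibility;

// Marks expected existence-check responses from blob storage as successful
// so they no longer surface as failed dependencies.
public class ExpectedStorageResponseFilter : ITelemetryProcessor
{
    private readonly ITelemetryProcessor _next;
    public ExpectedStorageResponseFilter(ITelemetryProcessor next) { _next = next; }

    public void Process(ITelemetry item)
    {
        if (item is DependencyTelemetry dep
            && dep.Type == "Azure blob" // assumption: the type label used for blob dependency calls
            && (dep.ResultCode == "409" || dep.ResultCode == "404"))
        {
            dep.Success = true; // 409: container already exists; 404: blob absent; both expected here
        }
        _next.Process(item);
    }
}

The processor is registered through the telemetry configuration's processor chain (in code or ApplicationInsights.config), after which these expected responses stop counting against the failure rate.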

How to know the size of an Azure blob object via the Python Azure SDK

百般思念 submitted on 2021-02-08 07:33:52
Question: Following the Microsoft Azure documentation for Python developers: the azure.storage.blob.models.Blob class does have a __sizeof__() method, but it returns a constant value of 16 whether the blob is empty (0 bytes) or 1 GB. Is there any method/attribute of a blob object with which I can dynamically check the size of the object? To be clearer, this is what my source code looks like: for i in blobService.list_blobs(container_name=container, prefix=path): if i.name.endswith('.json…

Is it possible to generate a SAS (Shared Access Signature) with write permission for a given directory in Azure blob storage?

梦想与她 submitted on 2021-02-08 05:16:35
Question: Our blob storage account structure: container name: simple; inside this container we have the blobs aa/one.zip, aa/two.zip, bb/ss.zip, and bb/dd.zip. Is it possible to generate a SAS with write permission for the aa "directory" but no access to the bb "directory"? With Amazon AWS we can easily create restrictions based on object/blob name prefixes, but I can't find similar functionality in the Azure storage SDK for Java. Answer 1: As of today, it is not possible to do so at the folder level, because as such there's no…
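The workaround implied by the answer is to grant SAS at the levels that do exist: the whole container or an individual blob. A hedged sketch, shown in C# with the legacy storage SDK for consistency with the rest of this digest (the legacy Java SDK exposes an analogous generateSharedAccessSignature method); the one-hour lifetime is illustrative:

using System;
using Microsoft.Azure.Storage.Blob; // legacy SDK assumed

class SasExample
{
    // Issues a write-only SAS for one blob under the aa/ prefix. Because SAS
    // scopes to a container or a single blob, granting a "directory" means
    // issuing one blob-level token per blob that should be writable.
    public static string WriteSasForBlob(CloudBlobContainer container, string blobName)
    {
        CloudBlockBlob blob = container.GetBlockBlobReference(blobName); // e.g. "aa/one.zip"

        var policy = new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Write,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddHours(1) // illustrative lifetime
        };

        return blob.Uri + blob.GetSharedAccessSignature(policy);
    }
}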
