azure-storage-blobs

Why is ListBlobsSegmentedAsync only returning results on second page?

折月煮酒 submitted on 2020-01-24 14:29:29
Question: I'm trying to grab one page of up to 5000 blobs, with no prefix. The container in question has roughly 26,000 blobs in it. I consistently get no results on the first page, but I noticed the BlobContinuationToken that's returned isn't null, so I can page again and get results on the second page. Why are there no results on the first page, but there are on the second? I'd like to be able to do this, and grab only one page: var response = await container.ListBlobsSegmentedAsync…
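The behaviour described above is allowed by the service: a segmented listing can legitimately return an empty page together with a non-null continuation token, so the robust pattern is to keep requesting pages until the token comes back null. A minimal Python sketch of that loop, with `fetch_page` as a hypothetical stand-in for `ListBlobsSegmentedAsync` (the simulated pages are illustrative):

```python
def list_all_blobs(fetch_page):
    """Drain a segmented listing: keep following continuation tokens,
    even across empty pages, until the token comes back None."""
    blobs, token = [], None
    while True:
        page, token = fetch_page(token)  # stand-in for ListBlobsSegmentedAsync
        blobs.extend(page)
        if token is None:
            return blobs

# Simulated service whose first page is empty but still carries a token,
# mimicking the behaviour described in the question.
pages = {None: ([], "t1"), "t1": (["a.txt", "b.txt"], "t2"), "t2": (["c.txt"], None)}
print(list_all_blobs(pages.get))  # ['a.txt', 'b.txt', 'c.txt']
```

The same loop shape applies if you only want the first non-empty page: break out as soon as `page` is non-empty instead of accumulating.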

Azure Blob Storage Indexer fails on images

旧时模样 submitted on 2020-01-17 05:23:15
Question: I'm using Azure Search with a Blob Storage indexer, and I'm seeing failures in the execution history: [ { "key": null, "errorMessage": "Document 'https://mystorage.blob.core.windows.net/my-documents/Document/Repository/F/AD/LO/LO-min-0002-00.png' has unsupported content type 'image/png'" } ] Does this failure cause other documents (with supported content types) in the storage not to be indexed? Answer 1: Yes, by default one failed document will stop indexing. You can increase that limit if you just have…
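The limit the answer refers to is set on the indexer definition. A hedged sketch of the relevant JSON (property names taken from the Azure Cognitive Search REST API; the values are illustrative): setting `maxFailedItems` to `-1` tolerates any number of failures, and image extensions can also be excluded from indexing outright.

```json
{
  "parameters": {
    "maxFailedItems": -1,
    "maxFailedItemsPerBatch": -1,
    "configuration": {
      "failOnUnsupportedContentType": false,
      "excludedFileNameExtensions": ".png,.jpeg"
    }
  }
}
```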

Uploading larger files to Azure Blob Storage takes a long time

一世执手 submitted on 2020-01-17 04:03:21
Question: Hi all... I am trying to upload larger files (more than 100 MB) to Azure Blob Storage. Below is the code. My problem is that even though I have used BeginPutBlock with TPL (Task Parallelism), it is taking a long time (20 minutes to upload 100 MB). But I have to upload files of more than 2 GB. Can anyone please help me with this? namespace BlobSamples { public class UploadAsync { static void Main(string[] args) { //string filePath = @"D:\Frameworks\DNCMag-Issue26-DoubleSpread.pdf";…
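Block-blob throughput usually comes from uploading the blocks in parallel and tuning the block size (in the .NET SDK this is what `BlobRequestOptions.ParallelOperationThreadCount` and `UploadFromFile`/`UploadFromStream` manage for you). The sketch below shows only the block-splitting bookkeeping behind `PutBlock`/`PutBlockList` in Python, with no network calls; the 4 MB default and the zero-padded block ids are illustrative choices, not the SDK's exact scheme.

```python
import base64

BLOCK_SIZE = 4 * 1024 * 1024  # 4 MB per block is a common default

def split_into_blocks(data, block_size=BLOCK_SIZE):
    """Split a payload into (block_id, chunk) pairs, the shape PutBlock
    expects; block ids must be Base64-encoded and all the same length."""
    blocks = []
    for i in range(0, len(data), block_size):
        block_id = base64.b64encode(f"{i // block_size:08d}".encode()).decode()
        blocks.append((block_id, data[i:i + block_size]))
    return blocks

def commit_block_list(blocks):
    """Local stand-in for PutBlockList, which commits blocks in the
    order they appear in the submitted list."""
    return b"".join(chunk for _, chunk in blocks)

data = bytes(range(256)) * 1000             # ~256 KB sample payload
blocks = split_into_blocks(data, block_size=64 * 1024)
assert commit_block_list(blocks) == data    # round-trips losslessly
print(len(blocks))  # 4
```

Because every block is independent, the `split_into_blocks` output is exactly what you would fan out across parallel upload tasks before a single final commit.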

Upload all files from local storage to Azure Blob Storage

荒凉一梦 submitted on 2020-01-17 03:13:08
Question: I am currently struggling to upload multiple files from local storage to Azure Blob Storage. I was wondering if anyone could help me; below is the code I was previously using to upload a single zip file. private void SaveZip(string id, string fileName, string contentType, byte[] data) { // Create a blob in container and upload image bytes to it var blob = this.GetContainer().GetBlobReference(fileName); blob.Properties.ContentType = contentType; // Create some metadata for this image…
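Extending the single-file code above to a whole directory usually has this shape: walk the local tree, derive a blob name from each file's relative path, and upload each file in turn. A Python sketch with the upload itself abstracted behind a callback (a dict stands in for the container here; in real code the callback would wrap something like `GetBlobReference(...).UploadFromByteArray`):

```python
import os
import tempfile

def upload_directory(local_dir, upload_blob):
    """Walk local_dir and hand each file to upload_blob(blob_name, data).
    Blob names mirror the relative path, with '/' separators."""
    uploaded = []
    for root, _dirs, files in os.walk(local_dir):
        for name in files:
            path = os.path.join(root, name)
            blob_name = os.path.relpath(path, local_dir).replace(os.sep, "/")
            with open(path, "rb") as f:
                upload_blob(blob_name, f.read())
            uploaded.append(blob_name)
    return sorted(uploaded)

# Demo against a temporary directory; a dict stands in for the container.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "sub"))
for rel, body in [("a.txt", b"alpha"), (os.path.join("sub", "b.txt"), b"beta")]:
    with open(os.path.join(root, rel), "wb") as f:
        f.write(body)

container = {}
print(upload_directory(root, container.__setitem__))  # ['a.txt', 'sub/b.txt']
```

Using `/` in the blob name preserves the folder structure, since blob storage treats `/` as a virtual directory separator.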

Security for files when hosting static websites in Azure blob storage?

蓝咒 submitted on 2020-01-16 13:03:06
Question: I have been able to host a static website in an Azure blob by using an HTTP redirect, as I describe here: I found a way to host a static website in an Azure blob, using a subdomain and an HTTP redirect. I do the following: 1) In Azure I create a storage account with a container (called docs) that has the Blob access policy. 2) I upload my static website to the docs container using Storage Explorer. This includes some PHP files in a subfolder. 3) In the DNS I set up a CNAME record for a…

Sort order of blob list in Azure

泪湿孤枕 submitted on 2020-01-15 10:44:08
Question: The List Blobs article on MSDN says: "Blobs are listed in alphabetical order in the response body, with upper-case letters listed first." However, does alphabetical ordering depend on which culture is being used, or is there a precise meaning of alphabetical order that they are implicitly referring to? I want to know the exact ordering scheme. I'm guessing (and hoping) that they use ordinal (binary) sorting rules, equivalent to StringComparison.Ordinal in .NET. Answer 1: Yes, you are…
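The answer confirms the guess: the ordering is ordinal. Python's default string sort also compares by Unicode code point, so it reproduces the documented "upper-case first" behaviour (as `StringComparison.Ordinal` does in .NET):

```python
names = ["zebra", "Apple", "apple", "Zebra", "banana"]
# Sorting by code point puts all upper-case letters (U+0041..U+005A)
# before any lower-case letters (U+0061..U+007A), which matches the
# List Blobs behaviour quoted above.
print(sorted(names))  # ['Apple', 'Zebra', 'apple', 'banana', 'zebra']
```

A culture-aware sort (e.g. `StringComparison.CurrentCulture`) would interleave the cases, which is why relying on ordinal semantics matters when you depend on blob listing order.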

Loading CSV from Azure Data Lake (Gen 2) to Azure SQL Database

馋奶兔 submitted on 2020-01-15 06:40:24
Question: I have an Azure Data Lake Storage (Gen 2) account with several containers. I would like to import the salesorderdetail.csv file from the Sales container into an Azure SQL database. I've successfully built the same process using Azure Data Factory, but I now want to get this working via standard T-SQL statements only. CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'XxxxxxXX#' CREATE DATABASE SCOPED CREDENTIAL MK_Cred_Data_Load WITH IDENTITY = 'SHARED ACCESS SIGNATURE', SECRET = 'sv…
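Continuing from the statements in the question, the usual next steps in Azure SQL Database are an external data source over the container and a `BULK INSERT` that references it. This is a sketch only: the data source name, target table, and location are illustrative, and the SAS secret in the credential must be pasted without the leading `?`.

```sql
-- External data source over the Sales container (names illustrative).
CREATE EXTERNAL DATA SOURCE Sales_Data_Lake
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://<account>.blob.core.windows.net/sales',
    CREDENTIAL = MK_Cred_Data_Load
);

-- Load the CSV into a staging table via the data source.
BULK INSERT dbo.SalesOrderDetailStaging
FROM 'salesorderdetail.csv'
WITH (
    DATA_SOURCE = 'Sales_Data_Lake',
    FORMAT = 'CSV',
    FIRSTROW = 2,           -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
```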

How to check whether a blob exists in the deleted list

删除回忆录丶 submitted on 2020-01-15 04:01:41
Question: The following code can tell whether a blob exists or not: var blob = client.GetContainerReference(containerName).GetBlockBlobReference(blobFileName); if (blob.Exists()) How can I also check whether the blob exists in the deleted list? Answer 1: Great question! If a blob is deleted and you check for its existence by calling the Exists() method, it will always tell you that the blob does not exist, and you will get a 404 (NotFound) error if you try to fetch its attributes. However, you can still find out if…
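The usual way to see soft-deleted blobs is to list with deleted entries included — `list_blobs(include=['deleted'])` in the Python SDK, or `BlobListingDetails.Deleted` in the .NET SDK the question uses — and inspect each entry's deleted flag. A small Python sketch of that check over an already-fetched listing (the tuple shape and the helper name are illustrative, not SDK API):

```python
def blob_status(blobs, name):
    """blobs: iterable of (name, deleted) pairs, as produced by a listing
    that includes soft-deleted blobs. Returns 'live', 'deleted', or 'absent'."""
    for blob_name, deleted in blobs:
        if blob_name == name:
            return "deleted" if deleted else "live"
    return "absent"

listing = [("report.pdf", False), ("old.csv", True)]
print(blob_status(listing, "old.csv"))  # deleted
```

This only works if soft delete is enabled on the storage account; otherwise deleted blobs are simply gone and never appear in any listing.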

Not able to see 'Lifecycle management' option for ADLS Gen2

前提是你 submitted on 2020-01-14 04:34:25
Question: I have created an ADLS (Azure Data Lake Storage) Gen2 resource (StorageV2 with hierarchical namespace enabled). The resource is in the Central US region, the performance/access tier is Standard/Hot, and replication is LRS. But for this resource I can't see the 'Lifecycle management' option in the portal. ADLS Gen2 is simply a StorageV2 account with hierarchical namespace enabled, and since the lifecycle management option exists for StorageV2 according to Microsoft documentation, it should be…
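For reference, once lifecycle management is available on the account (a policy can also be applied without the portal, e.g. with `az storage account management-policy create`), the policy itself is a JSON document along these lines; the rule name and day thresholds below are illustrative:

```json
{
  "rules": [
    {
      "name": "age-off-old-blobs",
      "enabled": true,
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": [ "blockBlob" ] },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "delete": { "daysAfterModificationGreaterThan": 365 }
          }
        }
      }
    }
  ]
}
```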