azure-storage

Publish to Azure fails with 500 Internal Server Error

Submitted by 旧街凉风 on 2020-07-21 09:27:12
Question: I have a cloud service on Windows Azure. I created an ASP.NET Web API project and published it to the cloud service; publishing from Visual Studio worked fine until I updated Visual Studio to Update 4 and the Azure SDK from 2.2 to 2.6. After updating, every publish attempt fails with the error messages below. I am not even able to publish a newly created project to a new Azure cloud service. Can anyone help? 11:00:31 PM - Warning: There are package validation warnings.

Replace the Contents inside Azure Storage

Submitted by 好久不见. on 2020-07-18 09:24:04
Question: Is there any way to replace a file if one with the same name already exists? I can't see any replace method in Azure Storage. Here is my code: var client = new CloudBlobClient(new Uri("http://sweetapp.blob.core.windows.net/"), credentials); var container = client.GetContainerReference("cakepictures"); await container.CreateIfNotExistsAsync(); var perm = new BlobContainerPermissions(); perm.PublicAccess = BlobContainerPublicAccessType.Blob; await container.SetPermissionsAsync(perm); var blockBlob =
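A minimal sketch of how a replace can look with the classic .NET Microsoft.WindowsAzure.Storage client used above: uploading to an existing blob name simply overwrites it, so no separate replace method is needed. The blob name and local path below are hypothetical, not taken from the question.

    using System;
    using System.Threading.Tasks;
    using Microsoft.WindowsAzure.Storage.Blob;

    class ReplaceBlobSketch
    {
        static async Task ReplaceAsync(CloudBlobContainer container, string localPath)
        {
            // Build a reference to the blob name to (re)upload; this does not call the service.
            CloudBlockBlob blockBlob = container.GetBlockBlobReference("cake1.jpg");

            // UploadFromFileAsync overwrites an existing blob with the same name by default,
            // which is effectively the "replace" being asked about.
            await blockBlob.UploadFromFileAsync(localPath);
        }
    }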

AzCopy Sync command is failing

Submitted by 六月ゝ 毕业季﹏ on 2020-07-10 10:27:16
Question: I'm issuing this command: azcopy sync "D:\Releases\Test\MyApp" "http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl&sig=REDACTED" ...and I'm getting this error: error parsing the input given by the user. Failed with error Unable to infer the source 'D:\Releases\Test\MyApp' / destination 'http://server3:10000/devstoreaccount1/myapp?sv=2019-02-02&st=2020-06-24T03%3A19%3A44Z&se=2020-06-25T03%3A19%3A44Z&sr=c&sp=racwdl
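The "Unable to infer the source / destination" message generally means AzCopy could not work out the location types, which is common when the destination is a local emulator URL rather than a standard *.blob.core.windows.net endpoint. A possible workaround, assuming an AzCopy v10 build whose sync command accepts the --from-to flag, is to state the pair explicitly; this is a sketch only, with the SAS replaced by a placeholder:

    azcopy sync "D:\Releases\Test\MyApp" "http://server3:10000/devstoreaccount1/myapp?<SAS token>" --from-to=LocalBlob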

How to check whether Azure Blob Storage upload was successful?

Submitted by 时间秒杀一切 on 2020-07-09 05:16:17
问题 I'm using an Azure SAS URL to upload a file to a blob storage: var blockBlob = new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob(new System.Uri(sasUrl)); blockBlob.UploadFromFile(filePath); The file exists on my disk, and the URL should be correct since it is automatically retrieved from the Windows Store Ingestion API (and, if I slightly change one character in the URL's signature part, the upload fails with HTTP 403). However, when checking var blobs = blockBlob.Container.ListBlobs();
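A sketch of one way to verify the upload with the same classic client, assuming the SAS grants at least read access to that blob: UploadFromFile throws a StorageException when the service rejects the request, and FetchAttributes round-trips to the blob afterwards. Listing the container, by contrast, needs a container-level SAS with list permission, so it is not a reliable success check here.

    using System;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class UploadCheckSketch
    {
        static bool TryUpload(string sasUrl, string filePath)
        {
            var blockBlob = new CloudBlockBlob(new Uri(sasUrl));
            try
            {
                // Throws a StorageException if the request is rejected (403, 404, timeouts, ...).
                blockBlob.UploadFromFile(filePath);

                // Round-trips to the service; succeeds only if the blob now exists and the SAS can read it.
                blockBlob.FetchAttributes();
                Console.WriteLine($"Uploaded {blockBlob.Properties.Length} bytes.");
                return true;
            }
            catch (StorageException ex)
            {
                Console.WriteLine($"Upload not confirmed: {ex.Message}");
                return false;
            }
        }
    }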

Set-AzStorageBlobContent throws exception: Illegal characters in path

Submitted by 懵懂的女人 on 2020-07-07 05:37:17
Question: I am migrating our Azure deployment scripts from AzureRM to Az, and it seems the new module has trouble opening the files. Any ideas? I tried replacing backslashes with forward slashes; I even ran the cmdlet from the folder where the scripts are, so I don't need to pass it the full file name, and it resolves the name to a full path correctly, but it still can't open the file. PS C:\dev\pq\service\scripts\azure\NestedTemplates> Set-AzStorageBlobContent -Container "florin-container" -Context

Execute Python scripts in Azure Data Factory

Submitted by 邮差的信 on 2020-07-05 11:06:58
Question: I have my data stored in blobs, and I have written a Python script to do some computations and create another CSV. How can I execute this in Azure Data Factory? Answer 1: You could use an Azure Data Factory V2 Custom Activity for this; it lets you directly execute a command that invokes your Python script. Please refer to the sample on GitHub. Hope it helps. Answer 2: Another option is using a DatabricksSparkPython activity. This makes sense if you want to scale out
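As a rough illustration of the second answer, a DatabricksSparkPython activity in a Data Factory V2 pipeline is declared along these lines; the linked service name, DBFS script path, and parameter values below are hypothetical, not taken from the question:

    {
        "name": "RunPythonScript",
        "type": "DatabricksSparkPython",
        "linkedServiceName": {
            "referenceName": "AzureDatabricksLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "pythonFile": "dbfs:/scripts/compute.py",
            "parameters": [ "--input", "input-container/data.csv" ]
        }
    }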
