azure-storage

Maximum message size issue while reading a byte array from a WCF service into ASP.NET MVC 3

Submitted by 本小妞迷上赌 on 2019-12-13 02:47:10
Question: I am reading a PDF as a byte array from my WCF web service and returning it to the web application to create a temporary file. But somehow I get this exception while reading the byte array from the WCF service: "The maximum message size quota for incoming messages (65536) has been exceeded. To increase the quota, use the MaxReceivedMessageSize property on the appropriate binding element." My binding tag in the web application is as follows. I tried to replace the number 104857600 with …
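The quoted exception is normally resolved on the client side by raising the size quotas on the binding in web.config. A minimal sketch, assuming a basicHttpBinding; the binding name is a placeholder, not from the question, and must match the name your client endpoint references:

```xml
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- Hypothetical binding name; match it to the endpoint's bindingConfiguration -->
      <binding name="BasicHttpBinding_IPdfService"
               maxReceivedMessageSize="104857600"
               maxBufferSize="104857600">
        <!-- Large byte arrays also need the reader quota raised -->
        <readerQuotas maxArrayLength="104857600" />
      </binding>
    </basicHttpBinding>
  </bindings>
</system.serviceModel>
```

Note that maxReceivedMessageSize must be raised on the side that *receives* the large message, here the MVC client, and that maxArrayLength in readerQuotas is a separate, independent limit that is easy to miss.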

Create a file in a folder of an Azure-hosted website

Submitted by 北战南征 on 2019-12-13 02:39:31
Question: I need to copy a file from Azure Blob Storage to the "contents" folder of my Azure-hosted website, and I am struggling to make this work! Any help would be very much appreciated. The code works fine on my local server, but fails when hosted on Azure. Here is my function: public bool CopyFromAzure(string myContainer, string fileName, string filePath) { // Retrieve storage account from connection string. CloudStorageAccount storageAccount = CloudStorageAccount.Parse( ConfigurationManager …
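A common cause of "works locally, fails on Azure" here is building the destination path from a local drive letter instead of resolving it relative to the site root. A minimal sketch under that assumption, using the classic WindowsAzure.Storage client; the container, blob, and setting names are illustrative:

```csharp
using System.Configuration;
using System.IO;
using System.Web.Hosting;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static void CopyBlobToContents(string containerName, string fileName)
{
    CloudStorageAccount account = CloudStorageAccount.Parse(
        ConfigurationManager.AppSettings["StorageConnectionString"]);
    CloudBlockBlob blob = account.CreateCloudBlobClient()
        .GetContainerReference(containerName)
        .GetBlockBlobReference(fileName);

    // Resolve the path relative to the site root so it works both
    // locally and inside the App Service sandbox (D:\home\site\wwwroot).
    string destination = Path.Combine(
        HostingEnvironment.MapPath("~/contents"), fileName);

    blob.DownloadToFile(destination, FileMode.Create);
}
```

The key design point is HostingEnvironment.MapPath: any hard-coded path such as C:\inetpub\... will not exist inside the App Service sandbox.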

Azure Blob Lease Doesn't Release Upon Ungraceful Exit?

Submitted by 為{幸葍}努か on 2019-12-13 01:37:47
Question: Let me preface this problem with the following: I've only tested this using the storage emulator (SDK 1.5). Using a quickly built console app and the local storage emulator, I created a failure scenario to test how a blob lease behaves when the application exits ungracefully. In the Azure production version, this would take the form of multiple web role instances accessing a single blob and locking it via leases. I've had web role instances fail out, so I figured this would be a good test …
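The usual defense against a dead process holding a lock forever is to acquire the lease for a finite duration, so it expires on its own if the holder never releases it. A sketch of that pattern using the later managed lease API (SDK 2.x and up, not the protocol-level calls available in SDK 1.5); container and blob names are illustrative:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;
CloudBlockBlob blob = account.CreateCloudBlobClient()
    .GetContainerReference("locks")
    .GetBlockBlobReference("singleton.lck");

// A 30-second lease: if this process exits ungracefully, the lease
// simply expires and another instance can acquire it.
string leaseId = blob.AcquireLease(TimeSpan.FromSeconds(30), proposedLeaseId: null);
try
{
    // ... do exclusive work, calling RenewLease periodically to keep holding it ...
    blob.RenewLease(AccessCondition.GenerateLeaseCondition(leaseId));
}
finally
{
    // Release eagerly on the graceful path so others don't wait out the expiry.
    blob.ReleaseLease(AccessCondition.GenerateLeaseCondition(leaseId));
}
```

An infinite lease, by contrast, survives the holder's death and must be explicitly broken, which matches the behavior the question is probing.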

How to truncate SQL database on Microsoft Azure periodically

Submitted by £可爱£侵袭症+ on 2019-12-13 00:47:52
Question: I have a SQL database running on Microsoft Azure. To prevent it from getting too big, I have to truncate it periodically (e.g., every day or two). truncate table xxx is the SQL that I need to execute. So what is the easiest way to achieve this? I prefer not to write any C# code unless I have to. Can I use a web job that periodically runs a truncate SQL statement? Or can I use built-in functionality of SQL Database on Azure to achieve this? Thanks!

Answer 1: SQL Azure does not yet …
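If a scheduled WebJob does end up being the route, the C# involved is a few lines. A sketch of the method a timed or CRON-scheduled WebJob could invoke; the connection-string name is a placeholder, and the table name is the one from the question:

```csharp
using System.Configuration;
using System.Data.SqlClient;

public static void TruncateTable()
{
    // Placeholder connection-string name; configure it in the WebJob's app settings.
    string connStr = ConfigurationManager
        .ConnectionStrings["MyAzureSqlDb"].ConnectionString;

    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand("TRUNCATE TABLE dbo.xxx", conn))
    {
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}
```

Note that TRUNCATE TABLE requires elevated permissions (effectively ALTER on the table), so the login used by the job needs more than plain INSERT/DELETE rights.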

About Azure Table Secondary Index

Submitted by 你说的曾经没有我的故事 on 2019-12-13 00:04:19
Question: I know secondary indexes are not here yet: the feature is on the wish list and "planned". I'd like to get some ideas (or information from a reliable source) about the upcoming secondary indexes. First question: I noticed MS planned "secondary indexes"; does that mean we will be able to create as many indexes as we want on one table? Second question: the current index is PartitionKey + RowKey. If the answer to the first question is no, will the secondary index be RowKey + PartitionKey, or is there a good chance we can customize it?
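Until the platform feature ships, the common workaround is a hand-rolled secondary index: write each entity twice with the key roles swapped, so lookups by either value stay point queries. A sketch of the pattern, with an entirely illustrative entity shape; keeping the two rows consistent on update/delete is your responsibility:

```csharp
using Microsoft.WindowsAzure.Storage.Table;

public class Employee : TableEntity
{
    public Employee() { }
    public Employee(string pk, string rk) { PartitionKey = pk; RowKey = rk; }
    public string Email { get; set; }
}

public static void InsertWithManualIndex(
    CloudTable table, string dept, string id, string email)
{
    // Primary row: look up by department, then employee id.
    table.Execute(TableOperation.Insert(
        new Employee(dept, id) { Email = email }));

    // "Secondary index" row: same data with the keys swapped,
    // enabling an efficient point query by employee id alone.
    table.Execute(TableOperation.Insert(
        new Employee(id, dept) { Email = email }));
}
```

This also illustrates why the question's second scenario (RowKey + PartitionKey) is the natural default: it is exactly the reversed-key copy people already maintain by hand.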

Microsoft.WindowsAzure.Storage update to V8.2.1.0 has broken my code

Submitted by 谁都会走 on 2019-12-12 21:31:57
Question: I've created a WebJob that places items in a queue. This process worked perfectly well until I updated Microsoft.WindowsAzure.Storage to v8.2.1.0, and I'm now getting this error: 'Invalid storage account 'devstoreaccount1'. Please make sure your credentials are correct.' It was working perfectly well until the update. Is this a known issue? What's the fix?

Answer 1: According to this article, you can find: the client library uses a particular Storage Service version. In order to use the Storage Client …
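The practical upshot of that answer is usually twofold: install a Storage Emulator release new enough to speak the service version the v8.x client targets, and point the WebJob at the emulator via the standard development connection string rather than hand-built devstoreaccount1 credentials. A sketch of the relevant app.config fragment (the connection-string names are the WebJobs SDK defaults):

```xml
<connectionStrings>
  <!-- Shortcut understood by CloudStorageAccount.Parse; expands to the
       well-known devstoreaccount1 endpoints and key -->
  <add name="AzureWebJobsStorage" connectionString="UseDevelopmentStorage=true" />
  <add name="AzureWebJobsDashboard" connectionString="UseDevelopmentStorage=true" />
</connectionStrings>
```

If the emulator is current and the error persists against a real account, the same message can also indicate a malformed account name or key in the connection string, so that is worth re-checking after any package update.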

Uploading file directly from a URL in Storage Blob

Submitted by 百般思念 on 2019-12-12 20:46:19
Question: I have some large files (one of them is 10 GB). I want to store these files in Windows Azure Storage (blob) directly, instead of downloading them locally and then uploading them. Is there a way to just specify the URL and have the file uploaded to Azure Storage? Any help would be really appreciated; if it takes a combination of services, that also works fine :)

Answer 1: Yes, you can do this. Gaurav has a great post about copying from S3, but the same thing will work for any publicly …
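The mechanism behind that answer is the blob service's server-side copy: you hand the service a source URL and it pulls the bytes itself, so the 10 GB never flows through your machine. A sketch with the WindowsAzure.Storage client; the container, blob, and source URL are placeholders, and the source must be publicly readable (or carry a SAS):

```csharp
using System;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

CloudStorageAccount account = CloudStorageAccount.Parse("<your connection string>");
CloudBlockBlob target = account.CreateCloudBlobClient()
    .GetContainerReference("bigfiles")
    .GetBlockBlobReference("dataset.bin");

// Server-side copy: the storage service fetches the source URL itself.
// This call returns once the copy is *scheduled*, not when it completes.
target.StartCopy(new Uri("https://example.com/files/dataset.bin"));

// Poll the copy state until the service reports completion.
do
{
    Thread.Sleep(1000);
    target.FetchAttributes();
} while (target.CopyState.Status == CopyStatus.Pending);
```

Because the copy is asynchronous on the service side, production code should also handle CopyStatus.Failed and Aborted, not just Pending versus Success.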

Blob storage access from Azure App Service

Submitted by て烟熏妆下的殇ゞ on 2019-12-12 20:39:27
Question: I have an issue with accessing blob storage from an App Service Mobile App (not a Mobile Service). I previously had a Mobile Service running that accessed blob storage in the following way: // Set the URI for the Blob Storage service. Uri blobEndpoint = new Uri(string.Format("https://{0}.blob.core.windows.net", storageAccountName)); // Create the BLOB service client. CloudBlobClient blobClient = new CloudBlobClient(blobEndpoint, new StorageCredentials(storageAccountName, …
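In an App Service Mobile App, the endpoint-plus-credentials construction from the excerpt can be collapsed into parsing a single connection string, which is the form App Service configuration hands you. A sketch under that assumption; the setting name and container name are illustrative, not taken from the question:

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Connection strings configured in the App Service portal surface to the
// process as app settings / environment variables; the name below is an
// assumed example, not a fixed platform constant.
string connStr =
    Environment.GetEnvironmentVariable("MS_AzureStorageAccountConnectionString")
    ?? "UseDevelopmentStorage=true"; // local-development fallback

CloudBlobClient blobClient =
    CloudStorageAccount.Parse(connStr).CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference("uploads");
container.CreateIfNotExists();
```

The advantage over the hand-built endpoint is that the same code runs unchanged against the emulator, a test account, or production, driven purely by configuration.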

How to integration test Azure Web Jobs?

Submitted by 只谈情不闲聊 on 2019-12-12 20:25:41
Question: I have an ASP.NET Web API application with a supporting Azure WebJob whose functions are triggered by messages added to a storage queue by the API's controllers. Testing the Web API is simple enough using OWIN, but how do I test the WebJobs? Do I run a console app in memory in the test runner? Execute the function directly (that wouldn't be a proper integration test, though)? It is a continuous job, so the app doesn't exit. To make matters worse, Azure WebJob functions return void, so there's no …
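One middle ground the WebJobs SDK offers is invoking a single triggered function through a JobHost without ever starting the continuous listener, which sidesteps the "app doesn't exit" problem. A sketch; the Functions class and method name follow the standard SDK template and are assumed here:

```csharp
using Microsoft.Azure.WebJobs;

// Arrange: point the host at a test storage account (or the emulator).
var config = new JobHostConfiguration
{
    StorageConnectionString = "UseDevelopmentStorage=true",
    DashboardConnectionString = "UseDevelopmentStorage=true"
};
var host = new JobHost(config);

// Act: invoke the queue-triggered function once with a test payload,
// instead of calling host.RunAndBlock() and waiting forever.
host.Call(typeof(Functions).GetMethod("ProcessQueueMessage"),
          new { message = "integration-test-payload" });

// Assert: since the function returns void, verify its side effects
// (rows written, blobs created, downstream queue messages) afterwards.
```

For a fuller end-to-end test you can instead start the host, enqueue a real message via the storage client, poll for the expected side effect with a timeout, and then stop the host; JobHost.Call is the cheaper, more deterministic option.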

Azure Table Storage - How much data am I using?

Submitted by 我们两清 on 2019-12-12 19:35:57
Question: Does anyone know how I can identify how much data I'm storing in each table within each of my storage accounts? I know I can get the overall data used across all my storage accounts, but I'm trying to figure out how much each individual table is using. I don't think Azure offers anything out of the box, so how would I go about building something to figure this out?

Answer 1: There are two ways by which you can fetch the size of all tables in your storage account.

Option 1: Time-consuming way. Please refer to this …
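The "time-consuming way" amounts to enumerating every entity in the table and summing an estimated on-disk size per entity. A rough sketch using the WindowsAzure.Storage client; the per-field byte counts below are an approximation of the documented entity-size estimation formula, not an exact accounting:

```csharp
using Microsoft.WindowsAzure.Storage.Table;

public static long EstimateTableBytes(CloudTable table)
{
    long total = 0;
    var query = new TableQuery<DynamicTableEntity>();
    foreach (DynamicTableEntity e in table.ExecuteQuery(query))
    {
        // Approximate per-entity overhead plus keys stored as UTF-16.
        long size = 4 + (e.PartitionKey.Length + e.RowKey.Length) * 2;
        foreach (var p in e.Properties)
        {
            size += 8 + p.Key.Length * 2; // per-property overhead + name
            if (p.Value.PropertyType == EdmType.String && p.Value.StringValue != null)
                size += p.Value.StringValue.Length * 2;
            else if (p.Value.PropertyType == EdmType.Binary && p.Value.BinaryValue != null)
                size += p.Value.BinaryValue.Length;
            else
                size += 8; // numeric / bool / guid / date, roughly
        }
        total += size;
    }
    return total;
}
```

On large tables this is a full scan and will take real time and transaction cost, which is presumably why the answer labels it the time-consuming option; running it per table and summing gives the per-account breakdown the question asks for.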