azure-table-storage

Azure - Querying 200 million entities

家住魔仙堡 submitted on 2019-12-04 17:06:58
I need to query a store of 200 million entities in Windows Azure. Ideally, I would like to use the Table Service rather than SQL Azure for this task. The use case is this: a POST containing a new entity arrives from a web-facing API, and we must query roughly 200 million existing entities to determine whether or not we may accept the new entity. Does the 1,000-entity query limit apply to this type of query, i.e. do I have to fetch 1,000 entities at a time and apply my comparisons/business rules, or can I query all 200 million entities in one shot? I suspect I would hit a timeout in the latter case…
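For context, server-side paging is how the Table service enforces the 1,000-entity cap: each request returns at most 1,000 results plus a continuation token, and the client loops until the token is null. A minimal sketch, assuming the WindowsAzure.Storage client and that `table` is a `CloudTable` you have already obtained:

```csharp
using Microsoft.WindowsAzure.Storage.Table;

// The service returns at most 1,000 entities per request; the continuation
// token tells us whether more segments remain.
var query = new TableQuery<DynamicTableEntity>().Where(
    TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, "somePartition"));

TableContinuationToken token = null;
do
{
    TableQuerySegment<DynamicTableEntity> segment = table.ExecuteQuerySegmented(query, token);
    token = segment.ContinuationToken;
    foreach (DynamicTableEntity entity in segment.Results)
    {
        // Apply the acceptance/business rules to each entity here.
    }
} while (token != null);
```

Note that walking 200 million entities this way means 200,000+ round trips; in practice you would design PartitionKey/RowKey so the acceptance check becomes a point lookup rather than a scan.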

Azure Table Storage CreateQuery in .NET Core

≡放荡痞女 submitted on 2019-12-04 16:11:28
Question: I'm porting my existing class library, which targets .NET Framework 4.6.2, to .NET Core 1.1. It looks like some of the methods available in the .NET Framework version are missing in .NET Core. Two such methods are table.CreateQuery and table.ExecuteQuery. Here's an existing function that gives me an error on CreateQuery: public T Get<T>(string partitionKey, string rowKey, string tableName) where T : ITableEntity, new() => getTable(tableName).CreateQuery<T>().Where(r => r.PartitionKey ==…
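The netstandard builds of the storage client at that time omitted the LINQ surface (CreateQuery/ExecuteQuery). For a lookup by both keys, one workaround is TableOperation.Retrieve, which the .NET Core build does expose. A sketch under those assumptions, with `getTable` returning a `CloudTable` as in the excerpt:

```csharp
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Table;

public async Task<T> GetAsync<T>(string partitionKey, string rowKey, string tableName)
    where T : class, ITableEntity, new()
{
    CloudTable table = getTable(tableName);

    // Retrieve is a point query on (PartitionKey, RowKey) -- the cheapest
    // operation the Table service offers, and available on .NET Core.
    TableOperation retrieve = TableOperation.Retrieve<T>(partitionKey, rowKey);
    TableResult result = await table.ExecuteAsync(retrieve);
    return result.Result as T;  // null if the entity does not exist
}
```

For filters that genuinely need a query rather than a point lookup, the segmented API (`ExecuteQuerySegmentedAsync` with a `TableQuery<T>`) fills the same role as ExecuteQuery.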

Using custom property types in Azure Tables with ASP.NET 5 DNX Core

孤街醉人 submitted on 2019-12-04 16:05:43
Azure Table Storage does not support many property types (List<>, TimeSpan, etc.). There are solutions such as Lucifure Stash and Lokad.Cloud, but they do not compile for DNX Core 5.0. Is there a way to add support for custom property types in Azure Tables with DNX Core? One solution is to use reflection to iterate over all the "custom" properties of the entity and serialize them to JSON strings. We can override TableEntity's ReadEntity and WriteEntity methods to hook the de-/serialization: using System; using System.Linq; using System.Reflection; using System.Collections.Generic; using…
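The excerpt's code is cut off; below is a minimal sketch of the pattern it describes, assuming the WindowsAzure.Storage client and Newtonsoft.Json. `JsonTableEntity`, `NativeTypes`, and `CustomProperties` are hypothetical names:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Table;
using Newtonsoft.Json;

public class JsonTableEntity : TableEntity
{
    // Types the Table service stores natively; everything else goes to JSON.
    private static readonly Type[] NativeTypes =
    {
        typeof(string), typeof(int), typeof(long), typeof(double), typeof(bool),
        typeof(DateTime), typeof(DateTimeOffset), typeof(Guid), typeof(byte[])
    };

    private IEnumerable<PropertyInfo> CustomProperties() =>
        GetType().GetProperties().Where(p =>
            p.CanRead && p.CanWrite && !NativeTypes.Contains(p.PropertyType));

    public override IDictionary<string, EntityProperty> WriteEntity(OperationContext ctx)
    {
        IDictionary<string, EntityProperty> properties = base.WriteEntity(ctx);
        foreach (PropertyInfo p in CustomProperties())
        {
            // Serialize each unsupported type into a plain string column.
            properties[p.Name] = new EntityProperty(JsonConvert.SerializeObject(p.GetValue(this)));
        }
        return properties;
    }

    public override void ReadEntity(IDictionary<string, EntityProperty> properties, OperationContext ctx)
    {
        base.ReadEntity(properties, ctx);
        foreach (PropertyInfo p in CustomProperties())
        {
            // Deserialize the JSON column back into the original type.
            if (properties.TryGetValue(p.Name, out EntityProperty value) && value.StringValue != null)
                p.SetValue(this, JsonConvert.DeserializeObject(value.StringValue, p.PropertyType));
        }
    }
}
```

Entities derived from this base then round-trip a `List<string>` or `TimeSpan` property as a single JSON string column.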

How to add a new column to an existing azure table storage

徘徊边缘 submitted on 2019-12-04 15:25:59
Question: We are using Azure Table Storage and have thousands of tables using the same schema. Now we want to add another column to these tables. How do we add a column to our existing tables without deleting and re-creating them? Answer 1: Windows Azure Table Storage doesn't actually have columns. Each entity (i.e. each row) is simply a set of properties with no fixed schema. If you're using a strongly-typed class to write to your table, you just need to add your new property to that…
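A short illustration of that schemaless behaviour, assuming the WindowsAzure.Storage client; `OrderEntity` and its properties are hypothetical:

```csharp
using Microsoft.WindowsAzure.Storage.Table;

public class OrderEntity : TableEntity
{
    public string Customer { get; set; }

    // The new "column": adding a property is all it takes. Entities written
    // before this change simply lack the property; on read it stays null.
    public string Status { get; set; }
}

// New writes carry the extra property; no table-level migration is needed.
var order = new OrderEntity { PartitionKey = "2015", RowKey = "0042", Customer = "Contoso", Status = "Shipped" };
table.Execute(TableOperation.InsertOrMerge(order));
```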

Windows Azure Table Services - Extended Properties and Table Schema

Deadly submitted on 2019-12-04 13:52:13
Question: I have an entity that, in addition to a few common properties, contains a list of extended properties stored as (Name, Value) pairs of strings within a collection. I should probably mention that these extended properties vary widely from instance to instance, and that they only need to be listed for each instance (there won't be any queries over the extended properties, for example finding all instances with a particular (Name, Value) pair). I'm exploring how I might persist this entity using…
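One standard way to persist such variable (Name, Value) pairs, offered as an assumption since the excerpt is cut off, is DynamicTableEntity, which maps each pair to its own table property without a fixed class:

```csharp
using Microsoft.WindowsAzure.Storage.Table;

// Hypothetical keys; each extended property becomes its own column, and
// different entities in the same table may carry entirely different sets.
var entity = new DynamicTableEntity("device-7", "settings");
entity.Properties["Colour"]   = new EntityProperty("red");
entity.Properties["Firmware"] = new EntityProperty("1.0.3");
table.Execute(TableOperation.InsertOrReplace(entity));
```

Since the question never queries on the extended properties, serializing the whole collection into a single JSON string property would work equally well.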

Storage Client Library 2.0 - Why is the API not as intuitive to use as 1.7?

我的梦境 submitted on 2019-12-04 13:42:12
Question: I am migrating to the new Storage Client Library for my Azure Table Storage. Querying with the previous Storage Client Library 1.7 namespace:

```csharp
var orders = serviceContext
    .CreateQuery<Order>(tableName)
    .AsTableServiceQuery<Order>()
    .Where(e => e.PartitionKey == partitionKey && e.RowKey == rowKey);
```

Querying with the new Storage Client Library 2.0 classes:

```csharp
string partitionKeyFilter = TableQuery.GenerateFilterCondition("PartitionKey", QueryComparisons.Equal, partitionKey);
string …
```
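The excerpt is cut off after the first filter; for context, the usual 2.0 pattern combines both filters and executes the query. A sketch assuming Storage Client Library 2.0 and a `CloudTable table`:

```csharp
string rowKeyFilter = TableQuery.GenerateFilterCondition("RowKey", QueryComparisons.Equal, rowKey);
string combined = TableQuery.CombineFilters(partitionKeyFilter, TableOperators.And, rowKeyFilter);

// Build and run the query against the table directly; no service context needed.
TableQuery<Order> query = new TableQuery<Order>().Where(combined);
var orders = table.ExecuteQuery(query);
```

The 2.0 string-based filters trade the 1.7 LINQ fluency for an API that maps one-to-one onto the REST query syntax, which is largely why it feels less intuitive.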

Is it possible to have the Windows Azure emulator open the browser to a URL other than 127.0.0.1

China☆狼群 submitted on 2019-12-04 11:57:45
A simple question, but with plenty of discussion behind it! Is it possible to have the Windows Azure emulator open the browser to a URL other than 127.0.0.1 and port 81? Follow these steps to change 127.0.0.1 to the desired IP (Compute Emulator settings): go to %ProgramFiles%\Microsoft SDKs\Windows Azure\Emulator\devfabric. Take a backup of "DevFC.exe.config" so that you can revert if something goes wrong. Then change the following settings to the desired IP address range and subnet:

```xml
<add key="StartIPAddress" value="192.168.0.20" />  <!-- can be an IP address on your machine -->
<add key="EndIPAddress" value="192.168.0.40" />
```

PartitionKey was not specified in azure table storage

浪子不回头ぞ submitted on 2019-12-04 09:39:49
I am trying to load/import data into Table Storage from a CSV file via Azure Storage Explorer, but I am getting the following error: An error occurred while opening the file 'D//sample.csv'. The required property 'PartitionKey' was not specified. Could you clarify the importance of PartitionKey and RowKey in Azure Table Storage? Azure Table Storage keys have been discussed here: Azure Table Storage Partition Key. To understand this, you will need to know what partitions are. Whenever you upload something to Azure Storage, it is assigned to some partition. These partitions can be either on…
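As for the import error: Storage Explorer expects every row of the CSV to supply the two key columns. A hedged illustration (the column names beyond the keys are hypothetical):

```csv
PartitionKey,RowKey,Name,Score
customers,0001,Alice,42
customers,0002,Bob,17
```

PartitionKey groups entities that are stored and served together, and RowKey uniquely identifies an entity within its partition; together they form the entity's primary key, which is why neither may be omitted.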

How to partition Azure tables used for storing logs

旧巷老猫 submitted on 2019-12-04 09:29:17
Question: We have recently updated our logging to use Azure Table Storage, which, owing to its low cost and high performance when querying by row and partition key, is well suited to this purpose. We are trying to follow the guidelines given in the document "Designing a Scalable Partitioning Strategy for Azure Table Storage". As we are making a great number of inserts to this table (and, as we scale, hopefully an increasing number), we need to ensure that we don't hit our limits, which would result in logs being lost.
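One common way to keep insert load from concentrating on a single partition, offered as an assumption since the excerpt stops before the chosen scheme, is to bucket the partition key by time:

```csharp
// Hypothetical scheme: one partition per log per hour. Writes rotate to a
// fresh partition every hour instead of hammering one partition forever,
// while time-ranged reads still touch only a handful of partitions.
static string LogPartitionKey(string logName, System.DateTime utcNow) =>
    $"{logName}-{utcNow:yyyyMMddHH}";
```

Finer buckets (minutes, or a hash suffix) spread the write load further, at the cost of fanning reads out across more partitions.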

Strategy for storing application logs in Azure Table Storage

感情迁移 submitted on 2019-12-04 07:57:43
I am trying to determine a good strategy for storing logging information in Azure Table Storage. I have the following: PartitionKey: the name of the log. RowKey: inverted DateTime ticks. The only issue here is that partitions could get very large (millions of entities) and will keep growing over time. That said, the queries being performed will always include the PartitionKey (no scanning) AND a RowKey filter (a minor scan). For example (in natural language): where PartitionKey = "MyApiLogs" and RowKey is between "01-01-15 12:00" and "01-01-15 13:00". Provided that…
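A minimal sketch of the inverted-ticks RowKey the excerpt describes (newest entries sort first, since the Table service orders rows lexicographically by key):

```csharp
// Subtract from DateTime.MaxValue.Ticks so newer timestamps yield
// lexicographically smaller RowKeys; zero-pad to 19 digits so string
// ordering matches numeric ordering.
string rowKey = (System.DateTime.MaxValue.Ticks - System.DateTime.UtcNow.Ticks).ToString("D19");
```

A time-range query then becomes a RowKey range filter within the partition, which is exactly the "minor scan" the excerpt mentions.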