azure-table-storage

Best approach to deploy tables into table storage

Submitted by 倾然丶 夕夏残阳落幕 on 2019-12-03 00:04:57
Question: Could you let me know the best way to do Table Storage deployments? My dev team has many tables, each with thousands of entries, and they have asked me to check with Microsoft or community blogs for the recommended approach. Our current scripts delete and re-insert thousands of entries every time. Do we have a delta approach, where the deployment first checks all the table entries
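One hedged way to get "delta" behavior without a wholesale delete and re-insert is to upsert the seed data, so existing rows are updated in place and new rows are created. This sketch assumes the modern Azure.Data.Tables SDK; the names (`TableSeeder`, `Deploy`, `seedEntities`, the connection string) are illustrative placeholders, not from the question.

```csharp
using System.Collections.Generic;
using Azure.Data.Tables;

// Sketch only: upsert-based deployment instead of delete + re-insert.
public static class TableSeeder
{
    public static void Deploy(string connectionString, string tableName,
                              IEnumerable<TableEntity> seedEntities)
    {
        var client = new TableClient(connectionString, tableName);
        client.CreateIfNotExists();   // idempotent: no error if the table exists

        foreach (var entity in seedEntities)
        {
            // UpsertEntity inserts rows that are missing and replaces rows
            // that already exist, so the script never has to deplete the
            // table before reloading thousands of entries.
            client.UpsertEntity(entity, TableUpdateMode.Replace);
        }
    }
}
```

With `TableUpdateMode.Merge` instead of `Replace`, properties absent from the seed entity would be left untouched on the stored row, which is closer to a true delta.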

Azure Tables: best practices for choosing partition/row keys

Submitted by China☆狼群 on 2019-12-02 21:43:26
What are best practices for choosing partition/row keys for entities in Azure Tables? The common advice is to magically balance partition size against the number of partitions, but no one seems to have a good definition of how to accomplish that in three easy steps. Is there a general approach for choosing keys so that everything then just works? tomconte: There is a detailed article on this very subject on MSDN: Designing a scalable partitioning strategy for Windows Azure Storage. I've just recently been looking into the exact same thing and found this good-quality article, which covers
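One common middle ground between "one giant partition" and "one partition per entity" is to hash a natural key into a fixed number of partition buckets. This is a sketch of the idea, not a rule from the linked article; the bucket count of 100 and the `ForCustomer` name are illustrative choices. Note that `string.GetHashCode()` is not stable across .NET processes, hence the explicit FNV-1a hash.

```csharp
using System;

// Sketch: spread entities across a fixed number of partitions by hashing
// a natural key, so load distributes while related rows stay co-located.
public static class PartitionKeys
{
    public static string ForCustomer(string customerId, int buckets = 100)
    {
        // FNV-1a: stable and deterministic, so the same customer always
        // lands in the same partition across processes and deployments.
        uint hash = 2166136261;
        foreach (char c in customerId)
        {
            hash ^= c;
            hash *= 16777619;
        }
        return (hash % (uint)buckets).ToString("D3"); // e.g. "042"
    }
}
```

Queries for one customer still hit a single partition (fast point queries), while inserts from many customers spread across up to 100 partitions.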

Azure Table Storage transaction limitations

Submitted by 这一生的挚爱 on 2019-12-02 19:46:06
I'm running performance tests against ATS, and it behaves a bit strangely when using multiple virtual machines against the same table / storage account. The entire pipeline is non-blocking (async/await) and uses the TPL for concurrent and parallel execution. First of all, it's very strange that with this setup I'm only getting about 1,200 insertions. This is running on an L VM, i.e. 4 cores + 800 Mbps. I'm inserting 100,000 rows with a unique PK and a unique RK, which should give the best possible distribution. Even more deterministic is the following behavior: when I run 1 VM I get about 1,200
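For this kind of load test, a frequent client-side bottleneck is unbounded task spawning exhausting outbound connections long before the service limit is reached. A hedged sketch using the classic storage SDK (`Microsoft.WindowsAzure.Storage`), with a semaphore to cap requests in flight; `LoadTest`, `InsertAllAsync`, and the limit of 64 are illustrative, not from the question:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Table;

// Sketch: bounded-concurrency single-entity inserts with the classic SDK.
public static class LoadTest
{
    public static async Task InsertAllAsync(CloudTable table,
                                            IEnumerable<ITableEntity> rows,
                                            int maxInFlight = 64)
    {
        // Also raise ServicePointManager.DefaultConnectionLimit at startup;
        // its default is far too low for throughput testing.
        var gate = new SemaphoreSlim(maxInFlight);
        var tasks = rows.Select(async row =>
        {
            await gate.WaitAsync();
            try { await table.ExecuteAsync(TableOperation.Insert(row)); }
            finally { gate.Release(); }
        });
        await Task.WhenAll(tasks);
    }
}
```

Disabling Nagle (`ServicePointManager.UseNagleAlgorithm = false`) and Expect-100 continue are the other usual client-side tweaks before blaming the service.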

Code First & Identity with Azure Table Storage

Submitted by 断了今生、忘了曾经 on 2019-12-02 18:11:12
I'm working on a small web app and I've just hit the point in development where I need to start making database decisions. My original plan was to go EF Code First with MSSQL on Azure, because it simplifies the process of working with a database. However, while investigating my database hosting options on Azure, I discovered Azure Table Storage, which opened up the world of NoSQL to me. While the Internet is ablaze with chatter about the features of NoSQL, one of the biggest reasons I have managed to gather is that NoSQL stores entire objects as one record in the database without breaking up the
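The "store the whole object as one record" idea can be sketched in Table Storage by serializing the object graph into a single JSON property, instead of normalizing it across relational tables. This assumes the Azure.Data.Tables SDK; the `Order` type and property names are hypothetical examples, not from the question.

```csharp
using System.Collections.Generic;
using System.Text.Json;
using Azure.Data.Tables;

// Sketch: one entity = one whole object, serialized into a single column.
public record Order(string Id, string Customer, List<string> Items);

public static class OrderStore
{
    public static TableEntity ToEntity(Order order) =>
        new TableEntity(partitionKey: order.Customer, rowKey: order.Id)
        {
            ["Payload"] = JsonSerializer.Serialize(order) // entire graph, no joins
        };

    public static Order FromEntity(TableEntity entity) =>
        JsonSerializer.Deserialize<Order>(entity.GetString("Payload"))!;
}
```

The trade-off is that the payload is opaque to queries: you can filter only on PartitionKey/RowKey and any properties you promote to real columns.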

How to add new properties to an entity saved in Azure Table Storage?

Submitted by 和自甴很熟 on 2019-12-02 17:59:34
Question: I am working with an application where a .NET type derived from TableServiceObject that is saved to Azure Table Storage (call it "Person") has a collection of another entity ("events"). Of course, the collection property doesn't get saved to Azure Table Storage. What I would like is a variable number of properties on a Person storage entity, with a new property created whenever necessary. For example, if Person1 has attended one event before and attends a new event, my code needs to go
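Because a table has no fixed schema, per-entity properties ("Event1", "Event2", ...) can be added at save time through a dictionary-style entity rather than a fixed POCO. A hedged sketch with the modern Azure.Data.Tables SDK (in the classic SDK, `DynamicTableEntity` plays the same role); the `EventCount` bookkeeping property is my own convention, not from the post:

```csharp
using Azure.Data.Tables;

// Sketch: TableEntity is an open property bag, so each Person row can
// carry a different number of EventN columns.
public static class PersonEvents
{
    public static void AddEvent(TableClient table, string partitionKey,
                                string rowKey, string eventPointer)
    {
        TableEntity person = table.GetEntity<TableEntity>(partitionKey, rowKey);

        int count = person.GetInt32("EventCount") ?? 0;
        person[$"Event{count + 1}"] = eventPointer;  // new dynamic column
        person["EventCount"] = count + 1;

        // Merge keeps all other existing columns intact.
        table.UpdateEntity(person, person.ETag, TableUpdateMode.Merge);
    }
}
```

Beware the service limit of 252 custom properties per entity; past a handful of events, a child table keyed by the person's ID usually scales better.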

How to execute an Azure table storage query async? client version 4.0.1

Submitted by …衆ロ難τιáo~ on 2019-12-02 17:50:53
I want to execute queries asynchronously on Azure Storage Client version 4.0.1, but there is no ExecuteQueryAsync() method. Am I missing something? Should we continue to use ExecuteQuerySegmentedAsync instead? Thanks. I ended up writing an extension method that uses ExecuteQuerySegmentedAsync. I am not sure whether this solution is optimal; if anybody has any comments, please don't hesitate. public static async Task<IList<T>> ExecuteQueryAsync<T>(this CloudTable table, TableQuery<T> query, CancellationToken ct = default(CancellationToken), Action<IList<T>> onProgress = null) where T : ITableEntity, new() { var
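The extension method's body is cut off above. The well-known pattern it follows is to loop over ExecuteQuerySegmentedAsync until the continuation token is exhausted; this is a hedged reconstruction of that pattern, not the poster's exact code:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Table;

public static class CloudTableExtensions
{
    // Page through result segments, accumulating rows until the service
    // returns a null continuation token (i.e. no more pages).
    public static async Task<IList<T>> ExecuteQueryAsync<T>(
        this CloudTable table, TableQuery<T> query,
        CancellationToken ct = default(CancellationToken),
        Action<IList<T>> onProgress = null) where T : ITableEntity, new()
    {
        var items = new List<T>();
        TableContinuationToken token = null;
        do
        {
            TableQuerySegment<T> segment =
                await table.ExecuteQuerySegmentedAsync(query, token, ct);
            token = segment.ContinuationToken;
            items.AddRange(segment.Results);
            if (onProgress != null) onProgress(items);
        } while (token != null && !ct.IsCancellationRequested);
        return items;
    }
}
```

Each segment holds at most 1,000 entities, which is why the client only exposes the segmented call and leaves the paging loop to the caller.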

Painfully slow Azure table insert and delete batch operations

Submitted by 自古美人都是妖i on 2019-12-02 14:29:39
I am running into a huge performance bottleneck when using Azure Table Storage. I want to use tables as a sort of cache, so a long process may produce anywhere from hundreds to several thousand rows of data. The data can then be quickly queried by partition and row keys. Querying works pretty fast (extremely fast when using only partition and row keys; a bit slower, but still acceptable, when also searching through properties for a particular match). However, both inserting and deleting rows is painfully slow. Clarification: I want to clarify that even inserting a single batch
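Two constraints shape batch performance here: an entity group transaction only accepts entities that share a PartitionKey, and a batch holds at most 100 operations. Grouping and chunking before `ExecuteBatchAsync` is usually the first fix for slow bulk inserts. A sketch with the classic SDK; `BatchInsert` and `InsertAsync` are illustrative names:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Table;

// Sketch: group rows by partition, then submit them in 100-op batches.
public static class BatchInsert
{
    public static async Task InsertAsync(CloudTable table,
                                         IEnumerable<ITableEntity> rows)
    {
        foreach (var partition in rows.GroupBy(r => r.PartitionKey))
        {
            var pending = partition.ToList();
            for (int i = 0; i < pending.Count; i += 100) // 100-op batch limit
            {
                var batch = new TableBatchOperation();
                foreach (var row in pending.Skip(i).Take(100))
                    batch.Insert(row);
                await table.ExecuteBatchAsync(batch);
            }
        }
    }
}
```

If every row has a unique PartitionKey, every "batch" degenerates to one round trip per row, which looks exactly like the painfully slow behavior described.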

EntityFramework 7 with Azure Table Storage provider code samples

Submitted by 南楼画角 on 2019-12-02 11:23:55
Question: Looking for some code samples of EF7 with the Azure Table Storage provider. Answer 1: The Azure Table Storage provider was a prototype and will not be supported by EF Core until after 1.0. See https://github.com/aspnet/EntityFramework/issues/1142 Also note that the ATS provider was removed from the EF Core project in 2014 and has not been updated since. Source: https://stackoverflow.com/questions/34943317/entityframework-7-with-azure-table-storage-provider-code-samples

Azure diagnostics and WadLogsTable

Submitted by 眉间皱痕 on 2019-12-02 07:47:01
I deployed an application on Windows Azure and activated the diagnostics monitor as follows:

```csharp
public override bool OnStart()
{
    CloudStorageAccount account = CloudStorageAccount.Parse(
        "DefaultEndpointsProtocol=https;AccountName=[xxxxxx];AccountKey=[xxxxxxx]");
    var config = DiagnosticMonitor.GetDefaultInitialConfiguration();
    config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Information;
    config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1D);
    DiagnosticMonitor.Start(account, config);
    return base.OnStart();
}
```

My question is: why are the logs not stored automatically in "WADLogsTable"