How do I clear tracked entities in entity framework

Backend · Unresolved · 6 answers · 1218 views
不知归路 asked 2020-12-04 20:50

I am running some correction code over a big pile of entities. As it progresses, its speed decreases, because the number of tracked entities in the context keeps growing.

6 Answers
  • 2020-12-04 21:11

    From EF Core 3.0 there is an internal API that can reset the ChangeTracker. Do not use this in production code; I mention it because it may help someone in testing, depending on the scenario.

    using Microsoft.EntityFrameworkCore.Internal;
    
    _context.GetDependencies().StateManager.ResetState();
    

    As the comment on the code says:

    This is an internal API that supports the Entity Framework Core infrastructure and not subject to the same compatibility standards as public APIs. It may be changed or removed without notice in any release. You should only use it directly in your code with extreme caution and knowing that doing so can result in application failures when updating to a new Entity Framework Core release.
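    Since EF Core 5.0 the same reset is exposed publicly as ChangeTracker.Clear(), which detaches everything the context is tracking. A sketch of using it in a batched correction loop; the entities source, the FixUp method, and the batch size of 100 are hypothetical, and Chunk assumes .NET 6:

    ```csharp
    foreach (var batch in entities.Chunk(100))
    {
        foreach (var entity in batch)
            FixUp(entity); // hypothetical correction logic

        await _context.SaveChangesAsync();

        // Public API since EF Core 5.0: detaches all tracked entities,
        // so the tracker stays small between batches.
        _context.ChangeTracker.Clear();
    }
    ```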

  • 2020-12-04 21:12

    I'm running a Windows service that updates values every minute, and I have had the same problem. I tried @DavidSherret's solution, but after a few hours this got slow as well. My solution was simply to create a new context like this for every run. Simple, but it works.

    _dbContext = new DbContext();
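    To avoid keeping the previous context (and all its tracked entities) alive, the per-run context can be created and disposed explicitly. A sketch, assuming a MyDbContext subclass and a hypothetical ProcessRun method:

    ```csharp
    // Each run gets a fresh context with an empty change tracker;
    // disposing it releases everything the run tracked.
    using (var dbContext = new MyDbContext())
    {
        ProcessRun(dbContext); // hypothetical per-run work
        dbContext.SaveChanges();
    }
    ```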

  • 2020-12-04 21:15

    I just ran into this issue, and eventually stumbled upon a better solution for those using the typical .NET Core dependency injection. You can use a scoped DbContext for each operation. That will reset DbContext.ChangeTracker so that SaveChangesAsync() won't get bogged down checking entities from past iterations. Here is an example ASP.NET Core Controller method:

        /// <summary>
        /// An endpoint that processes a batch of records.
        /// </summary>
        /// <param name="provider">The service provider to create scoped DbContexts.
        /// This is injected by DI per the FromServices attribute.</param>
        /// <param name="records">The batch of records.</param>
        public async Task<IActionResult> PostRecords(
            [FromServices] IServiceProvider provider,
            Record[] records)
        {
            // The service scope factory is used to create a scope per iteration
            var serviceScopeFactory =
                provider.GetRequiredService<IServiceScopeFactory>();
    
            foreach (var record in records)
            {
                // At the end of the using block, scope.Dispose() will be called,
                // releasing the DbContext so it can be disposed/reset
                using (var scope = serviceScopeFactory.CreateScope())
                {
                    var context = scope.ServiceProvider.GetService<MainDbContext>();
    
                    // Query and modify database records as needed
    
                    await context.SaveChangesAsync();
                }
            }
    
            return Ok();
        }
    

    Given that ASP.NET Core projects typically use DbContextPool, this doesn't even create/destroy the DbContext objects. (In case you were interested, DbContextPool actually calls DbContext.ResetState() and DbContext.Resurrect(), but I wouldn't recommend calling those directly from your code, as they will probably change in future releases.) https://github.com/aspnet/EntityFrameworkCore/blob/v2.2.1/src/EFCore/Internal/DbContextPool.cs#L157
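    For the scope-per-iteration pattern above to benefit from pooling, the context has to be registered as pooled at startup. A sketch of that registration, assuming a MainDbContext, SQL Server, and a connection string named "Main":

    ```csharp
    public void ConfigureServices(IServiceCollection services)
    {
        // AddDbContextPool keeps a pool of MainDbContext instances;
        // scope.Dispose() returns an instance to the pool with its
        // state reset rather than destroying it.
        services.AddDbContextPool<MainDbContext>(options =>
            options.UseSqlServer(Configuration.GetConnectionString("Main")));
    }
    ```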

  • 2020-12-04 21:21

    In my experience, EF, or any ORM, does not work well under heavy load or with a complex model.

    If you don't want change tracking, why use an ORM at all?

    If speed is the main concern, nothing beats stored procedures and good indexing.

    Beyond that, if your queries are always by id, consider NoSQL, or perhaps SQL with just a key and a JSON column. That would avoid the impedance mismatch between classes and tables.

    For your scenario, loading everything into objects that way seems very slow to me. Stored procedures are better here because you avoid transporting data over the network, and SQL is far faster and better optimized for aggregation and similar operations.
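    As an illustration of the stored-procedure route, EF Core can still act as a thin caller so the correction runs entirely on the server. A sketch, where usp_CorrectRecords and cutoffDate are hypothetical and SqlParameter comes from Microsoft.Data.SqlClient:

    ```csharp
    // One round trip, no entities materialized or tracked on the client.
    var affected = await context.Database.ExecuteSqlRawAsync(
        "EXEC usp_CorrectRecords @Since",
        new SqlParameter("@Since", cutoffDate));
    ```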

  • 2020-12-04 21:29

    You can add a method to your DbContext or an extension method that uses the ChangeTracker to detach all the Added, Modified, and Deleted entities:

    public void DetachAllEntities()
    {
        var changedEntriesCopy = this.ChangeTracker.Entries()
            .Where(e => e.State == EntityState.Added ||
                        e.State == EntityState.Modified ||
                        e.State == EntityState.Deleted)
            .ToList();
    
        foreach (var entry in changedEntriesCopy)
            entry.State = EntityState.Detached;
    }
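    In the correction loop from the question, this method could be called after each saved batch so the tracker never grows unbounded. A sketch, where entities, Correct, and the batch size of 100 are hypothetical:

    ```csharp
    var processed = 0;
    foreach (var entity in entities)
    {
        Correct(entity); // hypothetical fix-up logic
        if (++processed % 100 == 0)
        {
            context.SaveChanges();       // flush pending changes first
            context.DetachAllEntities(); // then drop them from the tracker
        }
    }
    context.SaveChanges(); // flush the final partial batch
    ```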
    
  • 2020-12-04 21:31

    Option 1: detach the entry

    dbContext.Entry(entity).State = EntityState.Detached;
    

    When you detach the entry, the change tracker stops tracking it, which should result in better performance.

    See: http://msdn.microsoft.com/de-de/library/system.data.entitystate(v=vs.110).aspx

    Option 2: work with your own status field + disconnected contexts

    Maybe you want to control the status of your entity independently so you can work with disconnected graphs. Add a property for the entity status and translate it into dbContext.Entry(entity).State when performing operations (a repository is a good place to do this):

    public class Foo
    {
        public EntityStatus EntityStatus { get; set; }
    }
    
    public enum EntityStatus
    {
        Unmodified,
        Modified,
        Added
    }
    

    See following link for an example: https://www.safaribooksonline.com/library/view/programming-entity-framework/9781449331825/ch04s06.html
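    A minimal sketch of how a repository might perform that translation when saving; the FooRepository type is hypothetical, and Foo/EntityStatus are the types above:

    ```csharp
    public class FooRepository
    {
        private readonly DbContext _context;

        public FooRepository(DbContext context) => _context = context;

        public void Apply(Foo foo)
        {
            // Map the entity's own status onto the change tracker just
            // before saving, so the object graph can travel disconnected.
            _context.Entry(foo).State = foo.EntityStatus switch
            {
                EntityStatus.Added    => EntityState.Added,
                EntityStatus.Modified => EntityState.Modified,
                _                     => EntityState.Unchanged,
            };
        }
    }
    ```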
