Caching in C#/.NET

Asked by 离开以前 on 2020-11-30 02:22 · 12 answers · 1232 views

I wanted to ask: what is the best approach to implementing a cache in C#? Is there a way to do it using built-in .NET classes, or something like that? Perhaps something like a…

12 Answers
  • 2020-11-30 03:03

    For Local Stores

    • .NET MemoryCache
    • NCache Express
    • AppFabric Caching
    • ...
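A minimal sketch of the first option, using the `MemoryCache` from the Microsoft.Extensions.Caching.Memory NuGet package (the key `"answer"` and expiration value are arbitrary, chosen here just for illustration):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

class MemoryCacheDemo
{
    static void Main()
    {
        // An in-process cache; dispose it to release the trim timer.
        using var cache = new MemoryCache(new MemoryCacheOptions());

        // Absolute expiration relative to now.
        cache.Set("answer", 42, TimeSpan.FromMinutes(5));

        // TryGetValue returns false once the entry has expired or been trimmed.
        if (cache.TryGetValue("answer", out int value))
        {
            Console.WriteLine(value); // 42
        }
    }
}
```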
  • 2020-11-30 03:05

    You could use a Hashtable.

    It has very fast lookups, no key collisions, and your data will not be garbage collected while the table holds a reference to it.
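For reference, that looks like the sketch below. Note that a plain Hashtable (or the generic Dictionary<TKey, TValue>, which is usually preferred on modern .NET) never evicts anything on its own, so it is only a cache in the loosest sense:

```csharp
using System;
using System.Collections;

class HashtableDemo
{
    static void Main()
    {
        var cache = new Hashtable();

        // Indexer writes overwrite any existing value for the same key.
        cache["user:1"] = "Alice";

        // Indexer reads return null for missing keys instead of throwing.
        Console.WriteLine(cache["user:1"]);         // Alice
        Console.WriteLine(cache["user:2"] == null); // True
    }
}
```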

  • 2020-11-30 03:06

    As mentioned in other answers, the default choice using the .NET Framework is MemoryCache and the various related implementations in Microsoft NuGet packages (e.g. Microsoft.Extensions.Caching.Memory). All of these caches bound size in terms of memory used, and attempt to estimate memory used by tracking how total physical memory grows relative to the number of cached objects. A background thread then periodically 'trims' entries.

    MemoryCache etc. share some limitations:

    1. Keys are strings, so if the key type is not natively string, you will be forced to constantly allocate strings on the heap. This can really add up in a server application when items are 'hot'.
    2. Poor 'scan resistance': if some automated process rapidly loops through all the items that exist, the cache can grow faster than the background thread can trim it. This can result in memory pressure, page faults, induced GC, or, when running under IIS, recycling the process due to exceeding the private-bytes limit.
    3. Does not scale well with concurrent writes.
    4. Contains performance counters that cannot be disabled (which incur overhead).

    Your workload will determine the degree to which these things are problematic. An alternative approach to caching is to bound the number of objects in the cache (rather than estimating memory used). A cache replacement policy then determines which object to discard when the cache is full.

    Below is the source code for a simple cache with a least-recently-used (LRU) eviction policy:

    using System;
    using System.Collections.Concurrent;
    using System.Collections.Generic;
    using System.Threading;

    public sealed class ClassicLru<K, V>
    {
        private readonly int capacity;
        private readonly ConcurrentDictionary<K, LinkedListNode<LruItem>> dictionary;
        private readonly LinkedList<LruItem> linkedList = new LinkedList<LruItem>();
    
        private long requestHitCount;
        private long requestTotalCount;
    
        public ClassicLru(int capacity)
            // Defaults.ConcurrencyLevel in the original source; ProcessorCount is a reasonable stand-in.
            : this(Environment.ProcessorCount, capacity, EqualityComparer<K>.Default)
        {
        }
    
        public ClassicLru(int concurrencyLevel, int capacity, IEqualityComparer<K> comparer)
        {
            if (capacity < 3)
            {
                throw new ArgumentOutOfRangeException(nameof(capacity), "Capacity must be greater than or equal to 3.");
            }
    
            if (comparer == null)
            {
                throw new ArgumentNullException(nameof(comparer));
            }
    
            this.capacity = capacity;
            this.dictionary = new ConcurrentDictionary<K, LinkedListNode<LruItem>>(concurrencyLevel, this.capacity + 1, comparer);
        }
    
        public int Count => this.linkedList.Count;
    
        public double HitRatio => (double)requestHitCount / (double)requestTotalCount;
    
        ///<inheritdoc/>
        public bool TryGet(K key, out V value)
        {
            Interlocked.Increment(ref requestTotalCount);
    
            LinkedListNode<LruItem> node;
            if (dictionary.TryGetValue(key, out node))
            {
                LockAndMoveToEnd(node);
                Interlocked.Increment(ref requestHitCount);
                value = node.Value.Value;
                return true;
            }
    
            value = default(V);
            return false;
        }
    
        public V GetOrAdd(K key, Func<K, V> valueFactory)
        {
            if (this.TryGet(key, out var value))
            {
                return value;
            }
    
            var node = new LinkedListNode<LruItem>(new LruItem(key, valueFactory(key)));
    
            if (this.dictionary.TryAdd(key, node))
            {
                LinkedListNode<LruItem> first = null;
    
                lock (this.linkedList)
                {
                    if (linkedList.Count >= capacity)
                    {
                        first = linkedList.First;
                        linkedList.RemoveFirst();
                    }
    
                    linkedList.AddLast(node);
                }
    
                // Remove from the dictionary outside the lock. This means that the dictionary at this moment
                // contains an item that is not in the linked list. If another thread fetches this item, 
                // LockAndMoveToEnd will ignore it, since it is detached. This means we potentially 'lose' an 
                // item just as it was about to move to the back of the LRU list and be preserved. The next request
                // for the same key will be a miss. Dictionary and list are eventually consistent.
                // However, all operations inside the lock are extremely fast, so contention is minimized.
                if (first != null)
                {
                    // TryRemove can fail if another thread removed the key first;
                    // only dispose the evicted value if this thread actually removed it.
                    if (dictionary.TryRemove(first.Value.Key, out var removed)
                        && removed.Value.Value is IDisposable d)
                    {
                        d.Dispose();
                    }
                }
    
                return node.Value.Value;
            }
    
            return this.GetOrAdd(key, valueFactory);
        }
    
        public bool TryRemove(K key)
        {
            if (dictionary.TryRemove(key, out var node))
            {
                // If the node has already been removed from the list, ignore.
                // E.g. thread A reads x from the dictionary. Thread B adds a new item, removes x from 
                // the List & Dictionary. Now thread A will try to move x to the end of the list.
                if (node.List != null)
                {
                    lock (this.linkedList)
                    {
                        if (node.List != null)
                        {
                            linkedList.Remove(node);
                        }
                    }
                }
    
                if (node.Value.Value is IDisposable d)
                {
                    d.Dispose();
                }
    
                return true;
            }
    
            return false;
        }
    
        // Thread A reads x from the dictionary. Thread B adds a new item. Thread A moves x to the end. Thread B now removes the new first node (removal is atomic on both data structures).
        private void LockAndMoveToEnd(LinkedListNode<LruItem> node)
        {
            // If the node has already been removed from the list, ignore.
            // E.g. thread A reads x from the dictionary. Thread B adds a new item, removes x from 
            // the List & Dictionary. Now thread A will try to move x to the end of the list.
            if (node.List == null)
            {
                return;
            }
    
            lock (this.linkedList)
            {
                if (node.List == null)
                {
                    return;
                }
    
                linkedList.Remove(node);
                linkedList.AddLast(node);
            }
        }
    
        private class LruItem
        {
            public LruItem(K k, V v)
            {
                Key = k;
                Value = v;
            }
    
            public K Key { get; }
    
            public V Value { get; }
        }
    }
    

    This is just to illustrate a thread safe cache - it probably has bugs and can be a bottleneck under heavy concurrent workloads (e.g. in a web server).

    A thoroughly tested, production-ready, scalable concurrent implementation is a bit beyond a Stack Overflow post. To solve this in my projects, I implemented a thread-safe pseudo-LRU (think ConcurrentDictionary, but with constrained size). Performance is very close to a raw ConcurrentDictionary: ~10x faster than MemoryCache, ~10x better concurrent throughput than the ClassicLru above, and a better hit rate. A detailed performance analysis is provided in the GitHub link below.

    Usage looks like this:

    int capacity = 666;
    var lru = new ConcurrentLru<int, SomeItem>(capacity);
    
    var value = lru.GetOrAdd(1, (k) => new SomeItem(k));
    

    GitHub: https://github.com/bitfaster/BitFaster.Caching

    Install-Package BitFaster.Caching
    
  • 2020-11-30 03:12

    Memory cache implementation for .NET Core:

    public class CachePocRepository : ICachedEmployeeRepository
        {
            private readonly IEmployeeRepository _employeeRepository;
            private readonly IMemoryCache _memoryCache;
    
            public CachePocRepository(
                IEmployeeRepository employeeRepository,
                IMemoryCache memoryCache)
            {
                _employeeRepository = employeeRepository;
                _memoryCache = memoryCache;
            }
    
            public async Task<Employee> GetEmployeeDetailsId(string employeeId)
            {
            if (_memoryCache.TryGetValue(employeeId, out Employee employee))
            {
                return employee;
            }
    
                employee = await _employeeRepository.GetEmployeeDetailsId(employeeId);
                
                _memoryCache.Set(employeeId,
                    employee,
                    new MemoryCacheEntryOptions()
                    {
                        AbsoluteExpiration = DateTimeOffset.UtcNow.AddDays(7),
                    });
    
            return employee;
        }
    }
    
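If the project already uses Microsoft.Extensions.Caching.Memory, the look-up-then-populate pattern above can also be collapsed into one call with the GetOrCreateAsync extension method. The sketch below simulates the repository call with a local async delegate (the employee id and return values are made up); note that GetOrCreateAsync does not guarantee the factory runs only once under concurrent misses:

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

class GetOrCreateDemo
{
    static async Task Main()
    {
        using var cache = new MemoryCache(new MemoryCacheOptions());
        int factoryCalls = 0;

        // Looks the key up and runs the factory only on a cache miss.
        async Task<string> LoadAsync(string id)
        {
            return await cache.GetOrCreateAsync(id, async entry =>
            {
                entry.AbsoluteExpiration = DateTimeOffset.UtcNow.AddDays(7);
                factoryCalls++;
                await Task.Yield(); // stands in for the repository call
                return "employee-" + id;
            });
        }

        Console.WriteLine(await LoadAsync("42")); // employee-42 (factory ran)
        Console.WriteLine(await LoadAsync("42")); // employee-42 (served from cache)
        Console.WriteLine(factoryCalls);          // 1
    }
}
```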
  • 2020-11-30 03:13

    If you're using ASP.NET, you could use the Cache class (System.Web.Caching).

    Here is a good helper class: c-cache-helper-class

    If you mean caching in a windows form app, it depends on what you're trying to do, and where you're trying to cache the data.

    We've implemented a cache behind a web service for certain methods (using the System.Web.Caching object).

    However, you might also want to look at the Caching Application Block (see here), which is part of the Enterprise Library for .NET Framework 2.0.

  • 2020-11-30 03:13

    You can use the ObjectCache.

    See http://msdn.microsoft.com/en-us/library/system.runtime.caching.objectcache.aspx
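A minimal sketch using MemoryCache.Default, the built-in ObjectCache implementation in System.Runtime.Caching (built into .NET Framework 4+; on .NET Core it requires the System.Runtime.Caching NuGet package). The key and expiration values here are arbitrary:

```csharp
using System;
using System.Runtime.Caching;

class ObjectCacheDemo
{
    static void Main()
    {
        ObjectCache cache = MemoryCache.Default;

        var policy = new CacheItemPolicy
        {
            // Evict the entry if it is not read for 20 minutes.
            SlidingExpiration = TimeSpan.FromMinutes(20)
        };

        // Set inserts or overwrites the entry under the given key.
        cache.Set("config", "cached-value", policy);

        Console.WriteLine(cache.Get("config"));      // cached-value
        Console.WriteLine(cache.Contains("missing")); // False
    }
}
```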
