Locking pattern for proper use of .NET MemoryCache

栀梦 2020-11-30 17:13

I assume this code has concurrency issues:

const string CacheKey = "CacheKey";
static string GetCachedData()
{
    string expensiveString = null;
    if (MemoryCache.Default.Contains(CacheKey))
    {
        expensiveString = MemoryCache.Default[CacheKey] as string;
    }
    else
    {
        // not cached yet: run the expensive call and store the result
        expensiveString = SomeHeavyAndExpensiveCalculation();
        MemoryCache.Default.Set(CacheKey, expensiveString, new CacheItemPolicy());
    }
    return expensiveString;
}
9 Answers
  •  星月不相逢
    2020-11-30 17:59

    It's a bit late, however... Full implementation:

        [HttpGet]
        public async Task<HttpResponseMessage> GetPageFromUriOrBody(RequestQuery requestQuery)
        {
            log(nameof(GetPageFromUriOrBody), nameof(requestQuery));
            var responseResult = await _requestQueryCache.GetOrCreate(
                nameof(GetPageFromUriOrBody)
                , requestQuery
                , (x) => getPageContent(x).Result);
            return Request.CreateResponse(System.Net.HttpStatusCode.Accepted, responseResult);
        }
        static MemoryCacheWithPolicy<RequestQuery, string> _requestQueryCache = new MemoryCacheWithPolicy<RequestQuery, string>();
    

    Here is the getPageContent signature:

    async Task<string> getPageContent(RequestQuery requestQuery);
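
    The body of getPageContent is not shown in the answer; purely as an illustration of the kind of expensive call being cached, a hypothetical implementation might look like the sketch below (the RequestQuery.Uri property and the HttpClient field are assumptions, not part of the original code):

    // requires: using System.Net.Http;
    static readonly HttpClient _httpClient = new HttpClient();

    // Hypothetical implementation: fetch the page body for the requested URI.
    async Task<string> getPageContent(RequestQuery requestQuery)
    {
        using (var response = await _httpClient.GetAsync(requestQuery.Uri))
        {
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsStringAsync();
        }
    }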
    

    And here is the MemoryCacheWithPolicy implementation:

    public class MemoryCacheWithPolicy<TParameter, TResult>
    {
        static ILogger _nlogger = new AppLogger().Logger;
        private MemoryCache _cache = new MemoryCache(new MemoryCacheOptions()
        {
            // SizeLimit caps the sum of the entry sizes assigned via SetSize();
            // it is a unit-less limit, not an actual memory (byte) size.
            SizeLimit = 1024
        });

        /// <summary>
        /// Gets or creates a memory cache record for a main data item,
        /// along with the parameter data that is associated with it.
        /// </summary>
        /// <param name="key">Main data cache memory key.</param>
        /// <param name="param">Parameter model that is associated with the main model (request result).</param>
        /// <param name="createCacheData">A delegate to create a new main data item to cache.</param>
        /// <returns>The cached (or newly created) main data item.</returns>
        public async Task<TResult> GetOrCreate(object key, TParameter param, Func<TParameter, TResult> createCacheData)
        {
            // this key is used for the param cache entry.
            var paramKey = key + nameof(param);

            if (!_cache.TryGetValue(key, out TResult cacheEntry))
            {
                // key is not in the cache, create data through the delegate.
                cacheEntry = createCacheData(param);
                createMemoryCache(key, cacheEntry, paramKey, param);

                _nlogger.Warn(" cache is created.");
            }
            else
            {
                // data is cached so far..., check whether the param model is the same (or changed).
                if (!_cache.TryGetValue(paramKey, out TParameter cacheParam))
                {
                    // exception: this case should not happen!
                }

                if (!cacheParam.Equals(param))
                {
                    // request param has changed, create data through the delegate.
                    cacheEntry = createCacheData(param);
                    createMemoryCache(key, cacheEntry, paramKey, param);
                    _nlogger.Warn(" cache is re-created (param model has been changed).");
                }
                else
                {
                    _nlogger.Trace(" cache is used.");
                }

            }
            return await Task.FromResult(cacheEntry);
        }

        MemoryCacheEntryOptions createMemoryCacheEntryOptions(TimeSpan slidingOffset, TimeSpan relativeOffset)
        {
            // Keep data for [slidingOffset] of inactivity,
            // and request a new result after [relativeOffset] in any case.
            return new MemoryCacheEntryOptions()

                // Size of this entry in the same unit-less terms as SizeLimit above
                // (1 unit per entry, which effectively makes SizeLimit an entry-count limit).
                .SetSize(1)

                // Priority for eviction when the size limit is reached (memory pressure).
                .SetPriority(CacheItemPriority.High)

                // Keep in cache for this amount of time, reset if accessed.
                .SetSlidingExpiration(slidingOffset)

                // Remove from cache after this time, regardless of sliding expiration.
                .SetAbsoluteExpiration(relativeOffset);
        }

        void createMemoryCache(object key, TResult cacheEntry, object paramKey, TParameter param)
        {
            // Keep data for 2 seconds of inactivity,
            // and request a new result after 5 seconds in any case.
            var cacheEntryOptions = createMemoryCacheEntryOptions(
                TimeSpan.FromSeconds(2)
                , TimeSpan.FromSeconds(5));

            // Save data in cache.
            _cache.Set(key, cacheEntry, cacheEntryOptions);

            // Save param in cache.
            _cache.Set(paramKey, param, cacheEntryOptions);
        }

        void checkCacheEntry<T>(object key, string name)
        {
            _cache.TryGetValue(key, out T value);
            _nlogger.Fatal("Key: {0}, Name: {1}, Value: {2}", key, name, value);
        }
    }
    

    nlogger is just an NLog object used to trace the MemoryCacheWithPolicy behavior. I re-create the memory cache through the delegate (Func<TParameter, TResult> createCacheData) if the request object (RequestQuery requestQuery) has changed, or when the sliding or absolute expiration reaches its limit. Note that everything is async too ;)
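
    For illustration only (not part of the original answer), here is a minimal usage sketch; the PageQuery record and LoadPage helper are assumed placeholders. Note that the parameter type needs value-based equality (e.g. a C# record or a custom Equals) for the "param model has been changed" check to behave as intended:

    using System.Threading.Tasks;

    // Hypothetical parameter type: a record gives value-based Equals,
    // which the cacheParam.Equals(param) check in GetOrCreate relies on.
    public record PageQuery(string Path, int Page);

    public static class MemoryCacheWithPolicyDemo
    {
        static readonly MemoryCacheWithPolicy<PageQuery, string> _cache =
            new MemoryCacheWithPolicy<PageQuery, string>();

        // Stand-in for the expensive call that produces the data to cache.
        static string LoadPage(PageQuery query) => $"content of {query.Path}, page {query.Page}";

        public static async Task Run()
        {
            var q1 = new PageQuery("/home", 1);
            var a = await _cache.GetOrCreate("page", q1, LoadPage); // logs "cache is created."
            var b = await _cache.GetOrCreate("page", q1, LoadPage); // logs "cache is used."

            var q2 = new PageQuery("/home", 2);                     // same key, different param
            var c = await _cache.GetOrCreate("page", q2, LoadPage); // logs "cache is re-created (param model has been changed)."
        }
    }

    One caveat tying back to the question: GetOrCreate does not take a lock between TryGetValue and Set, so two concurrent first requests can still both invoke the create delegate. This class addresses parameter-aware re-creation and expiration rather than the locking part of the question.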
