I assume this code has concurrency issues:

const string CacheKey = "CacheKey";
static string GetCachedData()
{
    string expensiveString = null;
    if (MemoryCache.Default.Contains(CacheKey))
    ...
To avoid the global lock, you can use SingletonCache to implement one lock per key, without exploding memory usage: the lock objects are removed when no longer referenced, and acquire/release is thread safe, guaranteeing (via compare-and-swap) that only one lock instance is ever in use for a given key.
Using it looks like this:
using System;
using System.Runtime.Caching;
using BitFaster.Caching;

// One lock object per cache key; unreferenced lock objects are removed automatically.
static readonly SingletonCache<string, object> keyLocks = new SingletonCache<string, object>();

const string CacheKey = "CacheKey";

static string GetCachedData()
{
    string expensiveString = null;

    if (MemoryCache.Default.Contains(CacheKey))
    {
        return MemoryCache.Default[CacheKey] as string;
    }

    // double checked lock: take the per-key lock, then check the cache again
    using (var lifetime = keyLocks.Acquire(CacheKey))
    {
        lock (lifetime.Value)
        {
            if (MemoryCache.Default.Contains(CacheKey))
            {
                return MemoryCache.Default[CacheKey] as string;
            }

            CacheItemPolicy cip = new CacheItemPolicy()
            {
                AbsoluteExpiration = new DateTimeOffset(DateTime.Now.AddMinutes(20))
            };
            expensiveString = SomeHeavyAndExpensiveCalculation();
            MemoryCache.Default.Set(CacheKey, expensiveString, cip);
            return expensiveString;
        }
    }
}
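To see the per-key guarantee in action, here is a quick sketch (assuming the Lifetime handle returned by Acquire exposes the shared lock object via its Value property):

using System.Diagnostics;
using BitFaster.Caching;

var keyLocks = new SingletonCache<string, object>();

// Concurrent (or nested) acquires of the same key share a single lock object.
using (var first = keyLocks.Acquire("CacheKey"))
using (var second = keyLocks.Acquire("CacheKey"))
{
    Debug.Assert(ReferenceEquals(first.Value, second.Value));
}
// Once both lifetimes are disposed, the lock object is removed from the cache.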
Code is here on GitHub: https://github.com/bitfaster/BitFaster.Caching
Install-Package BitFaster.Caching
There is also an LRU implementation that is lighter weight than MemoryCache and has several advantages: faster concurrent reads and writes, bounded size, no background thread, internal perf counters, etc. (Disclaimer: I wrote it.)
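With the LRU, the per-key locking above isn't needed at all. A minimal sketch, assuming ConcurrentLru's GetOrAdd(key, valueFactory) API, with an illustrative class name, capacity, and a stand-in for SomeHeavyAndExpensiveCalculation from the question:

using BitFaster.Caching.Lru;

public static class LruCachedData
{
    // Bounded cache: at most 128 items, least recently used items evicted first.
    private static readonly ConcurrentLru<string, string> lru = new ConcurrentLru<string, string>(128);

    public static string GetCachedData()
    {
        // The factory only runs on a cache miss; hits return the stored value.
        return lru.GetOrAdd("CacheKey", key => SomeHeavyAndExpensiveCalculation());
    }

    private static string SomeHeavyAndExpensiveCalculation()
    {
        return "expensive result"; // stand-in for the real computation
    }
}

One difference from the MemoryCache version: this sketch has no 20 minute absolute expiration; see the README for the time-aware variants if you need expiry.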