For starters, let me just throw it out there that I know the code below is not thread safe (correction: it might be). What I am struggling with is finding an implementation that
Just uploaded a sample library that addresses this issue for .NET 2.0.
Take a look at this repo:
RedisLazyCache
I'm using a Redis cache, but it also fails over to just MemoryCache if the connection string is missing.
It's based on the LazyCache library, which guarantees a single execution of the write callback when multiple threads try to load and save data at the same time, which matters especially when the callback is very expensive to execute.
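To make the single-execution guarantee concrete, here is a minimal sketch of LazyCache-style usage (the `CachingService`/`IAppCache` types belong to the LazyCache library; check its docs for the exact signatures, and `ExpensiveLoad` is a made-up placeholder):

```csharp
// Hedged sketch: LazyCache aims to run the expensive delegate only once,
// even when many threads request the same key concurrently.
using System;
using LazyCache;

class Example
{
    static void Main()
    {
        IAppCache cache = new CachingService();

        // Concurrent callers for "key" share one execution of the factory.
        var value = cache.GetOrAdd("key", () => ExpensiveLoad());
        Console.WriteLine(value);
    }

    static string ExpensiveLoad() => "loaded";
}
```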
The default MS-provided MemoryCache is entirely thread safe. Any custom implementation that derives from MemoryCache may not be thread safe. If you're using plain MemoryCache out of the box, it is thread safe. Browse the source code of my open source distributed caching solution to see how I use it (MemCache.cs):
https://github.com/haneytron/dache/blob/master/Dache.CacheHost/Storage/MemCache.cs
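To illustrate that claim, here is a minimal sketch (not taken from Dache) hammering the default `System.Runtime.Caching.MemoryCache` from many threads. The cache's internal state stays consistent; only the values you store need their own thread safety:

```csharp
// Sketch only: 100 concurrent writers/readers on one key.
// Safe at the cache level - no torn reads or internal corruption.
using System;
using System.Runtime.Caching;
using System.Threading.Tasks;

class Demo
{
    static void Main()
    {
        var cache = MemoryCache.Default;
        var policy = new CacheItemPolicy { SlidingExpiration = TimeSpan.FromMinutes(5) };

        Parallel.For(0, 100, i =>
        {
            cache.Set("key", i, policy);
            // May observe any previously written value, but always a whole one.
            var value = cache.Get("key");
        });

        Console.WriteLine("done without corruption");
    }
}
```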
As @AmitE mentioned in @pimbrouwers's answer, his example does not work, as demonstrated here:
class Program
{
    static async Task Main(string[] args)
    {
        var cache = new MemoryCache(new MemoryCacheOptions());
        var tasks = new List<Task>();
        var counter = 0;

        for (int i = 0; i < 10; i++)
        {
            var loc = i;
            tasks.Add(Task.Run(() =>
            {
                var x = GetOrAdd(cache, "test", TimeSpan.FromMinutes(1), () => Interlocked.Increment(ref counter));
                Console.WriteLine($"Iteration {loc} got {x}");
            }));
        }

        await Task.WhenAll(tasks);
        Console.WriteLine("Total value creations: " + counter);
        Console.ReadKey();
    }

    public static T GetOrAdd<T>(IMemoryCache cache, string key, TimeSpan expiration, Func<T> valueFactory)
    {
        return cache.GetOrCreate(key, entry =>
        {
            entry.SetSlidingExpiration(expiration);
            return new Lazy<T>(valueFactory, LazyThreadSafetyMode.ExecutionAndPublication);
        }).Value;
    }
}
Output:
Iteration 6 got 8
Iteration 7 got 6
Iteration 2 got 3
Iteration 3 got 2
Iteration 4 got 10
Iteration 8 got 9
Iteration 5 got 4
Iteration 9 got 1
Iteration 1 got 5
Iteration 0 got 7
Total value creations: 10
It seems that GetOrCreate always returns the entry the calling thread just created. Luckily, that's very easy to fix:
public static T GetOrSetValueSafe<T>(IMemoryCache cache, string key, TimeSpan expiration, Func<T> valueFactory)
{
    if (cache.TryGetValue(key, out Lazy<T> cachedValue))
        return cachedValue.Value;

    cache.GetOrCreate(key, entry =>
    {
        entry.SetSlidingExpiration(expiration);
        return new Lazy<T>(valueFactory, LazyThreadSafetyMode.ExecutionAndPublication);
    });

    return cache.Get<Lazy<T>>(key).Value;
}
That works as expected:
Iteration 4 got 1
Iteration 9 got 1
Iteration 1 got 1
Iteration 8 got 1
Iteration 0 got 1
Iteration 6 got 1
Iteration 7 got 1
Iteration 2 got 1
Iteration 5 got 1
Iteration 3 got 1
Total value creations: 1
As others have stated, MemoryCache is indeed thread safe. The thread safety of the data stored within it, however, is entirely up to how you use it.
To quote Reed Copsey from his awesome post regarding concurrency and the ConcurrentDictionary<TKey, TValue> type, which is of course applicable here:

If two threads call this [GetOrAdd] simultaneously, two instances of TValue can easily be constructed.

You can imagine that this would be especially bad if TValue is expensive to construct.
To work your way around this, you can leverage Lazy<T> very easily, which coincidentally is very cheap to construct. Doing this ensures that if we get into a multithreaded situation, we're only building multiple instances of Lazy<T> (which is cheap).
GetOrAdd() (GetOrCreate() in the case of MemoryCache) will return the same, singular Lazy<T> to all threads; the "extra" instances of Lazy<T> are simply thrown away.
Since the Lazy<T> doesn't do anything until .Value is called, only one instance of the object is ever constructed.
Now for some code! Below is an extension method for IMemoryCache which implements the above. It arbitrarily sets SlidingExpiration based on an int seconds method parameter, but this is entirely customizable based on your needs.
Note this is specific to .NET Core 2.0 apps.
public static T GetOrAdd<T>(this IMemoryCache cache, string key, int seconds, Func<T> factory)
{
    return cache.GetOrCreate<T>(key, entry => new Lazy<T>(() =>
    {
        entry.SlidingExpiration = TimeSpan.FromSeconds(seconds);
        return factory.Invoke();
    }).Value);
}
To call:
IMemoryCache cache;
var result = cache.GetOrAdd("someKey", 60, () => new object());
To perform this all asynchronously, I recommend using Stephen Toub's excellent AsyncLazy<T> implementation found in his article on MSDN, which combines the built-in lazy initializer Lazy<T> with the promise Task<T>:
public class AsyncLazy<T> : Lazy<Task<T>>
{
    public AsyncLazy(Func<T> valueFactory) :
        base(() => Task.Factory.StartNew(valueFactory))
    { }

    public AsyncLazy(Func<Task<T>> taskFactory) :
        base(() => Task.Factory.StartNew(() => taskFactory()).Unwrap())
    { }
}
Now the async version of GetOrAdd():
public static Task<T> GetOrAddAsync<T>(this IMemoryCache cache, string key, int seconds, Func<Task<T>> taskFactory)
{
    return cache.GetOrCreateAsync<T>(key, async entry => await new AsyncLazy<T>(async () =>
    {
        entry.SlidingExpiration = TimeSpan.FromSeconds(seconds);
        return await taskFactory.Invoke();
    }).Value);
}
And finally, to call:
IMemoryCache cache;
var result = await cache.GetOrAddAsync("someKey", 60, async () => new object());
While MemoryCache is indeed thread safe as other answers have specified, it does have a common multithreading issue: if two threads try to Get from (or check Contains on) the cache at the same time, then both will miss the cache, both will end up generating the result, and both will then add the result to the cache.
Often this is undesirable; the second thread should wait for the first to complete and use its result rather than generating the result twice.
This was one of the reasons I wrote LazyCache - a friendly wrapper around MemoryCache that solves these sorts of issues. It is also available on NuGet.
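The waiting behaviour described above can also be sketched without a library, e.g. with a per-key SemaphoreSlim guarding the factory. This is a hedged sketch under stated assumptions, not LazyCache's actual implementation; `GetOrAddLockedAsync` is a made-up name:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public static class CacheStampedeExtensions
{
    // One lock per cache key, so unrelated keys don't serialize each other.
    private static readonly ConcurrentDictionary<string, SemaphoreSlim> Locks =
        new ConcurrentDictionary<string, SemaphoreSlim>();

    // Hypothetical helper: double-checked get-or-add so only the first
    // thread runs the expensive factory; later threads reuse its result.
    public static async Task<T> GetOrAddLockedAsync<T>(
        this IMemoryCache cache, string key, TimeSpan expiration, Func<Task<T>> factory)
    {
        if (cache.TryGetValue(key, out T cached))
            return cached;

        var gate = Locks.GetOrAdd(key, _ => new SemaphoreSlim(1, 1));
        await gate.WaitAsync();
        try
        {
            // Re-check: another thread may have populated the key while we waited.
            if (cache.TryGetValue(key, out cached))
                return cached;

            var value = await factory();
            cache.Set(key, value, new MemoryCacheEntryOptions { SlidingExpiration = expiration });
            return value;
        }
        finally
        {
            gate.Release();
        }
    }
}
```

The trade-off versus the Lazy<T> approach above is that this holds an explicit lock across an await, but it works for async factories without needing an AsyncLazy<T> type.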
Check out this link: http://msdn.microsoft.com/en-us/library/system.runtime.caching.memorycache(v=vs.110).aspx
Go to the very bottom of the page (or search for the text "Thread Safety").
You will see:
Thread Safety
This type is thread safe.