You could cache it on the first call with a TTL, let it expire, and have the next call fetch the data and cache it again. The problem with this is that the calling thread is slowed down while it fetches the now-missing data, and every other thread waits on it as well (assuming you lock the read to prevent a stampede of identical fetches).
One way to get around the first-load issue is to prime your cache on application start-up. This ensures that by the time your application is ready to be used, the data is already loaded and requests are fast. Create a quick interface like ICachePrimer { void Prime(); }, scan your assemblies for implementations, resolve them, then run them.
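A minimal sketch of that start-up scan, assuming plain reflection; the CachePrimerRunner class name is mine, and in a real application you would resolve the primers through your IoC container instead of Activator:

```csharp
using System;
using System.Linq;

// ICachePrimer is the interface from the article; implementations load
// their data into the cache so the first real request is already warm.
public interface ICachePrimer
{
    void Prime();
}

public static class CachePrimerRunner
{
    // Call once from start-up code (e.g. Global.asax or Program.cs).
    public static void PrimeAllCaches()
    {
        var primerTypes = AppDomain.CurrentDomain.GetAssemblies()
            .SelectMany(a => a.GetTypes())
            .Where(t => typeof(ICachePrimer).IsAssignableFrom(t)
                        && t.IsClass && !t.IsAbstract);

        foreach (var type in primerTypes)
        {
            // Swap Activator for your container's Resolve if you use one.
            var primer = (ICachePrimer)Activator.CreateInstance(type);
            primer.Prime();
        }
    }
}
```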
The way I like to get around the empty-cache-on-invalidation issue is to refresh the data before it is removed. In .NET, an easy way to do this is with the MemoryCache's CacheItemPolicy callbacks.
- UpdateCallback fires before the item is removed, and allows you to refresh the item in place.
- RemovedCallback fires after the item has been removed.
In the example below, my CachedRepository refreshes the cached item when it is invalidated. Other threads continue to receive the "old" value until the refresh completes.
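A minimal sketch of such a repository, not the exact code; the CachedCustomerRepository name, the cache key, and GetCustomersFromDb are illustrative stand-ins. The key move is that UpdateCallback re-inserts a fresh item before the old one is removed:

```csharp
using System;
using System.Runtime.Caching;

public class CachedCustomerRepository
{
    private readonly MemoryCache _cache = MemoryCache.Default;
    private const string Key = "customers";

    public object GetCustomers()
    {
        var cached = _cache.Get(Key);
        if (cached != null)
            return cached;

        var fresh = GetCustomersFromDb();
        _cache.Set(Key, fresh, BuildPolicy());
        return fresh;
    }

    private CacheItemPolicy BuildPolicy()
    {
        return new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5),
            // Fires before removal: supply a fresh item and a new policy,
            // so readers keep getting the old value until this completes.
            UpdateCallback = args =>
            {
                args.UpdatedCacheItem = new CacheItem(args.Key, GetCustomersFromDb());
                args.UpdatedCacheItemPolicy = BuildPolicy();
            }
        };
    }

    private object GetCustomersFromDb()
    {
        return new object(); // stand-in for the real database query
    }
}
```

Note that leaving args.UpdatedCacheItem as null in the callback lets the removal proceed normally, so you can still evict for real when the refresh fails.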
You could also use these callbacks to trigger a domain event, publish to Redis, or push to the web browser over a websocket via SignalR or Socket.IO. A JavaScript object that stays synchronized through server-side cache invalidation, nice!
Nice article. How could we do the same in .NET Core? There is no ObjectCache, and MemoryCache has no UpdateCallback. The closest equivalent, RegisterPostEvictionCallback, fires after the entry has expired, and by that time the cache is already empty.
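A rough sketch of one common workaround for the situation this comment describes, assuming Microsoft.Extensions.Caching.Memory: since the post-eviction callback runs after removal, you immediately re-insert a fresh value instead of updating in place. Unlike UpdateCallback, this leaves a brief window where the key is missing. The RefreshingCache class and loadFresh delegate are illustrative names:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

public class RefreshingCache
{
    private readonly IMemoryCache _cache =
        new MemoryCache(new MemoryCacheOptions());

    public void Set(string key, Func<object> loadFresh, TimeSpan ttl)
    {
        var options = new MemoryCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = ttl
        };
        options.RegisterPostEvictionCallback((k, value, reason, state) =>
        {
            // Only refresh on expiration; respect explicit Remove calls.
            if (reason == EvictionReason.Expired)
                Set((string)k, loadFresh, ttl);
        });
        _cache.Set(key, loadFresh(), options);
    }

    public object Get(string key)
    {
        return _cache.Get(key);
    }
}
```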