
I am learning about caching in ASP.NET Core and found that we use:

  1. In-memory cache: for a single server instance.
  2. Distributed cache: when the application is hosted on multiple servers.

I was wondering whether we should use a distributed cache even for a single-server application when the amount of data to be cached is large. If so, I have a follow-up question: at what data size should we switch from the in-memory cache to a distributed cache like Redis or SQL Server?
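For context, the two options differ mainly in which service gets registered at startup. A minimal sketch of both registrations, assuming the `Microsoft.Extensions.Caching.StackExchangeRedis` NuGet package and a Redis endpoint at `localhost:6379` (both are illustrative assumptions, not part of the question):

```csharp
var builder = WebApplication.CreateBuilder(args);

// Single server: in-process cache, backed by the application's own RAM.
// Exposed to your code as IMemoryCache.
builder.Services.AddMemoryCache();

// Multiple servers: an out-of-process store that all instances share,
// e.g. Redis. Exposed to your code as IDistributedCache.
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = "localhost:6379"; // assumed Redis endpoint
});

var app = builder.Build();
// app.Run(); // start listening (omitted in this sketch)
```

Note that `IMemoryCache` stores live object references in the worker process's heap, while `IDistributedCache` serializes values to bytes and sends them over the network, which is one reason in-memory is faster when a single server suffices.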

To answer the above, I was trying to understand the internal mechanism of how the in-memory cache is stored within the application. Who is responsible for this cache in the .NET Core framework, and how is it handled? Is it the framework itself that manages it? Where exactly is it stored? (In the app pool, using the server's RAM?) I found answers to these nowhere, hence my last stop: Stack Overflow.

What I think the possibilities are:

The in-memory cache grows as needed, without any background check on the RAM available to the application(s), and could crash the process at 100% memory utilisation. Is that correct?

Or is it that whenever the cache size exceeds some limit, old cache items are removed automatically? If so, what is that limit, and what does it depend on?
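For what it's worth, by default `MemoryCache` has no size limit at all: it grows until the process runs out of memory, and nothing is evicted by size unless you opt in. A sketch of opting in, assuming made-up size "units" (with `SizeLimit` set, you must assign every entry a `Size`, and the meaning of one unit is entirely up to you):

```csharp
using Microsoft.Extensions.Caching.Memory;

// Cache capacity of 2 "units" (an arbitrary number chosen for this sketch).
var cache = new MemoryCache(new MemoryCacheOptions { SizeLimit = 2 });

cache.Set("a", "first",  new MemoryCacheEntryOptions { Size = 1 });
cache.Set("b", "second", new MemoryCacheEntryOptions { Size = 1 });

// This entry would push the total size to 3 > SizeLimit, so it is NOT
// cached; instead a background compaction is triggered to make room.
cache.Set("c", "third",  new MemoryCacheEntryOptions { Size = 1 });

// Immediately after the rejected Set, "c" is not in the cache.
bool found = cache.TryGetValue("c", out _);
```

The important (and somewhat surprising) behavior is that an over-limit insert is silently dropped rather than evicting something synchronously, so code that assumes `Set` always succeeds can misbehave under a size limit.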

There might be knowledge gaps in this question; please correct me wherever I am wrong.

  • Of course you can use a distributed cache for your single server instance, but that's unnecessary. If you can use the in-memory cache, that is the best option for performance and minimal infrastructure cost. The memory cache should take care of the maximum size it can handle and auto-evict the lower-priority items from memory. So if it's designed correctly there will be no error at all, but the caching benefit may be reduced by a high rate of eviction. That's when you need to upgrade the hardware. – King King Feb 10 '21 at 20:09
  • @King King Thanks for the quick answer to the 2nd part of the question. You say the memory cache should take care of the max size it can handle, but what is its limit? The Microsoft docs say there is no inherent limit within the cache itself and that it just grows without bound. They also don't mention what the limit depends on (main memory/RAM or something else) or when cache eviction starts. It could be me who is lacking this knowledge, so please let me know if this is a generic computer science concept (how a memory cache is managed) and hence the Microsoft docs didn't cover it. – Mahesh Varma Feb 10 '21 at 22:32
  • Where did you find this: `microsoft doc says there is no inherent limit within itself and it just extends itself endlessly`? That's true up to a point, but in reality it's impossible because the hardware is limited. I think the logic for deciding when an item should be evicted is complicated, especially when the limit is close to the hardware (RAM) capacity. If you implement your own cache, you can use a simple strategy like limiting the count to 10,000 items or the size to 1 GB. Even then an `OutOfMemory` error is possible, although it's hard to hit. – King King Feb 11 '21 at 06:04
  • @King King Here, in the "Use SetSize, Size, and SizeLimit to limit cache size" section: [https://learn.microsoft.com/en-us/aspnet/core/performance/caching/memory?view=aspnetcore-5.0](https://learn.microsoft.com/en-us/aspnet/core/performance/caching/memory?view=aspnetcore-5.0) – Mahesh Varma Feb 11 '21 at 06:34
  • It's a pity that the memory cache in ASP.NET Core is not as ideal and smart as I described. You have to set some options (entry size and cache size limit) to make it auto-evict lower-priority items. That means there may be an out-of-memory exception if you configure the limit wrong and the actual cached memory exceeds the system's available memory. However, I think you can estimate the amount of cached memory beforehand (when testing and running in production) to ensure that such an error almost never occurs. – King King Feb 11 '21 at 09:10
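To illustrate the knobs discussed in the comments above, here is a hedged sketch of entry priorities and compaction with `Microsoft.Extensions.Caching.Memory` (all the numbers are made up for illustration):

```csharp
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions
{
    SizeLimit = 100,            // arbitrary capacity in units you define
    CompactionPercentage = 0.25 // evict ~25% of the size when over limit
});

cache.Set("keep", "expensive to rebuild", new MemoryCacheEntryOptions
{
    Size = 1,
    Priority = CacheItemPriority.NeverRemove // survives compaction
});

cache.Set("droppable", "cheap to rebuild", new MemoryCacheEntryOptions
{
    Size = 1,
    Priority = CacheItemPriority.Low // first in line for eviction
});

// Compaction can also be forced manually; lower-priority and
// least-recently-used entries are considered first.
cache.Compact(0.25);
```

The design trade-off the comment describes follows from this: the framework never inspects actual RAM usage, so the `SizeLimit` you pick is only as safe as your own estimate of how big one "unit" really is in bytes.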

0 Answers