I know there's a maximum memory limit in Azure Caching, but is there a maximum object count as well? It feels like the cache would get slower as the number of keys increases.
Background:
I need to keep some numbers in memory for each user (summaries that are expensive to calculate from the db but cheap to increment in memory on the fly). As the number of concurrent users grows, I'm worried I might outgrow the cache if there's a limit.
My intended solution:
Let's say I have to keep the Int64 values 'value1' and 'value2' in memory for each user.
Cache items as userN_value1, userN_value2, [...]
and call DataCache.Increment to update each counter when it changes, like this:
DataCache.Increment("user1_value1", 2500, 0, "someregion");
As the number of users grows, this may result in a lot of items. Is this something I should worry about? Is there a better approach I haven't thought of?