We've recently switched our in-memory cache out for Azure Redis cache. The switch itself was simple enough, since the API is the same. All well and good, except that latency is now much higher. I suspect it has something to do with improper caching techniques on our part, but since this wasn't a problem with the in-memory cache, I'm hoping it can be fixed with a Redis configuration change.
Our cache structure mirrors the database schema, with one key per record. So, for example, fetching 100 users means querying the cache 100 times, one round trip per record. Since Redis is said to prefer working with small values, that seemed like the way to go.
I've read about pipelining and multiplexing. Is that something that might fix this problem? (There's a rough sketch of what I have in mind at the end of the post.)
Example of how we do it:
services.AddDistributedRedisCache(action =>
{
    action.Configuration = "...";
    action.InstanceName = "...";
});
This then gets called in each loop iteration:
byte[] cacheData = await Cache.GetAsync(key);
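For context, the loop looks roughly like this (simplified; the "user:{id}" key format and the class scaffolding here are placeholders, not our real code):

using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Distributed;

public class UserCacheReader
{
    // Injected IDistributedCache, the same "Cache" as above.
    private readonly IDistributedCache Cache;

    public UserCacheReader(IDistributedCache cache) => Cache = cache;

    public async Task<List<byte[]>> LoadUsersAsync(IEnumerable<int> userIds)
    {
        var results = new List<byte[]>();
        foreach (var id in userIds)
        {
            // Each iteration awaits its own round trip to Redis.
            byte[] cacheData = await Cache.GetAsync($"user:{id}");
            results.Add(cacheData);
        }
        return results;
    }
}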
Note: The Azure Redis cache is located in the same region as our web server.
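If pipelining is the answer, is this roughly what it would look like with IDistributedCache? My understanding is that the underlying StackExchange.Redis connection multiplexes concurrent commands, so starting all the GETs first and awaiting them together should let the requests overlap instead of paying one round trip each. (Again, the key format and method name are placeholders; this would sit in the same class as the loop above, with using System.Linq added.)

// Cache is the same injected IDistributedCache field as in the loop above.
public async Task<byte[][]> LoadUsersPipelinedAsync(IEnumerable<int> userIds)
{
    // Start every GET without awaiting it yet...
    var pending = userIds
        .Select(id => Cache.GetAsync($"user:{id}"))
        .ToArray();

    // ...then await them all at once, so the commands can overlap on the
    // shared connection instead of running strictly one after another.
    return await Task.WhenAll(pending);
}

Is that the kind of pipelining the docs are referring to, or am I misunderstanding how it's supposed to be used?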