
We are running a web application that handles very large sets of data, so we initially used the in-process MemoryCache.

However, we discovered that MemoryCache could not handle this amount of data and evicted previously cached items. As a result, the performance of the web app deteriorated, so we are now looking at Azure Cache for Redis.

After some digging around, we experimented with StackExchange.Redis (v2.2.50). We are using IDatabase.StringSet() to set our data and IDatabase.StringGet() to retrieve it. The problem is that, since we need to store some complex types, we use Newtonsoft.Json to serialize and deserialize the objects, which significantly affects performance.
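Roughly, our caching code looks like the sketch below. The ReportData type, the connection string, and the one-hour expiry are placeholders for illustration only, not our actual code:

    using System;
    using System.Collections.Generic;
    using Newtonsoft.Json;
    using StackExchange.Redis;

    public class ReportData                     // stand-in for one of our complex types
    {
        public string Name { get; set; }
        public List<double> Values { get; set; }
    }

    public class RedisCache
    {
        private static readonly ConnectionMultiplexer Connection =
            ConnectionMultiplexer.Connect("<cache-name>.redis.cache.windows.net:6380,password=<key>,ssl=True");

        private static IDatabase Db => Connection.GetDatabase();

        public void Set(string key, ReportData data)
        {
            // Serializing a ~48,000,000 character object dominates the cost of this call.
            string json = JsonConvert.SerializeObject(data);
            Db.StringSet(key, json, TimeSpan.FromHours(1));
        }

        public ReportData Get(string key)
        {
            RedisValue json = Db.StringGet(key);
            return json.HasValue ? JsonConvert.DeserializeObject<ReportData>(json) : null;
        }
    }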

For reference, some of the objects we cache are about 48,000,000 characters long when serialized.

What do you suggest?

Nick Beis
1 Answer


It is always preferable to use small key/value sizes. First, check whether you can optimize or redesign the application to use smaller values. If application changes are not an option, consider other ways to get better performance. The general best-practices guidance is (a compression sketch follows the list):

  • Optimize your application for a large number of small values, rather than a few large values.
  • Increase the size of your VM to get higher bandwidth capabilities
  • Increase the number of connection objects your application uses.
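If the payload itself cannot be redesigned to be smaller, one common mitigation is to compress the serialized JSON before writing it to the cache, which reduces both the value size and the bytes sent over the network. This is a sketch of that idea, not a definitive fix; the class and method names are illustrative:

    using System.IO;
    using System.IO.Compression;
    using System.Text;
    using Newtonsoft.Json;
    using StackExchange.Redis;

    public static class CompressedCache
    {
        public static void SetCompressed(IDatabase db, string key, object value)
        {
            byte[] raw = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(value));
            using (var output = new MemoryStream())
            {
                // GZip the JSON; large, repetitive JSON usually shrinks considerably.
                using (var gzip = new GZipStream(output, CompressionLevel.Fastest, leaveOpen: true))
                {
                    gzip.Write(raw, 0, raw.Length);
                }
                db.StringSet(key, output.ToArray());
            }
        }

        public static T GetCompressed<T>(IDatabase db, string key)
        {
            byte[] compressed = db.StringGet(key);
            if (compressed == null) return default(T);

            using (var input = new MemoryStream(compressed))
            using (var gzip = new GZipStream(input, CompressionMode.Decompress))
            using (var reader = new StreamReader(gzip, Encoding.UTF8))
            {
                return JsonConvert.DeserializeObject<T>(reader.ReadToEnd());
            }
        }
    }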

Hopefully you are not facing timeout issues due to the larger values, but if you are, you might need to increase the connection and operation timeouts accordingly.
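For example, assuming you connect through ConfigurationOptions, the timeouts could be raised like this (the 15-second values are arbitrary and should be tuned to your workload):

    using StackExchange.Redis;

    var options = ConfigurationOptions.Parse("<cache-name>.redis.cache.windows.net:6380,password=<key>,ssl=True");
    options.ConnectTimeout = 15000;  // ms allowed for establishing the connection
    options.SyncTimeout = 15000;     // ms allowed for synchronous operations such as StringGet/StringSet

    var connection = ConnectionMultiplexer.Connect(options);
    IDatabase db = connection.GetDatabase();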