
Can anyone tell me whether the behavior shown in the Visual Studio memory dump of our w3wp process is normal? For instance, does StackExchange.Redis.PhysicalConnection normally run that high on inclusive size (bytes), or is that really high?

Basically, we are experiencing slowness on our web head after converting our code from ASP.NET Session to Azure Redis (we now serialize and deserialize objects as needed and store them in the Redis cache), and overall performance is horrible.

Requests complete, but they can take a while. Is that due to the single-threaded nature of Redis? We are using the configuration the Azure Redis team outlines as best practice here: https://stackoverflow.com/a/28821220
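For reference, our connection setup looks roughly like this (a sketch based on our reading of that answer; the host name, key, and option values are placeholders):

    using StackExchange.Redis;

    // Sketch of the connection options; host and access key are placeholders.
    var options = new ConfigurationOptions
    {
        EndPoints = { "our-cache.redis.cache.windows.net:6380" }, // placeholder host
        Password = "access-key",                                  // placeholder key
        Ssl = true,
        AbortOnConnectFail = false // retry in the background rather than throwing at startup
    };
    var muxer = ConnectionMultiplexer.Connect(options);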

What else can we look at to increase performance? The current performance is not acceptable as a replacement for the session-based implementation (ASP.NET WebForms / SQL Server / Azure IaaS) we have today.

PS: Serialization and deserialization do cause a hit; we understand that IIS spoiled us with its own in-memory store for non-serialized DataSets and the like. But there is no way it should cause the 300-500% increase in page load times we are seeing now.

Thoughts appreciated!

Edit, in response to @Tim Wieman's questions:

How large are your cached objects? They range in size; some of the objects stored in Redis are DataSets.

What type of objects are they? Most are custom objects with a variable number of properties, and some contain collections.

What serializer are you using? We use Newtonsoft.Json for anything that doesn't require RowState, and the binary serializer for the DataSets that do need RowState.

All serialization, and subsequent deserialization, is done in our code before calling the Redis database's StringGet or StringSet.
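Roughly, each round trip looks like the following (a sketch; CustomerProfile is a hypothetical stand-in for one of our custom objects, and the connection string is a placeholder):

    using Newtonsoft.Json;
    using StackExchange.Redis;

    // Placeholder connection string; see the best-practice answer linked above.
    var muxer = ConnectionMultiplexer.Connect(
        "our-cache.redis.cache.windows.net:6380,password=...,ssl=True,abortConnect=False");
    IDatabase cache = muxer.GetDatabase();

    // Set: serialize in code, then store the JSON string.
    var profile = new CustomerProfile { Id = 42, Name = "example" };
    cache.StringSet("profile:42", JsonConvert.SerializeObject(profile));

    // Get: fetch the string, then deserialize in code.
    string json = cache.StringGet("profile:42");
    CustomerProfile restored = json == null
        ? null
        : JsonConvert.DeserializeObject<CustomerProfile>(json);

    // Hypothetical example type standing in for our custom objects.
    public class CustomerProfile
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }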

  • I do not know about the memory overhead, but I do know that serialization/deserialization can cause a LOT of overhead (comparatively). I've seen it easily cause 10x perf. degradation just due to [de]serialization. How large are your cached objects? What type of objects are they? What serializer are you using? – Tim Wieman Mar 14 '16 at 21:17
  • You might want to do some timing on your serialization/deserialization, especially on large objects and when using the "binary serializer". If that means you are using the .net BinaryFormatter, well, it can be horribly slow especially with large nested lists. Many people have good luck with protobuf.net for both speed and size of serialized data. – Tim Wieman Mar 16 '16 at 06:50
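A quick way to do the timing suggested in the last comment, isolating serialization cost from Redis latency, would be something like this sketch (the payload object is a made-up stand-in for a large cached item):

    using System;
    using System.Diagnostics;
    using Newtonsoft.Json;

    // Time serialize/deserialize in isolation so Redis latency isn't in the measurement.
    // The payload here is a made-up object standing in for a large cached item.
    var payload = new { Name = "example", Values = new int[100000] };

    var sw = Stopwatch.StartNew();
    string json = JsonConvert.SerializeObject(payload);
    sw.Stop();
    Console.WriteLine($"Serialize:   {sw.ElapsedMilliseconds} ms ({json.Length:N0} chars)");

    sw.Restart();
    var roundTripped = JsonConvert.DeserializeObject(json);
    sw.Stop();
    Console.WriteLine($"Deserialize: {sw.ElapsedMilliseconds} ms");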

1 Answer


It appears the memory usage was in fact extremely high: we were erroneously creating thousands of connections to Redis instead of using a single shared ConnectionMultiplexer instance.

The stale connections were not getting cleaned up by the GC before the CPU reached 98% and the server became unresponsive.

We adjusted our code to ensure a single connection instance to Azure Redis is used for all Redis calls, and we have tested it thoroughly.
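The fix boils down to the standard lazy-singleton pattern for ConnectionMultiplexer; roughly (the connection string is a placeholder):

    using System;
    using StackExchange.Redis;

    public static class RedisConnection
    {
        // One ConnectionMultiplexer shared by the whole app; it is thread-safe
        // and designed to be reused, not created per request.
        private static readonly Lazy<ConnectionMultiplexer> LazyConnection =
            new Lazy<ConnectionMultiplexer>(() => ConnectionMultiplexer.Connect(
                "our-cache.redis.cache.windows.net:6380,password=...,ssl=True,abortConnect=False"));

        public static ConnectionMultiplexer Connection => LazyConnection.Value;
    }

    // Usage: RedisConnection.Connection.GetDatabase().StringGet("some-key");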

The issue appears to be resolved: Azure Redis is no longer eating up memory or CPU resources.
