I have an application where we store DataSets in HttpRuntime.Cache for Select operations only (~20 MB of data altogether if the DataSets are serialized as XML).
We migrated to a new Windows Server 2008 x64 machine with IIS site-based output caching enabled. Our application pool runs in x64 mode as well. At first everything went well: the application ran under serious load at about 2 GB of memory usage.
Then, with no changes to the application and no additional traffic, it started to eat up all the memory on the server, which is currently 7 GB.
I've tried disabling output caching, but it didn't help at all. The only way to deal with the problem is to recycle the application pool, but that costs us all in-process user sessions.
As I understand it, HttpRuntime.Cache items are global to the application pool. We only insert items into the Cache on application load, so there should be no changes whatsoever to the cached items afterwards.
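For reference, the insert pattern looks roughly like this (the key name, the loader method, and the chosen priority are illustrative, not our exact code):

```csharp
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

public class Global : HttpApplication
{
    // Runs once per application pool start.
    protected void Application_Start(object sender, EventArgs e)
    {
        DataSet products = LoadProductsDataSet(); // hypothetical loader

        // Insert with no expiration; the item should stay in memory
        // until the pool recycles or the runtime evicts it under pressure.
        HttpRuntime.Cache.Insert(
            "Products",                     // cache key (illustrative)
            products,                       // part of the ~20 MB of DataSets
            null,                           // no CacheDependency
            Cache.NoAbsoluteExpiration,     // never expire by wall clock
            Cache.NoSlidingExpiration,      // never expire by inactivity
            CacheItemPriority.Default,      // eviction priority (assumed)
            null);                          // no removal callback
    }

    private DataSet LoadProductsDataSet()
    {
        // Placeholder for the real Select-only data load.
        return new DataSet("Products");
    }
}
```

Note that even with `NoAbsoluteExpiration`/`NoSlidingExpiration`, items at `CacheItemPriority.Default` can still be evicted when the runtime is under memory pressure, so cached DataSets alone shouldn't pin memory indefinitely.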
With output caching disabled and no further inserts into the Cache, what could cause this unexpected memory growth?