I'd like to describe a strange issue I've noticed while analyzing my ASP.NET application in production, and ask for some advice or opinions on the matter.
The application usually runs with a memory footprint of about 80-90 MB. This seems stable; no memory leaks have been detected so far, and there is no gradual increase in memory usage over time. The problem occurs when the application pool recycles (I'm on shared hosting, and judging by the logs a recycle happens either after the app has been idle for 20 minutes or roughly every 30 hours).

On recycle, memory usage almost doubles for a while, climbing to around 160-170 MB with no apparent explanation. This is confusing, since the common claim is that recycling purges memory and other resources - at least that's how I understand it. The system holds this amount of memory for some 7-8 hours, then usage drops back to its usual level of 90-100 MB, again for no reason apparent to me. Throughout all of this the application seems to work fine - no significant delays or problems with site availability; to the users everything looks OK, and there have been no complaints so far. Looking at the memory consumption graph over time, it looks almost like a step function.
The important thing is that I haven't been able to reproduce this behavior in my test environment. Occasionally I get notices from the hosting provider's administrators that my app is using more resources than allowed, and that really bugs me.
So, what I'd like to know is: is there any scenario in which an application pool recycle does not release all memory? Is there any advice or guideline on what I should focus on? I'm not an expert in this area, but I've been reading about things like overlapping recycling, serialization problems during recycle, and a couple of other issues... Any ideas? Similar experiences?
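In the meantime, I'm thinking of adding some logging in Global.asax around the recycle, to see whether two worker processes overlap for a while (the PID would differ between the old and new process) and how much of the footprint is actually managed heap. This is just a minimal sketch of what I have in mind - it assumes App_Data is writable on the shared host, and the log file name is only a placeholder:

    // Global.asax.cs - sketch: log process/managed memory at app start and end
    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Web;
    using System.Web.Hosting;

    public class Global : HttpApplication
    {
        protected void Application_Start(object sender, EventArgs e)
        {
            LogMemory("Application_Start");
        }

        protected void Application_End(object sender, EventArgs e)
        {
            LogMemory("Application_End");
        }

        private static void LogMemory(string stage)
        {
            using (var proc = Process.GetCurrentProcess())
            {
                // PID changes across recycles; two live PIDs close together would
                // suggest an overlapping recycle (old and new worker process coexisting).
                var line = string.Format(
                    "{0:u} {1} pid={2} workingSet={3:N0} managedHeap={4:N0}",
                    DateTime.UtcNow, stage, proc.Id,
                    proc.WorkingSet64, GC.GetTotalMemory(false));

                // "recycle-log.txt" is just an example name; App_Data is assumed writable.
                var path = HostingEnvironment.MapPath("~/App_Data/recycle-log.txt");
                File.AppendAllText(path, line + Environment.NewLine);
            }
        }
    }

If the working set is much larger than the managed heap, or if two PIDs show up around the recycle time, that would at least narrow down where the doubled memory is coming from.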
Thanks