I'm running a load test on a .NET web application over a LAN. The server hosting the web app is a Windows Server 2008 R2 VM with 2 GB of RAM and a 3 GB limit for virtual memory. No other web applications are running on it.
The test runs for 1 hour and 40 minutes and increases load every 10 minutes.
I record memory usage with perfmon during the load test. Memory starts at 1.5 GB and climbs steadily until it hits a limit of 3.37 GB after 1 hour and 20 minutes, then drops back to 1.5 GB.
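I assume the counter perfmon is reading here is Memory\Committed Bytes, since the value exceeds the 2 GB of physical RAM. Recording the same thing programmatically would look roughly like this sketch:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class CommitLogger
{
    static void Main()
    {
        // Assumption: the relevant counter is Memory\Committed Bytes
        // (RAM plus page-file-backed commit). Swap in "Process" / "Private Bytes"
        // with the w3wp instance name to watch just the worker process.
        using (var counter = new PerformanceCounter("Memory", "Committed Bytes"))
        {
            while (true)
            {
                float gb = counter.NextValue() / (1024f * 1024f * 1024f);
                Console.WriteLine("{0:HH:mm:ss}  {1:F2} GB committed", DateTime.Now, gb);
                Thread.Sleep(15000); // sample every 15 seconds
            }
        }
    }
}
```

Committed Bytes counts page-file-backed commit as well as RAM, which would explain why the reading can exceed the 2 GB of physical memory.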
I don't understand why this is happening. Is Windows memory management doing this, and if so, why?