We have an application that can allocate a very large number of small objects (depending on user input). Sometimes the application runs out of memory and effectively crashes.
However, if we knew that memory was becoming tight, there are some lower-priority objects we could destroy, allowing us to degrade the results gracefully instead of failing outright.
What's the best way to detect that memory for a process is running low before calls to `new` actually fail? We could call API functions like `GetProcessWorkingSetSize()` or `GetProcessMemoryInfo()`, but how do we know when a given machine's limits are being approached (e.g. when 80% of the memory available to the process has been allocated)?
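For illustration, a polling check along these lines is roughly what we have in mind. It's only a sketch: `IsMemoryLow` and the 80% threshold are placeholders of our own, and it combines `GlobalMemoryStatusEx` (virtual address space and commit charge) with `GetProcessMemoryInfo` (working set), which may or may not be the right signals to watch:

```cpp
// Sketch: estimate how close the process is to exhausting memory
// before 'new' fails. IsMemoryLow and the 0.8 threshold are our
// own placeholders, not an established recipe.
#include <windows.h>
#include <psapi.h>   // GetProcessMemoryInfo; link against psapi.lib
#include <cstdio>

// Returns true when the process's virtual address space or the
// commit limit looks close to exhaustion.
bool IsMemoryLow(double threshold = 0.8)
{
    MEMORYSTATUSEX status = {};
    status.dwLength = sizeof(status);
    if (!GlobalMemoryStatusEx(&status))
        return false;  // treat a failed query as "unknown", not "low"

    // Fraction of this process's user-mode virtual address space
    // already reserved/committed. For a 32-bit process this is often
    // the first limit hit, regardless of physical RAM.
    double vaUsed = 1.0 - static_cast<double>(status.ullAvailVirtual) /
                          static_cast<double>(status.ullTotalVirtual);

    // Fraction of the commit limit in use; when this is exhausted,
    // allocations fail even with address space to spare.
    double commitUsed = 1.0 - static_cast<double>(status.ullAvailPageFile) /
                              static_cast<double>(status.ullTotalPageFile);

    return vaUsed >= threshold || commitUsed >= threshold;
}

int main()
{
    PROCESS_MEMORY_COUNTERS pmc = {};
    if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc)))
        std::printf("Working set: %zu bytes\n",
                    static_cast<size_t>(pmc.WorkingSetSize));

    if (IsMemoryLow())
        std::printf("Memory tight: release low-priority objects.\n");
    return 0;
}
```

The open question is what to compare against: a fixed fraction like 80% seems arbitrary, and the relevant ceiling (physical RAM, page file, address space, job limits) presumably varies from machine to machine.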