
This question is not really related to any specific code or even language.

If you allocate a huge amount of memory (exceeding physical memory) on Windows, it causes the entire operating system to become fully unresponsive - including the mouse cursor, which typically could still move even when the rest of the system had crashed.

The Working Set API does not seem to solve the problem - it appears that all applications start with a rather low default maximum working set size.
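For reference, this is roughly the kind of call I experimented with (a minimal sketch only; the 256 MiB / 4 GiB values are illustrative, not my real numbers):

```cpp
// Sketch: query and try to raise the process working set limits.
#include <windows.h>
#include <cstdio>

int main()
{
    HANDLE proc = GetCurrentProcess();
    SIZE_T minWs = 0, maxWs = 0;

    if (GetProcessWorkingSetSize(proc, &minWs, &maxWs))
        std::printf("default working set: min=%zu max=%zu\n", minWs, maxWs);

    // Ask for larger limits; by default these are soft limits that the
    // memory manager may ignore under memory pressure.
    SIZE_T newMin = 256ull * 1024 * 1024;   // 256 MiB (illustrative)
    SIZE_T newMax = 4096ull * 1024 * 1024;  // 4 GiB (illustrative)
    if (!SetProcessWorkingSetSize(proc, newMin, newMax))
        std::printf("SetProcessWorkingSetSize failed: %lu\n", GetLastError());

    return 0;
}
```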

I hoped memory-mapped files (via the Boost API) would help the OS make better decisions about page loading/unloading - but again, even a single pass through the large data freezes the system.
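The access pattern I mean is roughly the following (a minimal sketch assuming Boost.Interprocess; "huge_data.bin" is a placeholder file name, not my actual code):

```cpp
// Sketch: one sequential pass over a large memory-mapped file.
#include <boost/interprocess/file_mapping.hpp>
#include <boost/interprocess/mapped_region.hpp>
#include <cstdint>
#include <cstdio>

int main()
{
    using namespace boost::interprocess;

    file_mapping file("huge_data.bin", read_only);
    mapped_region region(file, read_only);  // map the whole file

    const auto* data = static_cast<const std::uint8_t*>(region.get_address());
    std::size_t size = region.get_size();

    // Touching every page forces the OS to page the file in,
    // evicting other pages (including those of other processes).
    std::uint64_t sum = 0;
    for (std::size_t i = 0; i < size; ++i)
        sum += data[i];

    std::printf("checksum: %llu over %zu bytes\n",
                static_cast<unsigned long long>(sum), size);

    return 0;
}
```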

Are there any magic WinAPI calls or other good programming practices (other than manually managing all committed memory and manually caching data in files) that would keep the operating system and other applications reasonably stable while using such a huge amount of data?

Noxitu
  • Good programming practice would be not to store everything in memory at once. Look at streaming algorithms. – BlackBear Feb 12 '19 at 15:04
  • I don't consider this a viable answer. For the purpose of this question, assume that all data I keep in memory will be used again in the future. Either way it must be dumped to the drive and read again - no matter whether manually or by the OS into the pagefile (from the point of view of the algorithm this is still simply having the data in memory, just a bigger and slower one). The question is whether it is possible to hint the OS so that it does this in an "acceptable" way, or whether it is just bad at this job and nothing can be done about it. – Noxitu Feb 12 '19 at 16:24

0 Answers