
I wrote the following test (actually used in a wider context):

IntPtr x = Marshal.AllocHGlobal(100000000);

Console.Write("Press any key to continue . . . ");
Console.ReadKey(true);

Marshal.FreeHGlobal(x);

Console.ReadKey(true);

Why doesn't the task manager show any sign of the allocated 100 megabytes before the first key press? If this is by design, how else can I test the consumption of unmanaged heap memory?

oliver
  • Interesting fact, didn't know (nor notice) that during all my years of programming. – oliver Mar 17 '18 at 08:09
  • Interesting question and comment. I know I'm not in a position to suggest what is better, @HansPassant, but I think if you added your comment as an answer it'd be much better. I really didn't know this before! This will help many people. – Rickless Mar 17 '18 at 08:55

1 Answer


You are getting some insight into how your operating system works. This behavior is not specific to Marshal.AllocHGlobal(), try this code for example:

    static void Main(string[] args) {
        var arr = new byte[100000000];
        Console.ReadKey(true);
    }

Task Manager by default shows you how much RAM you use. But your OS, like many others, is a demand-paged virtual memory operating system. You haven't demanded anything yet. All you did was allocate virtual memory. Just address space; it stays virtual in this test, just numbers to the processor, one for each 4096-byte page. The demand doesn't happen until you actually access the array. Add:

        // Touch one byte per 4096-byte page so the OS actually has to back it with RAM.
        for (int ix = 0; ix < arr.Length; ix += 4096) {
            byte dummy = arr[ix];
        }

Bam, now you see it zoom up. Not necessarily all the way to 100 megabytes, since the OS may swap some pages back out to the paging file to free RAM, but most machines today have enough RAM that this won't be necessary.
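The same thing applies to the unmanaged allocation from the question. As a rough sketch (not part of the original answer, and assuming the usual 4096-byte page size), writing one byte per page is enough to make the usage appear in Task Manager:

    IntPtr x = Marshal.AllocHGlobal(100000000);

    // Write one byte per 4096-byte page so each page gets demanded and backed by RAM.
    for (int offset = 0; offset < 100000000; offset += 4096) {
        Marshal.WriteByte(x, offset, 1);
    }

    Console.ReadKey(true);   // the ~100 MB is now visible in the memory column
    Marshal.FreeHGlobal(x);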

Otherwise this is a good reminder of why Task Manager is not a very good memory profiler. How much RAM your program consumes is pretty irrelevant; if you use too much then you'll notice it well enough, your program slows down a lot.

Task Manager can show you the side effect of the original code, but you have to add the "Commit size" column. It was named differently on earlier Windows versions; fuzzy memory, I think it was "VM Size". Commit size measures how much space is reserved in the paging file: backup storage for the address space, reserved so the RAM content can be stored when the OS needs to unmap pages to provide RAM elsewhere. Windows does not permit over-committing memory; this is how you get OOM in a 64-bit program when the OS can't grow the paging file quickly enough.
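If you'd rather watch those numbers from code instead of Task Manager, a minimal sketch along these lines (not from the original answer; it uses System.Diagnostics.Process, where PrivateMemorySize64 roughly tracks the commit charge and WorkingSet64 the RAM in use) should show commit jumping at allocation time while the working set only grows once the pages are touched:

    using System;
    using System.Diagnostics;
    using System.Runtime.InteropServices;

    class CommitVsWorkingSet {
        static void Report(string label) {
            // PrivateMemorySize64 ~ committed private bytes, WorkingSet64 ~ RAM actually in use
            var p = Process.GetCurrentProcess();
            Console.WriteLine("{0}: commit {1:N0} bytes, working set {2:N0} bytes",
                label, p.PrivateMemorySize64, p.WorkingSet64);
        }

        static void Main() {
            Report("before allocation");
            IntPtr x = Marshal.AllocHGlobal(100000000);
            Report("after AllocHGlobal");      // commit jumps, working set barely moves
            for (int offset = 0; offset < 100000000; offset += 4096) {
                Marshal.WriteByte(x, offset, 1);
            }
            Report("after touching pages");    // working set catches up
            Marshal.FreeHGlobal(x);
        }
    }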

Hans Passant
  • Again thanks for the excellent answer. I didn't ever notice this, probably because normally when using C# or even C++ (in debug mode) the allocated memory gets initialized (with zeroes), and I suppose this is among the "demand" kind of things. What also surprised me is that if you write to half the reserved memory, then this fraction is also what shows up in the main memory column of Task Manager. I didn't know that memory management is *that* dynamic (instead of being an all-or-nothing kind of thing). – oliver Mar 17 '18 at 17:07
  • It is not, you get the zero-initialization in C# entirely for free. Evident when you run the code in this post. It is one of the basic OS duties: it must initialize RAM pages to prevent programs from seeing secrets. C++ with the debug heap enabled is different because it doesn't use 0. – Hans Passant Mar 17 '18 at 17:24