
I'm starting to learn about GC and finalization, and I've come across a simple example where the application's behaviour is quite unexpected to me.

(Note: I'm aware that finalizers should only be used with unmanaged resources, and then as part of the dispose pattern; I just want to understand what is going on here.)

This is a simple console app that generates a "saw-tooth" pattern of memory usage. The memory rises to around 90 MB, a GC runs, the memory drops and begins to rise again, never going beyond 90 MB.

    using System;
    using System.Threading;

    class Program
    {
        static void Main(string[] args)
        {
            for (int i = 0; i < 100000; i++)
            {
                MemoryWaster mw = new MemoryWaster(i);
                Thread.Sleep(250);
            }
        }
    }

    public class MemoryWaster
    {
        long l = 0;
        long[] array = new long[1000000]; // ~8 MB per instance

        public MemoryWaster(long l)
        {
            this.l = l;
        }

        //~MemoryWaster()
        //{
        //    Console.WriteLine("Finalizer called.");
        //}
    }

If I uncomment the finalizer, the behaviour is very different: the application does one or two GCs at the start, but then the memory increases linearly until it is using over 1 GB (at which point I terminate the application).

From what I have read, this is because instead of releasing the object, the GC moves it to the finalization queue. A dedicated finalizer thread executes the finalizer methods, and the memory is only reclaimed by a later GC, after the finalizers have run. This can be an issue when the finalizer methods are long-running, but that isn't the case here.
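To check my understanding, I put together this small sketch of the two-step reclamation (the Finalizable class and the payload size are made up for the test, and the exact output can vary between Debug and Release builds):

    using System;

    public class Finalizable
    {
        // ~100 KB payload so the reclaimed memory is noticeable
        byte[] payload = new byte[100_000];

        ~Finalizable() { }
    }

    public class FinalizationDemo
    {
        public static void Main()
        {
            WeakReference weak = Allocate();

            GC.Collect();                  // 1st GC: the object is only queued for finalization
            Console.WriteLine($"After 1st GC, alive: {weak.IsAlive}");  // typically True

            GC.WaitForPendingFinalizers(); // let the finalizer thread drain the queue
            GC.Collect();                  // 2nd GC: the memory can now be reclaimed
            Console.WriteLine($"After 2nd GC, alive: {weak.IsAlive}");  // typically False
        }

        static WeakReference Allocate()
        {
            // trackResurrection: true keeps the weak reference valid while the
            // object is still waiting for its finalizer to run
            return new WeakReference(new Finalizable(), trackResurrection: true);
        }
    }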

If I manually call GC.Collect() every few iterations, the app behaves as expected and I see the saw-tooth pattern of the memory getting released.
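For reference, this is roughly the change I mean (the interval of 10 iterations is arbitrary):

    for (int i = 0; i < 100000; i++)
    {
        MemoryWaster mw = new MemoryWaster(i);
        Thread.Sleep(250);

        if (i % 10 == 0)
        {
            // each forced collection also reclaims the instances that were
            // finalized after the previous forced collection
            GC.Collect();
        }
    }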

My question is: why doesn't the large amount of memory being used by the application trigger a GC automatically? In the example with the finalizer included, would the GC ever run again after the first time?

Aleph
  • The GC is triggered if there's pressure to do so. You can't determine when this will be the case. – MakePeaceGreatAgain Nov 18 '19 at 19:19
  • Depending on your machine, 1 GB may not be that much. – TaW Nov 18 '19 at 19:34
  • Adding an object to the finalization queue means that the object has survived and is promoted to the next generation (from gen 0 to gen 1) – Pavel Anikhouski Nov 18 '19 at 19:48
  • Without the finalizer there was pressure to trigger GC when the application reached 90MB of memory, so why does adding the finalizer allow the application to use over 10 times the memory that was the threshold for GC before? – Aleph Nov 18 '19 at 22:34
  • It's not the goal of the GC to save memory. The goal is to provide you with memory when you need it. So it will run if you request (allocate) a big amount of memory, not when you free memory. But it also runs if the application is idle and a certain number of objects are marked for deletion. It's unpredictable. – Holger Nov 18 '19 at 22:51

1 Answer


Do not rely on finalizers. They are a safety net that you should never have to reach, not the first option. If the finalizers have to clean up after you, you have already messed up badly.

I have two basic rules regarding disposables that have always worked for me:

  • Never split up the creation and disposal of an instance. Create, use, dispose, all in the same piece of code, ideally with a using block.
  • If you cannot do the first thing, for example when your class wraps something that implements IDisposable, then your class implements IDisposable for the sole purpose of relaying the Dispose call (see the sketch below).
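A quick sketch of both rules (the file names and the LogWriter class are just placeholders):

    using System;
    using System.IO;

    // Rule 1: create, use and dispose in the same piece of code.
    using (var reader = new StreamReader("data.txt"))
    {
        Console.WriteLine(reader.ReadToEnd());
    }

    // Rule 2: a class that wraps a disposable implements IDisposable
    // for the sole purpose of relaying the Dispose call.
    public class LogWriter : IDisposable
    {
        private readonly StreamWriter writer = new StreamWriter("log.txt");

        public void Write(string message) => writer.WriteLine(message);

        public void Dispose() => writer.Dispose();
    }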

As for the GC:

While the GC runs, all other threads have to be paused. This is an absolute rule. As a result, the GC is quite lazy and tries to avoid running. Indeed, if it only runs once, at application shutdown, that is the ideal case.
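If you want to see how rarely it actually runs, print the per-generation collection counts from inside the allocation loop of your question (a quick sketch; i is the loop counter from your Main):

    if (i % 20 == 0)
    {
        Console.WriteLine(
            "Gen0: {0}  Gen1: {1}  Gen2: {2}  Heap: {3:N0} bytes",
            GC.CollectionCount(0),     // how often each generation
            GC.CollectionCount(1),     // has been collected so far
            GC.CollectionCount(2),
            GC.GetTotalMemory(false)); // current managed heap size, without forcing a GC
    }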

Christopher