0

I haven't been programming for very long and am new to C/C++. I've always used C# in the past, but I've switched to native code to write my first Win32 API application. I started in C but became increasingly maddened by trying to work without classes.

I've started porting some of the C to C++. Initially, I had the project set up as VC++ using .NET 4. About 25% of the way through porting, I ran the program in debug mode. I was shocked. According to Task Manager, the program uses more than twice the amount of memory as the C version. I then de-.NETified the project and was even more surprised: using only the STL, my application needed 33% of the memory required for the .NET incarnation. Here are the numbers:

Memory Usage
  Complete App in C:          940 KB
  25% Port in VC++/.NET 4:    2.1 MB
  25% Port in C++:            700 KB

If I put the other features of .NET aside, I'm left wondering what the advantage of managed memory is if it trebles the memory footprint. Is it the preemption of leaks? Safer pointers?

Thanks

0x1mason
  • you can't assume that it triples anything based on a single data point. You could just as easily argue that "why not use .NET, if it gives you access to so much extra functionality, and only takes 1400KB extra memory". You don't know if it is constant overhead, or if it scales so it always uses 3 times as much memory regardless of the size of the app. – jalf Jan 14 '11 at 16:08
  • Fair enough, but I do know it trebles something: the amount of memory used by the application in its current state when implementing .NET vs the amount it uses when it's entirely written in C. Or at least that amount as reported by Task Manager. Whether or not it will always increase memory usage by a factor of three isn't my question. Moreover, I expressly bracketed out the other .NET features. I take your point though because, e.g., there may be a point at which the managed application uses less memory than the unmanaged one or that it becomes proportionately insignificant. – 0x1mason Jan 14 '11 at 18:47
  • Don't worry about this. What you're seeing is the runtime. The standard C/C++ runtime is very small and barely does anything (it doesn't even have a string trim...). The .NET runtime is much larger. It's hard to measure, and I'm not sure how to measure it accurately. What I do is ignore this and work in .NET (I find .NET has more libraries and is generally more useful), and only drop to C++ when something is performance critical. By performance critical I mean it takes more than 5 minutes to do something, like crunching numbers or 'compiling' data. (I wrote a custom allocator for this project.) –  Apr 22 '11 at 05:07

6 Answers

3

Some of the memory footprint you are seeing belongs to the .NET Framework's runtime itself.

Garbage collection helps eliminate a lot of programming overhead, such as remembering to deallocate memory or release objects. I've programmed a lot in C, Objective-C and C#, and I can certainly say that while garbage collection doesn't solve every issue, it does remove a lot of responsibility from me and lets me concentrate on logic rather than hunting for a missing release call.

I also think that the allocation mechanism for new objects is very fast, because the managed heap always allocates from its end (a simple pointer bump) rather than searching for a free block that is large enough.

I should say that while garbage collection is great, it's not perfect and has its own limitations in .NET, especially when dealing with unmanaged resources: not calling Dispose can leave memory lingering around until finalization. Collection cycles can also be costly, as activity is suspended for the duration of the garbage collection.

As with any technology, understand the benefits and pitfalls and choose appropriately.

Tomas McGuinness
  • Thanks, that's very helpful. In this case, I was planning on leaving my libraries unmanaged. In that case, is it better to leave the exe unmanaged as well? Also, I realize that I don't really understand what's happening at the machine level regarding GC. Is there a succinct and accessible introduction that anyone can recommend? – 0x1mason Jan 14 '11 at 15:29
  • @Erasmus777: http://stackoverflow.com/questions/1318631/learning-garbage-collection-theory – Francesco Jan 14 '11 at 15:32
  • If you need a small footprint and tight memory control, pure C++ or C would be the way to go. – Tomas McGuinness Jan 14 '11 at 15:33
3

Everyone thinks about garbage collection the wrong way. Unless there is memory pressure on the system, the CLR has little reason to perform garbage collection passes all the time, so it doesn't. That doesn't mean the memory is effectively leaked, just that the garbage collector hasn't gotten around to clearing it out yet.

In general, benchmarking any garbage collected language in terms of how much memory it is currently using is not a meaningful comparison, because it depends entirely on the behavior of the garbage collector, which is neither deterministic nor under your control.

Billy ONeal
2

Garbage Collection and no dangling references, I guess. And reflection.

fredoverflow
  • I guess it seems like the .NET bloat replaces the garbage that it's supposed to eliminate. Or am I misunderstanding other aspects of GC? – 0x1mason Jan 14 '11 at 15:20
  • @Erasmus: you are misunderstanding what is meant by garbage collection. It's not about reducing memory footprint. It's about managing resources, specifically not having to delete objects dynamically allocated on the heap. Read about RAII (Resource Acquisition Is Initialization) and smart pointers to understand the unmanaged approach to the problem (a short sketch follows these comments). – Francesco Jan 14 '11 at 15:27
  • Thanks for the recommendation, Francesco. I think I understand what GC does. Perhaps I didn't explain a step in my thinking--namely that poorly managed resources restrict memory reallocation, but .NET is helping reallocation by using memory to do so. In other words, I don't see where the net gain is (pun intended) except in preemption of leaks. Maybe I'm still misunderstanding something, but that's more or less the nature of the problem that puzzled me. – 0x1mason Jan 14 '11 at 15:39
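A minimal C++ sketch of the RAII/smart-pointer approach Francesco's comment describes (an illustration only, not code from the thread; Widget is an invented type):

    // Minimal RAII sketch: ownership is tied to scope, so there is no explicit
    // delete to forget. Widget is a made-up example type.
    #include <memory>
    #include <vector>

    struct Widget {
        int value = 0;
    };

    void useWidgets()
    {
        // unique_ptr owns the heap-allocated Widget; it is deleted automatically
        // when w goes out of scope, even if an exception is thrown first.
        std::unique_ptr<Widget> w(new Widget());
        w->value = 42;

        // Standard containers manage their own heap storage the same way.
        std::vector<int> buffer(1024, 0);
    }   // The Widget and the vector's storage are released here, deterministically.

Unlike garbage collection, the release happens at a deterministic point (scope exit), with no runtime tracking live objects on your behalf.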
1

Are you looking at memory usage in Task Manager? This isn't an accurate way to profile your app: there will be a lot of irrelevant data bundled into that figure.
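For reference, the working-set and private-bytes figures that Task Manager summarizes can also be read programmatically with the Win32 PSAPI call GetProcessMemoryInfo. A minimal sketch, assuming a plain C++ Win32 build:

    // Sketch: query the current process's memory counters via PSAPI.
    // WorkingSetSize is roughly what Task Manager shows; PrivateUsage
    // (private bytes) is usually a better measure of what the app itself uses.
    #include <windows.h>
    #include <psapi.h>
    #include <cstdio>

    #pragma comment(lib, "psapi.lib")

    int main()
    {
        PROCESS_MEMORY_COUNTERS_EX pmc = {};
        pmc.cb = sizeof(pmc);
        if (GetProcessMemoryInfo(GetCurrentProcess(),
                                 reinterpret_cast<PROCESS_MEMORY_COUNTERS*>(&pmc),
                                 sizeof(pmc)))
        {
            std::printf("Working set:   %llu KB\n",
                        static_cast<unsigned long long>(pmc.WorkingSetSize) / 1024);
            std::printf("Private bytes: %llu KB\n",
                        static_cast<unsigned long long>(pmc.PrivateUsage) / 1024);
        }
        return 0;
    }

For a managed build, both figures include the CLR itself, which is part of what this answer is pointing out.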

For a managed app, I'd expect to see an increase due to having the runtime loaded at all. Furthermore, the .NET garbage collector isn't aggressive about reclaiming memory: it tries to delay collections until it's about to run out of memory.

A garbage-collected app tends to allocate memory better than an unmanaged one. Unmanaged memory is liable to fragmentation: certain patterns of allocation leave holes throughout memory, and although there's enough total free memory for a new allocation, the new allocation can be bigger than the largest hole.

The garbage collector avoids fragmentation by compacting memory when it collects: it moves the surviving objects next to each other, leaving one large free region at the end. Subsequent allocations can be quick because of this: the managed allocator doesn't need to look through a list of holes, it can go straight to the end of the heap. [1]
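A toy sketch of that "go straight to the end of the heap" idea (a hypothetical arena allocator, not how the CLR actually implements its heap):

    // Toy bump-pointer arena: allocation just advances an offset into one big
    // block, which is why allocating from a compacted heap is cheap compared
    // with searching a free list for a hole of the right size.
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    class BumpArena {
    public:
        explicit BumpArena(std::size_t capacity) : buffer_(capacity), offset_(0) {}

        // O(1): round the offset up for alignment, bump it, hand out the pointer.
        void* allocate(std::size_t size,
                       std::size_t align = alignof(std::max_align_t))
        {
            std::size_t aligned = (offset_ + align - 1) & ~(align - 1);
            if (aligned + size > buffer_.size())
                return nullptr;                      // arena exhausted
            offset_ = aligned + size;
            return buffer_.data() + aligned;
        }

        // Nothing is freed individually; the whole arena is recycled at once.
        // A compacting GC gets a similar "one big free region" by sliding the
        // live objects together during a collection.
        void reset() { offset_ = 0; }

    private:
        std::vector<std::uint8_t> buffer_;
        std::size_t offset_;
    };

A classic malloc/free heap, by contrast, has to track and search free blocks of many sizes, which is where the fragmentation described above comes from.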

Don't worry about memory usage unless you're allocating a lot of memory.

[1] This doesn't apply to the heap used for large objects (bigger than about 85KB). The large object heap works like a regular malloc/free heap.

Tim Robinson
0

I am a newbie to .NET programming too, but I have done C/C++ programming for a long time. From what I understand, you don't use .NET if you are concerned about memory usage. Any platform with an intermediate layer, like Java or C#, will have higher memory usage. The trade-off compared to plain C/C++ programming, as others have explained, is not just garbage collection and memory management but also richer libraries. It would have taken me at least three times as long to do what I am doing now without .NET, not to mention making more errors and having to handle them. Also, a garbage collector that cleared memory when it didn't really need to would take up CPU time that could otherwise go to your process and would make it wait; so the extra memory overhead of a lazy collector isn't really a bad thing.

DPD
0

Your memory footprint is so small that this probably explains your counter-intuitive results: there's overhead in both systems. Try running a similar test with about 1 GB of memory in use and you should see what you expected to happen.

Glen P