
A two-parter (if necessary, I can separate the questions to award separate right answers):

We're in a situation where running in server mode is likely appropriate: we have two enterprise-level apps running on the same farm. If I set the app-config collection modes of both apps to "Server Mode", am I risking hurting one app every time the other one GCs?

Will the Network Load Balancer shift traffic away from a machine in the middle of GC (again in server mode)?

EDIT: I broke the load-balancing part of this question out over here: Is the network load balancer of a web farm affected by GC strain?
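For clarity, here is the switch I mean, as a minimal app.config sketch (as I understand it, for ASP.NET-hosted apps the hosting process's configuration governs this instead, and server GC is already the default there on multi-CPU machines):

```xml
<!-- Minimal app.config sketch: opts a standalone .NET executable into
     server GC. ASP.NET worker processes default to server GC on
     multiprocessor machines, so this mainly matters for self-hosted apps. -->
<configuration>
  <runtime>
    <!-- true = server GC: one managed heap and one GC thread per CPU -->
    <gcServer enabled="true"/>
  </runtime>
</configuration>
```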

Rikon

3 Answers


If you are concerned about two enterprise apps influencing each other's performance, you should consider moving them into two separate VMs.

GC is optimised and runs in its own thread(s). It is designed to be invisible to the current application, so on a multiprocessor enterprise server a separate process should not be damaged at all.

On the other hand, the server is still taking some load from the GC. If you feel that GC somehow slows down your applications, you could do some memory and CPU profiling to see where the problem is. You may find a way to optimize the code and use fewer resources.
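As a quick first check (a sketch, assuming .NET 4, where `GCSettings.IsServerGC` is available), you can confirm which mode actually took effect and watch how often each generation is collected before reaching for a full profiler:

```csharp
// Sketch: confirm the effective GC mode and log collection counts.
// GCSettings.IsServerGC requires .NET 4; on older runtimes you would
// have to infer the mode from configuration instead.
using System;
using System.Runtime;
using System.Threading;

class GcCheck
{
    static void Main()
    {
        // True only if the process actually got server GC (multi-CPU
        // machine plus <gcServer enabled="true"/> in configuration).
        Console.WriteLine("Server GC: {0}", GCSettings.IsServerGC);

        while (true)
        {
            Console.WriteLine("Gen0={0} Gen1={1} Gen2={2} Heap={3:N0} bytes",
                GC.CollectionCount(0),
                GC.CollectionCount(1),
                GC.CollectionCount(2),
                GC.GetTotalMemory(false));
            Thread.Sleep(5000);
        }
    }
}
```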

From Jeffrey Richter, *CLR via C#* (3rd edition), p. 585:

This mode fine-tunes the garbage collector for server-side applications. The garbage collector assumes that no other applications (client or server) are running on the machine and it assumes that all the CPUs on the machine are available to do a garbage collection. This GC mode causes the managed heap to be split into several sections, one per CPU. When a garbage collection is initiated, the garbage collector has one thread per CPU; each thread collects its own section in parallel with the other threads. Parallel collections work well for server applications in which the worker threads tend to exhibit uniform behavior. This feature requires the application to be running on a computer with multiple CPUs so that the threads can truly be working simultaneously to attain a performance improvement.

oleksii
    Your second paragraph is incorrect. Please note that in Server Mode, garbage collection has a process for each core and the application is essentially paused while collection takes place. – Greg Aug 03 '11 at 18:36
  • To your point, I have no desire to micro manage the GC in either of these apps, but it was my understanding though that putting the GC Mode in "Server mode" basically just gave the CLR liberty to be a bit more ravenous w/ resources, parallelizing out over the cores and such. Is this erroneous? – Rikon Aug 03 '11 at 18:38
  • @Greg, sorry, but it is correct. From [docs](http://msdn.microsoft.com/en-us/library/gg425446.aspx): "The server GC is available only on multiprocessor computers. It creates a separate managed heap and **thread** for each processor and performs collections in parallel". – oleksii Aug 03 '11 at 18:39
  • @oleksii - According to the docs I can find (linked above), in Server Mode the application does not run during garbage collection. Depending on the application's characteristics, this may benefit performance or damage performance. MSDN: "It creates a separate managed heap and thread for each processor and performs collections in parallel. During collection, all managed threads are paused" – Greg Aug 03 '11 at 18:47
  • @oleksii: If you'll repost your comments about the load-balancer over to this question: http://stackoverflow.com/questions/6931831/is-the-network-load-balancer-of-a-web-farm-affected-by-gc-strain I think I'll separate the argument the way I should have and remove the load-balancing element of it from here... Thanks for your answer! – Rikon Aug 03 '11 at 19:04
  • @Greg I updated my answer, do you think it is still incorrect? Tnx :) – oleksii Aug 04 '11 at 08:23
  • I agree with it all except for "So on a multiprocessor enterprise server, a separate process shall not be damaged at all." I think that bursts of GC activity on all processors could affect other processes. And I agree that profiling the performance of the applications will best answer the OP's question. – Greg Aug 05 '11 at 14:53

From MSDN (emphasis added):

Workstation is the default GC mode and the only one available on single-processor computers. Workstation GC is hosted in console and Windows Forms applications. It performs full (generation 2) collections concurrently with the running program, thereby minimizing latency. This mode is useful for client applications, where perceived performance is usually more important than raw throughput.

The server GC is available only on multiprocessor computers. It creates a separate managed heap and thread for each processor and performs collections in parallel. **During collection, all managed threads are paused** (threads running native code are paused only when the native call returns). In this way, the server GC mode maximizes throughput (the number of requests per second) and improves performance as the number of processors increases. Computers with four or more processors offer enhanced performance. All managed code applications using Lync Server Application API should use the server GC.

In Server Mode, GC runs periodically on all processors in parallel. This could rob other applications of CPU time during collection.

So Server Mode might improve the performance of an application, but it might degrade the performance of other applications running on the same server. It's all very speculative; I think you'll have to do some benchmarking to test the responsiveness and throughput of your applications to know for sure.
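As a rough starting point for that benchmarking (a sketch only, not a substitute for driving production-like load), a managed thread that sleeps briefly in a loop and records any much larger gap will surface GC suspensions, since a paused managed thread cannot wake on time:

```csharp
// Sketch: crude GC-pause detector. A managed thread sleeps 1 ms in a
// loop; any gap far beyond the sleep (and beyond the ~15 ms default
// Windows timer resolution) is likely time spent suspended for GC.
using System;
using System.Diagnostics;
using System.Threading;

class PauseWatcher
{
    static void Main()
    {
        long maxGapMs = 0;
        var sw = Stopwatch.StartNew();
        long last = sw.ElapsedMilliseconds;

        while (sw.Elapsed < TimeSpan.FromMinutes(1)) // sample for a minute
        {
            Thread.Sleep(1);
            long now = sw.ElapsedMilliseconds;
            long gap = now - last - 1;               // time beyond the sleep
            if (gap > maxGapMs) maxGapMs = gap;
            last = now;
        }

        Console.WriteLine("Worst observed stall: {0} ms", maxGapMs);
        Console.WriteLine("Gen2 collections during the run: {0}", GC.CollectionCount(2));
    }
}
```

Run it alongside your application under load in both GC modes and compare the worst stall and the Gen2 counts.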

Greg

I have seen runaway memory allocation due to a lazy GC affect seemingly unrelated processes before. However, it does not appear that server mode does that (hog system memory to decrease collection frequency), so you should be fine.

Joshua