
I have two Virtual PCs running on a host machine, one with Windows Server 2003 and one with CentOS 5.4. I run a Java server on both of them, copied from the same files on the host machine. When both servers are idle (no users online), the Windows server uses 0-4% of its dedicated core while the CentOS server uses 5-15% of its dedicated core. Both copies of the Java server seem to be under the same load at this time, and both are running Sun JDK 1.7.

I started testing this after finding that my dedicated Windows Server 2003 machine runs the Java server much better than one of my CentOS VPSs, and I wanted to rule out virtualization as the cause. (My dedicated server shows about the same CPU usage as the Windows virtual machine.)

Is there any reason why the CentOS server would use more CPU than Windows for around the same amount of work?
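
One way to narrow this down is to measure the Java process's own CPU share inside each VM, so the JVM's usage can be separated from whatever else the guest OS is doing at idle. A minimal sketch along those lines (the class name is illustrative, the getProcessCpuTime() call is specific to Sun/Oracle JDKs, and the 5-second sampling interval is arbitrary):

    import java.lang.management.ManagementFactory;

    // Rough sketch: samples this JVM's own CPU time so the Java server's share
    // can be separated from whatever else the guest OS is doing at idle.
    // The com.sun.management cast only works on Sun/Oracle JDKs.
    public class ProcessCpuSampler {
        public static void main(String[] args) throws InterruptedException {
            com.sun.management.OperatingSystemMXBean os =
                    (com.sun.management.OperatingSystemMXBean)
                            ManagementFactory.getOperatingSystemMXBean();
            int cores = Runtime.getRuntime().availableProcessors();

            long lastCpu = os.getProcessCpuTime();  // CPU nanoseconds used by this JVM so far
            long lastWall = System.nanoTime();

            while (true) {
                Thread.sleep(5000);                 // arbitrary 5-second sampling interval
                long cpu = os.getProcessCpuTime();
                long wall = System.nanoTime();
                double pct = 100.0 * (cpu - lastCpu) / ((wall - lastWall) * (double) cores);
                System.out.printf("JVM process CPU: %.1f%% of %d core(s)%n", pct, cores);
                lastCpu = cpu;
                lastWall = wall;
            }
        }
    }

If the JVM's own figure is similar on both guests, the extra usage reported by CentOS is coming from the rest of the guest OS rather than from the Java server itself.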

Zac Fryle

1 Answer


There can be various factors around what may be installed on one type of server OS versus another, as well as how each handles similar tasks. There are too many variables involved; this is not a one-to-one, apples-to-apples comparison between two drastically different operating systems.

user48838
  • Under stress, the Windows server seems to perform 2-3 times better than the newly installed CentOS server. Is this normal? Is there anything that can be done about it? (The VPS is running quite badly and is the only server I can rely on for high uptime.) – Zac Fryle Sep 18 '11 at 02:06
  • You might try determining whether there are any processes and/or applications on the CentOS platform which can be disabled. That approach may recover some additional processing capacity. Updates to the needed processes and/or supporting applications may also yield some additional efficiencies. – user48838 Sep 18 '11 at 04:56
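
Following up on that suggestion, a complementary check on the JVM side is to see whether the Java server's own threads account for the extra idle CPU, as opposed to other CentOS processes. A minimal sketch using ThreadMXBean (the class and method names are illustrative); it has to run inside the server's JVM, for example from a periodic maintenance task, since a standalone process only sees its own threads:

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadInfo;
    import java.lang.management.ThreadMXBean;

    // Prints the CPU time consumed so far by each live thread in this JVM.
    public class ThreadCpuDump {
        public static void dump() {
            ThreadMXBean threads = ManagementFactory.getThreadMXBean();
            if (!threads.isThreadCpuTimeSupported()) {
                System.err.println("Per-thread CPU time is not supported on this JVM/OS");
                return;
            }
            for (long id : threads.getAllThreadIds()) {
                ThreadInfo info = threads.getThreadInfo(id);
                long cpuNanos = threads.getThreadCpuTime(id);  // -1 if unavailable for this thread
                if (info != null && cpuNanos >= 0) {
                    System.out.printf("%-40s %10.1f ms CPU%n",
                            info.getThreadName(), cpuNanos / 1e6);
                }
            }
        }
    }

If no server thread shows meaningful CPU growth while the guest reports 5-15% usage, the difference is more likely down to other CentOS services, which is where trimming unneeded processes as suggested above would help.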