I have two Virtual PCs running Windows Server 2003 and CentOS 5.4 on a host machine, and I run a Java server on both of them. I copied the Java server's files from the host machine onto both virtual machines. When both servers are idle (no users online), the Windows server uses 0-4% of its dedicated core while the CentOS server uses 5-15% of its dedicated core. Both copies of the Java server appear to be under the same load at this time, and both are using Sun JDK 1.7.
I started testing this after finding that my dedicated Windows Server 2003 machine runs the Java server much better than one of my CentOS VPSs, and I wanted to rule out a virtualization issue. (The dedicated server seems to show the same CPU usage as the virtual Windows host.)
Is there any reason why the CentOS server would be using more CPU than Windows with around the same amount of work?
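If it helps narrow things down, something like the sketch below (just the standard `java.lang.management.ThreadMXBean` API; `dumpThreadCpu` is my own helper name) is what I plan to call from inside the server on both machines to see which threads are actually using CPU while it sits idle:

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadInfo;
    import java.lang.management.ThreadMXBean;

    public class ThreadCpuDump {

        // Call this from inside the server's JVM (e.g. from a periodic logging
        // task) and compare the output on the Windows and CentOS guests.
        public static void dumpThreadCpu() {
            ThreadMXBean bean = ManagementFactory.getThreadMXBean();

            // Per-thread CPU time isn't supported or enabled on every JVM/OS combination.
            if (!bean.isThreadCpuTimeSupported()) {
                System.out.println("Thread CPU time not supported on this JVM");
                return;
            }
            if (!bean.isThreadCpuTimeEnabled()) {
                bean.setThreadCpuTimeEnabled(true);
            }

            // Cumulative CPU time per live thread, in milliseconds.
            for (long id : bean.getAllThreadIds()) {
                ThreadInfo info = bean.getThreadInfo(id);
                long cpuNanos = bean.getThreadCpuTime(id);
                if (info != null && cpuNanos >= 0) {
                    System.out.printf("%-40s %8d ms%n",
                            info.getThreadName(), cpuNanos / 1000000L);
                }
            }
        }

        public static void main(String[] args) {
            dumpThreadCpu();
        }
    }

On the CentOS side I could also run `top -H -p <pid>` and match the busy native thread IDs against a `jstack <pid>` dump, if that is a more useful way to compare.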