I am running into a weird issue when trying to compare the performance of the data types 'int' and 'long'. Basically, I have two unit tests:
@Test
public void testLongOperationPerformance() {
    StopWatch sw = new StopWatch();
    sw.start();
    long count = 0L;
    for (int i = 0; i < Integer.MAX_VALUE; i++) {
        count++;
    }
    sw.stop();
    System.out.println(count);
    System.out.println(sw.elaspedTimeInMilliSeconds());
}
@Test
public void testIntegerOperationPerformance() {
    StopWatch sw = new StopWatch();
    sw.start();
    int count = 0;
    for (int i = 0; i < Integer.MAX_VALUE; i++) {
        count++;
    }
    sw.stop();
    System.out.println(count);
    System.out.println(sw.elaspedTimeInMilliSeconds());
}
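(StopWatch here is not a JDK class; assuming it is just a thin wrapper around System.nanoTime(), a minimal stand-in that matches the calls used above would look something like this:)

// Hypothetical sketch of the StopWatch helper used in the tests above --
// the real class isn't shown, so this just mirrors the methods it exposes.
public class StopWatch {
    private long startNanos;
    private long stopNanos;

    public void start() {
        startNanos = System.nanoTime();
    }

    public void stop() {
        stopNanos = System.nanoTime();
    }

    public long elaspedTimeInMilliSeconds() {
        return (stopNanos - startNanos) / 1000000;
    }
}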
These two unit tests do the same thing; the only difference is that one uses int as the data type for the counter and the other uses long. The results:
jdk6u32 (64 bit):
test with long
2147483635
96
test with int
2147483647
2
jdk7 (64 bit):
test with long
2147483647
1599
test with int
2147483647
1632
I noticed:
- in jdk6u32, the test with int is much faster than the test with long
- in jdk6u32, the two tests print different final counts
- in jdk7, both tests run at about the same speed, and both are much slower than on jdk6u32
- in jdk7, both tests print the same final count
Can anyone explain why this happens?