I am seeing (what is to me) strange runtime behaviour in the following code:
public class Main{
    private final static long ROUNDS = 1000000;
    private final static double INITIAL_NUMBER = 0.45781929d;
    private final static double DIFFERENCE = 0.1250120303d;

    public static void main(String[] args){
        doSomething();
        doSomething();
        doSomething();
    }

    private static void doSomething(){
        long begin, end;
        double numberToConvert, difference;
        numberToConvert = INITIAL_NUMBER;
        difference = DIFFERENCE;
        begin = System.currentTimeMillis();
        for(long i=0; i<ROUNDS; i++){
            String s = "" + numberToConvert;
            if(i % 2 == 0){
                numberToConvert += difference;
            }
            else{
                numberToConvert -= difference;
            }
        }
        end = System.currentTimeMillis();
        System.out.println("String appending conversion took " + (end - begin) + "ms.");
    }
}
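(Aside: the String s = "" + numberToConvert; line should produce the same string as calling Double.toString directly; I have not measured whether writing it that way changes the timings, it is only shown here for reference.)

        // Equivalent conversion without the implicit StringBuilder from "" + ...
        // (not timed, shown only for reference)
        String s = Double.toString(numberToConvert);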
I would expect the program to print out similar runtimes each time. However, the output I get is always like this:
String appending conversion took 473ms.
String appending conversion took 362ms.
String appending conversion took 341ms.
The first call is about 30% slower than the calls afterwards. Most of the time, the second call is also slightly slower than the third call.
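For what it's worth, here is a sketch of how I could repeat the measurement more often to see after how many runs the timings settle (the count of 10 is an arbitrary choice; I have not included those numbers here):

    public static void main(String[] args){
        // Call doSomething() repeatedly to see when the reported times stabilise.
        // (10 runs is an arbitrary number picked for illustration.)
        for(int run = 0; run < 10; run++){
            doSomething();
        }
    }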
java/javac versions:
javac 1.7.0_09
java version "1.7.0_09"
OpenJDK Runtime Environment (IcedTea7 2.3.3) (7u9-2.3.3-0ubuntu1~12.04.1)
OpenJDK 64-Bit Server VM (build 23.2-b09, mixed mode)
So, my question: Why does this happen?