This is not the first time I've had problems with clock times in a programming language. I'm measuring how fast a function runs by calling it repeatedly in a while loop. The problem is that the elapsed time keeps getting shorter the longer the loop runs. Can anyone explain? Code below.
    import org.apache.commons.math3.stat.descriptive.DescriptiveStatistics; // Apache Commons Math

    DescriptiveStatistics stats = new DescriptiveStatistics();
    while (true) {
        long startTime = System.nanoTime();
        executeSaxonXsltTransformation();          // the function being benchmarked
        long stopTime = System.nanoTime();
        long elapsedTime = stopTime - startTime;   // elapsed time in nanoseconds
        stats.addValue((double) elapsedTime);
        System.out.println(stats.getN() + " - " + elapsedTime + " - " + stats.getMean());
    }
After about 1,000 runs the elapsed time is 750k to 850k ns, but after about 100,000 runs it drops to 580k to 750k ns. The continual decrease is easiest to see in the running average (stats.getMean()): after 108k loops the average is ~632k, compared with ~1 million after only 3k loops. Switching to System.currentTimeMillis() instead of System.nanoTime() doesn't change anything.
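For completeness, the millisecond variant I tried is just the same loop with the timer calls swapped out (shown here as a rough sketch; statsMs is only a renamed accumulator, everything else is identical to the code above):

    // Same measurement loop, but using wall-clock milliseconds instead of nanoseconds.
    DescriptiveStatistics statsMs = new DescriptiveStatistics();
    while (true) {
        long startTime = System.currentTimeMillis();
        executeSaxonXsltTransformation();
        long stopTime = System.currentTimeMillis();
        long elapsedTime = stopTime - startTime;   // elapsed time in milliseconds
        statsMs.addValue((double) elapsedTime);
        System.out.println(statsMs.getN() + " - " + elapsedTime + " - " + statsMs.getMean());
    }

It shows the same pattern: the per-call time and the running average keep drifting downward as the loop runs.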