So I've written a rather silly program just to play with nanoTime a bit. I wanted to check the execution times of small bits of code, so I figured nanoTime would be the best fit. To determine the average execution time of one short bit of code, I put it inside a for loop. However, inside the for loop the average drops by about 6,000 nanoseconds. I know that isn't a huge difference for small code, but I'm curious why it would differ at all for the exact same code. Here are the two blocks that yield different times. This one averages about 8,064 nanoseconds:
long start, end, totalTime;
double milliseconds, seconds, minutes, hours, days, years;
totalTime = 0;
start = System.nanoTime();
milliseconds = System.currentTimeMillis();
seconds = milliseconds/1000;
minutes = seconds/60;
hours = minutes/60;
days = hours/24;
years = days/365;
end = System.nanoTime();
totalTime = end-start;
And this one averages about 2,200 nanoseconds:
long start, end, totalTime;
double milliseconds, seconds, minutes, hours, days, years;
totalTime = 0;
for(int i = 1; i < 11; i++){
    start = System.nanoTime();
    milliseconds = System.currentTimeMillis();
    seconds = milliseconds/1000;
    minutes = seconds/60;
    hours = minutes/60;
    days = hours/24;
    years = days/365;
    end = System.nanoTime();
    totalTime += end-start;
    System.out.println(end-start); // added so I could manually calculate the average
                                   // and make sure the code was executing properly;
                                   // it does not affect the measured execution time
}
Then, to find the average, you take totalTime * 0.1 (i.e., divide by the 10 iterations).
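In case it helps to reproduce, here's a minimal self-contained sketch of the looped version; the class name TimingDemo, the main wrapper, and the final average println are just my own framing of the same code:

public class TimingDemo {
    public static void main(String[] args) {
        long start, end, totalTime;
        double milliseconds, seconds, minutes, hours, days, years;
        totalTime = 0;
        for (int i = 1; i < 11; i++) {
            start = System.nanoTime();
            milliseconds = System.currentTimeMillis();
            seconds = milliseconds / 1000;
            minutes = seconds / 60;
            hours = minutes / 60;
            days = hours / 24;
            years = days / 365;
            end = System.nanoTime();
            totalTime += end - start;
            System.out.println(end - start); // per-iteration time in nanoseconds
        }
        // average over the 10 iterations (same as totalTime * 0.1)
        System.out.println("average: " + (totalTime / 10.0) + " ns");
    }
}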