I've seen this code several times:
long lastTime = System.nanoTime();
final double ticks = 60D;
double ns = 1000000000 / ticks;
double delta = 0;
The code above takes the current System.nanoTime() value and stores it in lastTime. The 60 ticks should be the number of updates per second.
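Just to put a number on it (this is my own arithmetic, not part of the snippet), with ticks = 60 the value of ns works out to:

double ns = 1000000000 / 60D; // ≈ 16,666,666.67 ns, i.e. 1/60 of a second

So ns seems to be the length of one tick in nanoseconds.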
while (running) {
    long now = System.nanoTime();
    delta += (now - lastTime) / ns;
    lastTime = now;
    if (delta >= 1) {
        tick();
        delta--;
    }
}
It takes now, subtracts lastTime, and divides the difference by ns (1,000,000,000 / 60). Is there some guarantee that dividing the elapsed time between now and lastTime by ns will make delta reach 1 exactly 60 times per second? I can't understand why tick() would run about 60 times per second. From my calculation, each pass through the loop only increases delta by about 0.0025.
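To check my understanding, I put together a small standalone test (the class name, the counter, and the Thread.sleep(1) are my own additions, not from the original code) that counts how many times delta reaches 1 within one second:

// Standalone sketch: count how often delta crosses 1 during roughly one second of wall-clock time.
public class DeltaTest {
    static int tickCount = 0;

    static void tick() {
        tickCount++; // stand-in for the real update
    }

    public static void main(String[] args) throws InterruptedException {
        long lastTime = System.nanoTime();
        final double ticks = 60D;
        double ns = 1000000000 / ticks;
        double delta = 0;

        long start = System.nanoTime();
        while (System.nanoTime() - start < 1000000000L) { // run for about one second
            long now = System.nanoTime();
            delta += (now - lastTime) / ns; // tiny increment per pass, as in my calculation
            lastTime = now;
            if (delta >= 1) {
                tick();
                delta--;
            }
            Thread.sleep(1); // my own addition so the loop doesn't spin flat out
        }
        System.out.println("tick() ran " + tickCount + " times");
    }
}

If the accumulation works the way I think it does, this should print a number close to 60, but I still don't see what guarantees that.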