I am developing a tile-based game. Most of the time the game respects the 60fps cap. However, when I intentionally drop the fps to 10 for 5 seconds (via debugging) to verify that rendering and game logic run independently, something odd happens: after I set the fps back to 60, the game ignores the cap for a few seconds and runs at something like 10,000fps, then settles back under the cap.

The reason for all of this is that I want the game to run at a variable frame rate but a constant logic speed on all machines. What I don't understand is why the game respects the cap most of the time and never goes above 60fps, yet after a slowdown or lag spike it runs far too fast for 1-2 seconds before returning to normal. Even if the user is on a weak computer and the game only manages 5fps, I want to ensure it never goes above my hard cap.
Do I need to implement a "skipped frames" system or something similar? For reference, my game logic is very light, but the rendering is fairly heavy. My loop is below; after it I've sketched the kind of fix I have in mind.
new Thread() {
    public void run() {
        long hintPoint = System.currentTimeMillis();  // time of the previous render, for the FPS estimate
        long nexttickG = System.currentTimeMillis();  // next scheduled render (graphics)
        long nexttickL = System.currentTimeMillis();  // next scheduled logic tick
        long nexttickV = System.currentTimeMillis();  // next FPS-hint update
        long ticktimeV = 1000;                        // FPS hint refresh interval (ms)
        long LogicCap = 60;                           // logic ticks per second
        long ticktimeL = 1000 / LogicCap;             // ms per logic tick

        short frames = 60;                            // most recent FPS estimate

        while (isRunning) {
            // Dedicated-server mode: just idle instead of rendering
            if (Main.servbot) {
                try {
                    Thread.sleep(15);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }

            GLOBAL_FPS_CAP = (short) Utils.Clamp(GLOBAL_FPS_CAP, 1, 333);
            short FramesCap = GLOBAL_FPS_CAP;
            long ticktimeG = 1000 / FramesCap;        // ms per rendered frame at the current cap

            // Render once the next scheduled frame time has passed
            if (System.currentTimeMillis() > nexttickG) {
                Render();
                long elapsed = System.currentTimeMillis() - hintPoint;
                frames = (short) (1000 / Math.max(elapsed, 1));  // instantaneous FPS
                hintPoint = System.currentTimeMillis();
                nexttickG += ticktimeG;
            }

            // Run game logic at a fixed rate, independent of rendering
            if (System.currentTimeMillis() > nexttickL) {
                GameLoop();
                nexttickL += ticktimeL;
            }

            // Publish the FPS hint once per second
            if (System.currentTimeMillis() > nexttickV) {
                GLOBAL_FPS_HINT = frames;
                nexttickV += ticktimeV;
            }

            Thread.yield();
        }
    }
}.start();
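Here is the kind of "skipped frames" change I have in mind for the render scheduler. This is only a sketch, based on my guess that the burst happens because nexttickG falls behind real time during the slowdown (each render takes longer than ticktimeG, so a deficit accumulates), and the now > nexttickG check then passes on every single iteration until the schedule catches up:

    // Sketch only: drop missed frames instead of replaying them.
    // Guess: during a slowdown each render takes longer than ticktimeG,
    // so nexttickG falls further behind real time on every frame; once
    // the slowdown ends, the loop renders back-to-back to catch up.
    long now = System.currentTimeMillis();
    if (now > nexttickG) {
        Render();
        nexttickG += ticktimeG;
        if (nexttickG < now) {
            // More than one frame behind: skip the missed frames
            // rather than rendering them all as fast as possible.
            nexttickG = now;
        }
    }

For the logic tick I assume I should leave the catch-up behavior alone, since replaying missed logic ticks is exactly what keeps the simulation speed constant. Is this clamp the standard approach, or is there a better way?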