Bear with me, this might be a little difficult to explain clearly. I am trying to understand how to write a program that uses only as much CPU as it actually needs, so I will explain with a real example.
I made a Tetris game with an infinite main game loop. I have capped it at 40 fps, but the loop itself still executes thousands or even millions of times per second; it only updates and renders once enough time has passed to hold the 40 fps cap.
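The structure is roughly like this (a minimal C++ sketch of what I mean; updateGame and renderFrame are hypothetical stand-ins for my actual Tetris code):

```cpp
#include <chrono>

void updateGame() {}   // stand-in for my game logic
void renderFrame() {}  // stand-in for my rendering

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameTime = std::chrono::milliseconds(25); // 40 fps = 25 ms per frame
    auto nextFrame = clock::now() + frameTime;

    while (true) {                        // infinite main loop
        if (clock::now() >= nextFrame) {  // enough time passed: do one frame
            updateGame();
            renderFrame();
            nextFrame += frameTime;
        }
        // No sleep here, so the loop spins as fast as one core allows,
        // even though most iterations do nothing.
    }
}
```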
Since I have a 4-core CPU, when I run the game everything is fine and it runs well, but CPU usage sits at 25% for the game process (one core fully busy). This is expected, since the loop never stops spinning.
I then read online that I should add a 1 ms delay to the main loop. This immediately reduced the usage to around 1% or less. That is good, but now I am deliberately waiting 1 ms on every iteration. It only works because my main loop takes far less than a frame to execute, so the 1 ms delay does not affect the game.
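Concretely, the only change was sleeping at the bottom of the loop (again a rough sketch of what I did, not my exact code):

```cpp
#include <chrono>
#include <thread>

void updateGame() {}   // stand-in for my game logic
void renderFrame() {}  // stand-in for my rendering

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameTime = std::chrono::milliseconds(25); // 40 fps = 25 ms per frame
    auto nextFrame = clock::now() + frameTime;

    while (true) {
        if (clock::now() >= nextFrame) {
            updateGame();
            renderFrame();
            nextFrame += frameTime;
        }
        // The only change: give up the CPU for ~1 ms each iteration.
        // Usage drops from ~25% to ~1%, but every iteration now pays
        // at least a 1 ms penalty whether it needs to or not.
        std::this_thread::sleep_for(std::chrono::milliseconds(1));
    }
}
```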
But what if I make larger games, with longer and more processor-intensive loops? What if I need that 1 ms for the game to run smoothly? Then I am stuck: if I remove the delay, CPU usage jumps back to 25%; if I keep it, the game will be slow and maybe lag.
What is the ideal solution in this case? How are real games/applications coded to prevent this problem?