In short: Does an un-delayed while loop consume significant processing power, compared to a similar loop which is slowed down by a delay?
In not-so-short:
I have run into this question more than once. I am writing the core part of a program (either for a microcontroller or for a desktop application), and it consists of a near-infinite while loop that keeps the program alive and watches for events.
Take this example: I have a small application that uses an SDL window and the console. In a while loop I want to listen for events from the SDL window, but I also want to be able to break out of the loop from the command line, via a global variable. A possible solution (pseudo-code):
// Global
bool running = true;

// ...
while (running)
{
    if (getEvent() == quit)
    {
        running = false;
    }
}
shutdown();
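In real code the pseudo getEvent() would be an actual event poll; with SDL it would look roughly like the sketch below (assuming SDL has been initialized and a window created elsewhere). SDL_PollEvent() returns immediately whether or not an event is waiting, so the outer loop spins as fast as the CPU allows:

#include <SDL.h>

bool running = true;

void eventLoop()
{
    while (running)
    {
        SDL_Event event;
        while (SDL_PollEvent(&event))      // drain any pending events
        {
            if (event.type == SDL_QUIT)    // e.g. the window's close button
            {
                running = false;
            }
        }
        // No delay here: the outer loop immediately starts over.
    }
}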
The core while loop exits either because of the event it listens for or because of something external. However, this loop runs continuously, maybe even thousands of times per second. That is overkill; I don't need that kind of response time. So I often add a delay statement:
while (running)
{
    if (getEvent() == quit)
    {
        running = false;
    }
    delay(50); // Wait 50 milliseconds
}
This limits the polling rate to at most 20 times per second, which is plenty.
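With SDL specifically, the delay could be SDL_Delay(50), which, as far as I understand, puts the thread to sleep so the OS can schedule other work in the meantime (again a sketch, with the same assumptions as above):

#include <SDL.h>

bool running = true;

void eventLoop()
{
    while (running)
    {
        SDL_Event event;
        while (SDL_PollEvent(&event))
        {
            if (event.type == SDL_QUIT)
            {
                running = false;
            }
        }
        SDL_Delay(50);   // sleep ~50 ms instead of burning CPU cycles
    }
}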
So: is there a real difference between the two? Is it significant? And would it be more significant on a microcontroller, where processing power is very limited, but where nothing needs to run besides the program?
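For what it's worth, this is roughly how I would measure the difference on the desktop side (just a sketch; pollQuit() is a made-up stand-in for the real event check): let the loop run for a fixed amount of wall-clock time and compare how much CPU time the process used, once with the sleep line commented out and once with it enabled.

#include <chrono>
#include <ctime>
#include <iostream>
#include <thread>

// Stand-in for the real event check; always reports "no quit yet".
static bool pollQuit() { return false; }

int main()
{
    using clock = std::chrono::steady_clock;
    const auto wallStart = clock::now();
    const std::clock_t cpuStart = std::clock();   // CPU time used by this process so far

    // Run the loop for 5 seconds of wall-clock time.
    while (clock::now() - wallStart < std::chrono::seconds(5))
    {
        if (pollQuit())
            break;
        // Toggle this line to compare the two variants:
        // std::this_thread::sleep_for(std::chrono::milliseconds(50));
    }

    const double cpuSeconds =
        static_cast<double>(std::clock() - cpuStart) / CLOCKS_PER_SEC;
    std::cout << "CPU time used: " << cpuSeconds << " s over 5 s of wall time\n";
    return 0;
}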