I can't really find anything useful on this question, but I've been wondering for quite some time how timers and delays in any programming language work at a low level.
As far as I understand, a CPU continuously executes instructions on all of its cores, as fast as its clock speed allows, for as long as there are instructions to execute (i.e. there is a running, active thread).
I don't see a straightforward way to tie this flow to real time. So I wonder how things like animations work, which come up in many, many situations:
- In Windows 7, the Start menu button gradually glows brighter when you hover the mouse over it;
- In Flash, there is a timeline, and every object in the document is animated according to the FPS setting and that timeline;
- jQuery supports various animations;
- Delays in code execution...
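To make that last point concrete, here is the kind of delay I mean, sketched in Python purely as an example (the same idea exists in pretty much every language):

```python
import time

start = time.time()
time.sleep(0.1)  # ask for a ~100 ms pause -- what happens underneath?
elapsed = time.time() - start
print(f"slept for roughly {elapsed:.3f} s")
```

Somewhere, something has to know that roughly 100 ms of real time have passed before the program resumes, and that is exactly the part I don't understand.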
Do computers (motherboards) have physical timers, in the same way a CPU has registers to perform its operations and to hold data between calculations? I haven't found anything about that on the internet. Or does the OS contain some really complex programming that provides the lowest-level API for everything related to timing?
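The only pure-software delay I can picture is a busy loop, but even that has to read a clock from somewhere, which seems to bring the question back to hardware. A Python sketch of what I mean (names and numbers here are just my illustration):

```python
import time

def busy_wait(seconds):
    """Spin until `seconds` of wall-clock time have passed.

    Even this naive approach relies on time.time(), i.e. some clock
    the OS exposes -- it can't be done with instruction execution alone.
    """
    deadline = time.time() + seconds
    while time.time() < deadline:
        pass  # burn CPU cycles until the deadline is reached

start = time.time()
busy_wait(0.05)
print(f"waited {time.time() - start:.3f} s")
```

So either way, whether sleeping or spinning, some source of real time has to exist below the program.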
I'm really curious about the answer.