I read on MSDN that although timers cannot guarantee firing at the exact interval (in my case 1 second), they will not fire before the interval has elapsed.
The timer works fine on one PC (Windows 7), while on the other (Windows Server 2003) it fires every 0.99999936 seconds.
I'm really interested in why this is happening.
I noticed this because I had code counting seconds as newSeconds = newSeconds + delta.Seconds, where delta was DateTime.Now - lastTime. The Seconds part was showing 1 on Windows 7 but 0 on Windows Server 2003. The solution was to read TotalSeconds instead, but I still wonder why the timer fires before its interval.
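To illustrate why Seconds showed 0, here is a minimal sketch (the tick count 9999993 is my own stand-in for a delta just under one second, similar to the measured 0.99999936 s; I avoid TimeSpan.FromSeconds because it rounds to the nearest millisecond):

    using System;

    class SecondsVsTotalSeconds
    {
        static void Main()
        {
            // A delta just under one second, built from 100 ns ticks
            // (10,000,000 ticks per second).
            var delta = TimeSpan.FromTicks(9999993);

            // Seconds is the truncated whole-seconds component, so it is 0 here.
            Console.WriteLine(delta.Seconds);

            // TotalSeconds is the full duration as a double (~0.9999993),
            // which is why accumulating it gives the expected count.
            Console.WriteLine(delta.TotalSeconds);
        }
    }

So any tick that arrives even fractionally early contributes 0 via Seconds, while TotalSeconds still captures nearly the full second.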
Can anyone elaborate on this?
Edit
I actually have it happening on two different Windows 2003 PCs. My wondering goes deeper into the area of: is there a difference between the OSes? Is .NET Framework 4 different on Windows 7 vs. Windows Server 2003? Or are there any other deviations people might know of? How are the timers implemented; could it be a hardware-related issue?
And as opposed to this one: C# timer getting fired before their interval time, I have it happening all the time, on every tick; no long running is needed. Thanks.
Edit
public void OnTick(object sender, EventArgs e)
{
    var delta = DateTime.Now - _lastTime;
    // DoStuff
    _lastTime = DateTime.Now;
}
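For what it's worth, a variant of the handler above that accumulates TotalSeconds and measures with Stopwatch (which is monotonic and unaffected by system-clock adjustments) might look like this. This is only a sketch; the _elapsedSeconds field, the ElapsedSeconds property, and the Stopwatch field are my additions, not part of the original code:

    using System;
    using System.Diagnostics;

    class TickCounter
    {
        private readonly Stopwatch _watch = Stopwatch.StartNew();
        private double _lastSeconds;
        private double _elapsedSeconds;

        public double ElapsedSeconds => _elapsedSeconds;

        public void OnTick(object sender, EventArgs e)
        {
            double now = _watch.Elapsed.TotalSeconds;
            // Accumulate the fractional delta rather than the truncated
            // Seconds component, so a 0.99999936 s tick still contributes
            // its full value over time instead of being counted as 0.
            _elapsedSeconds += now - _lastSeconds;
            _lastSeconds = now;
            // DoStuff
        }
    }

This sidesteps both the truncation issue and any DateTime.Now resolution quirks between the two operating systems.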