
I've made a cellular automaton (Langton's ant, to be precise) in VBA. At each step there is a `Sleep(delay)` call, where `delay` is a variable. I've also added `DoEvents` at the end of a display routine to ensure that each step is shown on screen. With a `Timer` I can measure how long one step takes on average. The result is plotted on the graph below (Y-axis: time per step in ms; X-axis: `delay` in ms).

[Graph: Time per step (in ms) vs `delay`]

Could you explain why it looks like that? In particular, why does it stay flat over whole ranges of `delay`? I would have expected a (more or less) straight line. I got these results without doing anything else on my computer during the whole process.
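For reference, here is a simplified sketch of the loop (not my exact code; identifiers are illustrative, and the `Declare` is for 32-bit VBA, so 64-bit Office would need `PtrSafe`):

```vba
' Simplified sketch of the measurement loop (illustrative names).
Private Declare Sub Sleep Lib "kernel32" (ByVal dwMilliseconds As Long)

Sub MeasureStepTime(ByVal delay As Long, ByVal nSteps As Long)
    Dim t0 As Double, i As Long
    t0 = Timer                       ' seconds since midnight
    For i = 1 To nSteps
        ' AdvanceAnt                 ' one automaton step (not shown)
        Sleep delay                  ' the delay under test
        DoEvents                     ' let the screen repaint
    Next i
    Debug.Print "Avg ms/step: " & (Timer - t0) * 1000 / nSteps
End Sub
```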

Thank you in advance for your help,

Ezor

  • That would be because the `Sleep API` is based off of the system clock. If the resolution of your clock is lower than the time you are sleeping, then it will round up to the nearest resolution of your system clock. You may be able to call `timeGetDevCaps` to see the minimum timer resolution of your system. – K.Dᴀᴠɪs Jan 15 '18 at 15:28
  • ^ You may want to post that as an answer. Definitely learned something new :). – Brandon Barney Jan 15 '18 at 15:30

1 Answer


That would be because the `Sleep` API is based on the system clock. If the resolution of your clock is lower than the time you are sleeping, it will round up to the nearest multiple of your system clock's resolution. That quantization is exactly what produces the flat plateaus in your graph: every `delay` within one tick interval ends up sleeping for the same actual time. You may be able to call `timeGetDevCaps` to see the minimum timer resolution of your system.

Think about it this way. You have a normal watch with only the usual hour/minute/second hands (no hand for milliseconds). You want to time half a second, but your watch only moves in one-second intervals, so its resolution is one tick per second. You would not know that half a second has actually passed until the full second ticks over, so the measurement is rounded up to the next tick.
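For example, a minimal sketch of such a query from VBA might look like this (32-bit `Declare` shown; on 64-bit Office add `PtrSafe`):

```vba
' Query the system timer's supported resolutions via winmm.dll.
Private Type TIMECAPS
    wPeriodMin As Long   ' minimum supported period, in ms
    wPeriodMax As Long   ' maximum supported period, in ms
End Type

Private Declare Function timeGetDevCaps Lib "winmm.dll" _
    (lpTimeCaps As TIMECAPS, ByVal uSize As Long) As Long

Sub ShowTimerResolution()
    Dim tc As TIMECAPS
    If timeGetDevCaps(tc, Len(tc)) = 0 Then   ' 0 = TIMERR_NOERROR
        Debug.Print "Min period: " & tc.wPeriodMin & " ms"
        Debug.Print "Max period: " & tc.wPeriodMax & " ms"
    End If
End Sub
```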

K.Dᴀᴠɪs
  • That could explain a lot! How do you use `timeGetDevCaps`? Nothing is returned even with this: `Private Declare Function timeGetTime Lib "winmm.dll" () As Long` – Ezor Jan 15 '18 at 16:16
  • Perhaps this document will be of assistance: [Timers, Timer Resolution, and Development of Efficient Code](http://download.microsoft.com/download/3/0/2/3027D574-C433-412A-A8B6-5E0A75D5B237/Timer-Resolution.docx). "_The default timer resolution on Windows 7 is 15.6 milliseconds (ms)_". Since your chart hovers just above 15 ms and 31 ms, this makes sense. – K.Dᴀᴠɪs Jan 15 '18 at 16:36
  • I believe it is possible to change the default resolution, but I do not recommend doing so because I am not too familiar with any potential side effects. – K.Dᴀᴠɪs Jan 15 '18 at 16:37
  • The above document also states: "_Applications can call `timeBeginPeriod` to increase the timer resolution_". From the reading, it appears doing so may _"...increase overall system power consumption and subsequently reduce system battery life."_ – K.Dᴀᴠɪs Jan 15 '18 at 16:46
  • As you can imagine, your processor can handle instructions much much faster than this default resolution. A 2 GHz processor can handle 2 billion cycles per second, but of course you wouldn't want to utilize each cycle to update the timer tick count else you wouldn't have any room for other applications. – K.Dᴀᴠɪs Jan 15 '18 at 16:50
  • Of course, I don't need to improve the timer resolution by much; a resolution on the order of one millisecond is enough here. I'll try to find out how to raise the resolution only while my program is running, so it won't affect the power consumption too much. Once again, many thanks – Ezor Jan 16 '18 at 08:25
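For completeness, raising the resolution only while the simulation runs might look like the sketch below (again 32-bit `Declare`s; 64-bit Office needs `PtrSafe`, and `timeEndPeriod` must be called with the same period value passed to `timeBeginPeriod`):

```vba
' Request 1 ms timer resolution only while the simulation runs.
Private Declare Function timeBeginPeriod Lib "winmm.dll" (ByVal uPeriod As Long) As Long
Private Declare Function timeEndPeriod Lib "winmm.dll" (ByVal uPeriod As Long) As Long

Sub RunAntWithFineTimer()
    timeBeginPeriod 1        ' raise resolution; may cost battery life
    ' ... run the automaton / Sleep loop here ...
    timeEndPeriod 1          ' always restore with the same period value
End Sub
```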