12
  Debug.WriteLine("Timer is high-resolution: {0}", Stopwatch.IsHighResolution);
  Debug.WriteLine("Timer frequency: {0}", Stopwatch.Frequency);

Result:

  Timer is high-resolution: True
  Timer frequency: 2597705

This article (from 2005!) mentions a Frequency of 3,579,545, about a million more than mine. This blog post mentions a Frequency of 3,325,040,000, which is insane.

Why is my Frequency so much lower by comparison? I'm on an i7 920 machine, so shouldn't it be faster?

xofz
  • Is it a laptop? Often times the clock rate of a machine is scaled depending on the battery/power status. – Nick Feb 26 '10 at 23:58
  • It's a desktop. I have power settings set to High Performance mode; the processor clock rate is at default (2.66 GHz). – xofz Feb 27 '10 at 00:01

4 Answers

27

3,579,545 is the magic number. That's the frequency in Hertz before it is divided by 3 and fed into the 8253 timer chip in the original IBM PC. The odd-looking number wasn't chosen by accident: it is the frequency of the color burst signal in the NTSC TV system used in the US and Japan. The IBM engineers were looking for a cheap crystal to implement the oscillator, and nothing was cheaper than the one used in every TV set.

Once IBM clones became widely available, it was still important for their designers to choose the same frequency. A lot of MS-DOS software relied on the timer ticking at that rate. Directly addressing the chip was a common crime.

That changed once Windows came around. A version of Windows 2 was the first one to virtualize the timer chip. In other words, software wasn't allowed to directly address the timer chip anymore. The processor was configured to run in protected mode and intercepted the attempt to use the I/O instruction, running kernel code instead so that the return value of the instruction could be faked. It was now possible to have multiple programs using the timer without them stepping on each other's toes. That was an important first step toward breaking the dependency on how the hardware is actually implemented.

The Win32 API (Windows NT 3.1 and Windows 95) formalized access to the timer with an API, QueryPerformanceCounter() and QueryPerformanceFrequency(). A kernel-level component, the Hardware Abstraction Layer, allows the BIOS to pass that frequency. Now it was possible for the hardware designers to really drop the dependency on the exact frequency. That took a long time, by the way; around 2000 the vast majority of machines still had the legacy rate.
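
When IsHighResolution is true, Stopwatch is essentially a thin wrapper over those two calls. A minimal P/Invoke sketch of using them directly (the class and variable names here are mine, not part of the question or the API):

  using System;
  using System.Runtime.InteropServices;

  class QpcSketch
  {
      // The two Win32 calls mentioned above; Stopwatch sits on top of them
      // when IsHighResolution is true.
      [DllImport("kernel32.dll")]
      static extern bool QueryPerformanceFrequency(out long frequency);

      [DllImport("kernel32.dll")]
      static extern bool QueryPerformanceCounter(out long count);

      static void Main()
      {
          long frequency, start, end;
          QueryPerformanceFrequency(out frequency);   // counts per second, fixed at boot
          QueryPerformanceCounter(out start);

          // ... the code being timed goes here ...

          QueryPerformanceCounter(out end);
          double elapsedSeconds = (end - start) / (double)frequency;
          Console.WriteLine("Frequency: {0} Hz, elapsed: {1:F6} s",
              frequency, elapsedSeconds);
      }
  }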

But the never-ending quest to cut costs in PC design put an end to that. Nowadays the hardware designer just picks any frequency that happens to be readily available in the chipset. 3,325,040,000 would be such a number; it is most probably the CPU clock rate. High frequencies like that are common in cheap designs, especially the ones that have an AMD core. Your number is pretty unusual; there are decent odds that your machine wasn't cheap. And the timer is likely a lot more accurate, since CPU clocks have typical electronic component tolerances.

Hans Passant
  • nobugz, thank you for the detailed answer. Could you expand on your last paragraph? It seems to me that a higher frequency would grant increased precision: with a timer frequency of 3.3 GHz, I'm at 0.3 ns resolution, whereas with my frequency, I'm at 385 ns. – xofz Feb 27 '10 at 02:34
  • Well, you've got a lot less resolution. But your timer is probably a lot more accurate. The 3.3 GHz CPU clock rate is typically only accurate to within 10%. I don't know that for a fact; it depends on how the signal gets generated. Anything running at one megahertz or better is plenty good enough for timing software; the jitter due to threading is a lot worse than that. – Hans Passant Feb 27 '10 at 09:35
  • @SamPearson So possibly the guy from the second post, with 3,325,040,000 ticks per second, had `Stopwatch.IsHighResolution` set to False, since his system used the CPU "clock" directly, leading to ±10% in precision according to Hans. If this is what is meant by `IsHighResolution`, I think the choice of the term "resolution" is unfortunate, since clearly 3,325,040,000 counts per second is a higher "resolution" than 2,597,705 counts per second. Resolution is something other than accuracy or uncertainty. – Jeppe Stig Nielsen Mar 03 '14 at 15:43
  • I can imagine two types of inaccuracy. (1) The stated frequency could be imprecise, compared to the true SI second (atomic time), so that over time, the `Stopwatch` runs consistently too slow or consistently too fast. (2) The actual realized frequency could fluctuate, depending on the state of the entire system, or even the state of the power supply, the temperature, etc., leading to a non-uniform `Stopwatch` (sometimes slower than other times). Problem (1) is bad if you want to compete with real stopwatches. Problem (2) is worst if you use `Stopwatch` for comparing performances. – Jeppe Stig Nielsen Mar 03 '14 at 15:51
  • @JeppeStigNielsen That would be a misunderstanding of his explanation; if IsHighResolution is set to false, the frequency would be equal to the "DateTime" ticks, which is 10,000,000/sec. Unfortunately it isn't updated at each tick and can be as slow as an update every ~150,000 ticks (maybe even less frequent?) – Jens Nov 14 '16 at 06:49
7

The frequency depends on the HAL (Hardware Abstraction Layer). Back in the Pentium days, it was common to use the CPU tick counter (which was based on the CPU clock rate), so you ended up with really high-frequency timers.

With multi-processor and multi-core machines, and especially with variable-rate CPUs (the CPU clock slows down for low power states), using the CPU tick as the timer becomes difficult and error-prone, so the writers of the HAL seem to have chosen to use a slower but more reliable hardware clock, like the real-time clock.

John Knoeller
1

The Stopwatch.Frequency value is per second, so your frequency of 2,597,705 means you have more than 2.5 million ticks per second. Exactly how much precision do you need?

As for the variations in frequency, that is a hardware-dependent thing. Some of the most common hardware differences are the number of cores, the frequency of each core, the current power state of your CPU (or cores), whether you have enabled the OS to dynamically adjust the CPU frequency, etc. Your frequency will not always be the same, and depending on what state your CPU is in when you check it, it may be lower or higher, but generally around the same (for you, probably around 2.5 million).
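
To put the numbers in perspective, one tick lasts 1e9 / Frequency nanoseconds, which is the same arithmetic as in the comments under the accepted answer. A small sketch using nothing beyond the standard Stopwatch API (the class name is mine):

  using System;
  using System.Diagnostics;

  class TickMath
  {
      static void Main()
      {
          // One Stopwatch tick lasts 1e9 / Frequency nanoseconds:
          // ~385 ns at 2,597,705 ticks/sec, ~0.3 ns at 3,325,040,000 ticks/sec.
          double nsPerTick = 1e9 / Stopwatch.Frequency;
          Console.WriteLine("Resolution: {0:F1} ns per tick", nsPerTick);

          Stopwatch sw = Stopwatch.StartNew();
          System.Threading.Thread.Sleep(100);    // stand-in for the code being measured
          sw.Stop();

          // ElapsedTicks are raw counter ticks; Elapsed and ElapsedMilliseconds
          // already apply the Frequency conversion for you.
          Console.WriteLine("Elapsed: {0} ticks = {1} ms",
              sw.ElapsedTicks, sw.ElapsedMilliseconds);
      }
  }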

jrista
0

I think 2,597,705 is your processor frequency. Mine is 2,737,822 on an i7 930.

Pedro77
  • Hardly. 2.6 MHz would be a pretty slow processor frequency these days ;) – Christoph Rüegg Sep 30 '13 at 21:32
  • Yeap, you are right! Sorry for that. The guy did something wrong. I think the Stopwatch doesn't work very well in debug mode. – Pedro77 Oct 01 '13 at 12:12
  • :) Stopwatch & debug... so many articles I'd want to point back to... it comes down to this: performance measuring is **highly** tricky. Just for fun, my frequency: 3,320,390, i7 2600K. – Noctis Jul 29 '14 at 22:46