I'm doing work involving benchmarking algorithms.
I read about the new <chrono>
header in C++11, so I went with that.
I can take measurements and everything, but I am struggling with resolution.
When doing something like
auto duration = chrono::duration_cast<chrono::nanoseconds>(end_time - start_time).count();
I consistently get times that are multiples of 1000!
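For context, here's a minimal, self-contained sketch of what I'm doing (work() is a hypothetical stand-in for the algorithm I'm actually benchmarking):

#include <chrono>
#include <iostream>

using namespace std;

// Hypothetical stand-in for the code under test.
static void work() {
    volatile long x = 0;
    for (int i = 0; i < 100000; ++i)
        x += i;
}

int main() {
    auto start_time = chrono::high_resolution_clock::now();
    work();
    auto end_time = chrono::high_resolution_clock::now();
    auto duration = chrono::duration_cast<chrono::nanoseconds>(end_time - start_time).count();
    cout << duration << " ns" << endl;  // always prints a multiple of 1000
}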
Investigating further, I checked the clock's tick period with the following:
cout << (double) chrono::high_resolution_clock::period::num /
chrono::high_resolution_clock::period::den << endl;
I got a value of 1e-06
which is microseconds, not nanoseconds. The cast to nanoseconds works fine, but it's useless if the period of the clock is only microseconds to begin with.
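In case it helps narrow things down, here's the same check extended to system_clock, to see which clock is actually backing high_resolution_clock (a sketch; I believe GCC 4.6 still names steady_clock monotonic_clock, so I've left that one out):

#include <chrono>
#include <iostream>

using namespace std;

int main() {
    // Tick period of each clock, in seconds.
    cout << "high_resolution_clock: "
         << (double) chrono::high_resolution_clock::period::num /
            chrono::high_resolution_clock::period::den << endl;
    cout << "system_clock:          "
         << (double) chrono::system_clock::period::num /
            chrono::system_clock::period::den << endl;
}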
Am I being pedantic? I know I can run my test code multiple times and get a nice large average time to work with, and that is what I'm doing. But it's almost a matter of principle for me.
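Concretely, the averaging workaround looks roughly like this (a sketch; N is an arbitrary repeat count and work() is the same stand-in as above):

#include <chrono>
#include <iostream>

using namespace std;

static void work() {
    volatile long x = 0;
    for (int i = 0; i < 100000; ++i)
        x += i;
}

int main() {
    const int N = 1000;  // arbitrary repeat count
    auto start_time = chrono::high_resolution_clock::now();
    for (int i = 0; i < N; ++i)
        work();
    auto end_time = chrono::high_resolution_clock::now();
    // One long measurement divided by N, so the total is well above
    // the clock's effective (microsecond) resolution.
    auto total = chrono::duration_cast<chrono::nanoseconds>(end_time - start_time).count();
    cout << total / (double) N << " ns per call on average" << endl;
}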
Extra info: I'm using the latest version of GCC (4.6.3) on Ubuntu 12.04 Server x64 (I think).