I'm trying to measure the resolution of getrusage with a simple program:
#include <cstdio>
#include <sys/time.h>
#include <sys/resource.h>
#include <cassert>

int main() {
    struct rusage u = {};
    // Note: the syscall sits inside assert(), so this must not be built
    // with -DNDEBUG or the call disappears.
    assert(!getrusage(RUSAGE_SELF, &u));
    size_t cnt = 0;
    while (true) {
        ++cnt;
        struct rusage uz = {};
        assert(!getrusage(RUSAGE_SELF, &uz));
        // Spin until the reported user time changes, then print both
        // samples and how many iterations it took.
        if (u.ru_utime.tv_sec != uz.ru_utime.tv_sec || u.ru_utime.tv_usec != uz.ru_utime.tv_usec) {
            std::printf("u:%ld.%06ld\tuz:%ld.%06ld\tcnt:%zu\n",
                        u.ru_utime.tv_sec, u.ru_utime.tv_usec,
                        uz.ru_utime.tv_sec, uz.ru_utime.tv_usec,
                        cnt);
            break;
        }
    }
}
And when I run it, I usually get output similar to the following:
ema@scv:~/tmp/getrusage$ ./gt
u:0.000562 uz:0.000563 cnt:1
ema@scv:~/tmp/getrusage$ ./gt
u:0.000553 uz:0.000554 cnt:1
ema@scv:~/tmp/getrusage$ ./gt
u:0.000496 uz:0.000497 cnt:1
ema@scv:~/tmp/getrusage$ ./gt
u:0.000475 uz:0.000476 cnt:1
This seems to suggest that the resolution of getrusage is around 1 microsecond.
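For comparison, here is a minimal sketch (assuming Linux) that asks the kernel for the granularity of the per-process CPU-time clock via clock_getres. Note that this reports the clock's granularity, not necessarily how often the counters behind getrusage are updated:

#include <cstdio>
#include <ctime>

int main() {
    struct timespec res = {};
    // CLOCK_PROCESS_CPUTIME_ID is the per-process CPU-time clock;
    // clock_getres reports its resolution in seconds + nanoseconds.
    if (clock_getres(CLOCK_PROCESS_CPUTIME_ID, &res) == 0)
        std::printf("res: %ld.%09ld s\n", (long)res.tv_sec, res.tv_nsec);
}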
I thought it would be around 1 / CLK_TCK as reported by getconf
(i.e. 100 Hz, hence 10 milliseconds).
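That CLK_TCK value can also be read programmatically; a small sketch using sysconf, which returns the same number that getconf CLK_TCK prints:

#include <cstdio>
#include <unistd.h>

int main() {
    // Typically 100 on Linux, i.e. one tick every 10 ms.
    long hz = sysconf(_SC_CLK_TCK);
    std::printf("CLK_TCK: %ld (tick = %.1f ms)\n", hz, 1000.0 / hz);
}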
What is the true getrusage resolution?
Am I doing anything wrong?
P.S. Running this on Ubuntu 20.04 (Linux scv 5.13.0-52-generic #59~20.04.1-Ubuntu SMP Thu Jun 16 21:21:28 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux) on a Ryzen 9 5950X.