I am interested in accurately timing a C++ application. There seem to be multiple definitions of "time", but for the sake of this question I mean wall-clock time — the time I would count on my watch in the real world, if that makes any sense. Anyway, in my application, the timing is done like this:
clock_t start = clock();

// ... some statements ...

clock_t end = clock();

double duration = end - start;  // elapsed clock ticks, not seconds

cout << CLOCKS_PER_SEC << endl;
start is equal to 184000, end is equal to 188000, and CLOCKS_PER_SEC is equal to 1000000.
Does this mean the duration in seconds is equal to 4000 / 1000000, i.e. 0.004 seconds? Is there a more accurate way of measuring this?
Thank you