I am trying to measure how long a function takes.
I have a small issue: although I am trying to be precise and use floating point, every time I print the result with %lf I get one of only two answers: 1.000 or 0.000. This leads me to wonder whether my code is correct:
#define BILLION 1000000000L;
// Calculate time taken by a request
struct timespec requestStart, requestEnd;
clock_gettime(CLOCK_REALTIME, &requestStart);
function_call();
clock_gettime(CLOCK_REALTIME, &requestEnd);
// Calculate time it took
double accum = ( requestEnd.tv_sec - requestStart.tv_sec )
+ ( requestEnd.tv_nsec - requestStart.tv_nsec )
/ BILLION;
printf( "%lf\n", accum );
Most of this code is not mine; I adapted it from an example page illustrating the use of clock_gettime.
Could anyone please let me know what is incorrect, or why I am only getting integer values?