I'm attempting to monitor elapsed system time across multiple applications and data paths via gettimeofday() and localtime(). In this example, I want to grab the system time (with microsecond precision) right before my code runs. I'm currently doing it like this:

#include <stdio.h>
#include <sys/time.h>
#include <time.h>   /* for localtime() and struct tm */

struct timeval tv;
struct tm *tm;

/* ... code I don't care about ... */

gettimeofday(&tv, NULL);    /* second (timezone) argument is obsolete; pass NULL */
tm = localtime(&tv.tv_sec);

/* ... code to watch ... */

printf(" %d:%02d:%02d %ld \n", tm->tm_hour, tm->tm_min,
       tm->tm_sec, (long)tv.tv_usec);

From what I understand, the localtime() call incurs a bit of overhead and can throw off the measurement. I might be totally wrong on this, but should I wait to call localtime() until after the code I'm watching completes? I'm assuming that localtime() is just doing a semi-expensive conversion of gettimeofday()'s result, and thus it could be deferred to just before the printf statement.
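For concreteness, here's the reordering I have in mind (a minimal sketch; the placeholder comment stands in for the real code I'm measuring):

#include <stdio.h>
#include <sys/time.h>
#include <time.h>

struct timeval tv;
struct tm *tm;

gettimeofday(&tv, NULL);        /* capture only the raw timestamp first */

/* ... code to watch ... */

/* defer the conversion until after the timed region */
tm = localtime(&tv.tv_sec);
printf(" %d:%02d:%02d %ld \n", tm->tm_hour, tm->tm_min,
       tm->tm_sec, (long)tv.tv_usec);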

fbd39j
  • You don't need to call `localtime()` at all; `gettimeofday()` is sufficient to monitor elapsed time. – trojanfoe May 05 '11 at 16:09
  • Rather than timing the length of a specific code snippet, I want to accurately grab the system time with microsecond precision. Should I still need only gettimeofday()? – fbd39j May 05 '11 at 16:12
  • Yeah I don't see what the call to `localtime()` adds at all. – trojanfoe May 05 '11 at 18:17

1 Answer

If you really need microsecond accuracy, yes. I'd be very surprised if localtime executed in less than a microsecond. But I'd also be surprised if gettimeofday had microsecond resolution, and even if the underlying timer did (highly unlikely), the context switch on returning from the system call will probably take well over a microsecond (and could take longer than the call to localtime). The fact is that, except when accessing special hardware directly, you can't get anywhere near microsecond resolution.
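To illustrate what the comments suggest, here is a minimal sketch that measures elapsed time with gettimeofday() alone, keeping any localtime() conversion out of the timed region entirely (bearing in mind that the clock's actual granularity may be much coarser than a microsecond):

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval start, end;

    gettimeofday(&start, NULL);     /* raw timestamp before */

    /* ... code to watch ... */

    gettimeofday(&end, NULL);       /* raw timestamp after */

    /* elapsed microseconds; no localtime() needed for an interval */
    long elapsed_us = (long)(end.tv_sec - start.tv_sec) * 1000000L
                    + (end.tv_usec - start.tv_usec);
    printf("elapsed: %ld us\n", elapsed_us);
    return 0;
}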

James Kanze
  • There's a difference between accuracy and resolution; `gettimeofday()` has a resolution of microseconds, but will take many milliseconds to execute, as you have stated. – trojanfoe May 05 '11 at 18:17
  • Maybe I should have used the word "granularity". There's a difference between the resolution of the system clock, and the potential resolution of the representation. By resolution, I meant the resolution of the system clock, which is (according to Posix) unspecified. (And I'd be surprised if `gettimeofday()` took more than a couple of hundred microseconds to execute, rather than milliseconds.) – James Kanze May 05 '11 at 19:59