
In my system I have a PC (Linux, in case it matters) which keeps its RTC in UTC, so its local time is timezone-dependent. In the PC code, I get UTC time as seconds since the epoch using

struct timespec tv;
clock_gettime(CLOCK_REALTIME, &tv);  // CLOCK_REALTIME counts seconds since the epoch (UTC)
double time = (tv.tv_nsec / 1000000000.0) + tv.tv_sec;
return time;

I also have a 3rd party network device which provides its time as seconds since the epoch as well, but it does so in localtime instead of UTC. This is a problem: when I print timestamps from the PC and this device in an interleaved log, the timestamps are off even though the two clocks show the same local time.

Let's assume that the timezone settings (UTC offset and daylight saving rules) are the same between the PC and this device. How would I take the seconds since epoch provided by the device (in localtime) and convert it to seconds since epoch in UTC? In other words, what is the programmatic way (in C) to apply the PC's timezone settings to a seconds-since-epoch value when that number is in localtime?

Here is my attempt at converting the 3rd party device's localtime-based seconds since epoch to UTC-based seconds since epoch.

#include <stdio.h>
#include <time.h>

int main(void)
{
  // The following epoch timestamps were converted to human time via https://www.epochconverter.com/
  time_t device_rawtime = 1568133906.065000; // if treated as GMT:       Tuesday, September 10, 2019 4:45:06.065 PM
  time_t pc_rawtime     = 1568151907.454432; // if treated as localtime: Tuesday, September 10, 2019 4:45:07.454 PM GMT-05:00 DST
  struct tm  ts; 
  char       buf[80];

  ts = *gmtime(&device_rawtime);  // break down the fields without applying any timezone
  strftime(buf, sizeof(buf), "%a %Y-%m-%d %H:%M:%S %Z", &ts);
  time_t converted = mktime(&ts); // reinterpret the broken-down fields as localtime
  printf("Device rawtime=%ld which is PC localtime %s ==> UTC based rawtime=%ld (pc was %ld)\n", device_rawtime, buf, converted, pc_rawtime);
  return 0;
}

The above does not work. It prints

Device rawtime=1568133906 which is PC localtime Tue 2019-09-10 16:45:06 GMT ==> UTC based rawtime=1568155506 (pc was 1568151907)

As you can see, the converted device timestamp does not equal PC timestamp. How should this be done?

Paul Grinberg
  • The difference (1568155506 - 1568151907) is quite exactly 1 hour. Perhaps daylight saving is not taken into account? – the busybee Sep 11 '19 at 07:18
  • @thebusybee - I agree that it's likely due to daylight savings. But the question is how should I be accounting for that? – Paul Grinberg Sep 11 '19 at 13:15
  • _I also have a 3rd party network device which provides its time also as seconds from epoch, but it does so using localtime instead of UTC time._ This statement doesn't make sense; the Epoch is 1970-01-01 00:00:00 +0000 (UTC) - for a certain point in time, there is only one number of seconds elapsed since the Epoch, and the time zone or localtime cannot change that number. What do you actually mean? At which point in time did the device report `1568133906.065000`? Is the device actually aware of a timezone? In what timezone is it located? How was the device's time set? – Armali Sep 12 '19 at 13:55
  • @Armali - Since this is a 3rd party device, I can only infer how it treats time. From what I can tell, it keeps its RTC in localtime. It does not have a notion of a timezone. When I use a 3rd party tool to set the time on this device, the time is sent to the device as broken-down (aka year, month, day, hour, min, sec) localtime from my machine and the device simply takes it and starts ticking its clock from whatever value it received. Then, if I ask the device for time, it gives me a number which only makes sense as a seconds since epoch, but clearly it's in localtime and not UTC. – Paul Grinberg Sep 12 '19 at 14:00
  • So, if I understand correctly, if you set the time by sending 2019, 9, 10, 16, 45, 6 and asked for the time shortly thereafter, the device would give a number just a little higher than 1568133906? If so, then I wouldn't say that the number is _in localtime_ - you just view the broken-down time you provide as localtime, but on no basis. Wouldn't all be easier if you think of the device as using UTC, and setting its time with UTC values? – Armali Sep 12 '19 at 15:36
  • @Armali - Your understanding of how time is set into and reported from this device is correct. However, I cannot set the device time with a broken-down UTC timestamp because whatever value the device gets it displays to the user of the device (outside of my PC) as localtime. In other words, users of this device expect the time displayed by the device to match their hypothetical watch and I want that to match the localtime on my PC – Paul Grinberg Sep 12 '19 at 16:06

1 Answer


_I agree that it's likely due to daylight savings. But the question is how should I be accounting for that?_

The relevant information is found in man mktime:

The value specified in the tm_isdst field informs mktime() whether or not daylight saving time (DST) is in effect for the time supplied in the tm structure: a positive value means DST is in effect; zero means that DST is not in effect; and a negative value means that mktime() should (use timezone information and system databases to) attempt to determine whether DST is in effect at the specified time.
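
To see the effect of each value, here is a small runnable sketch (an illustration, assuming the PC's timezone has standard time GMT-06:00 with DST at GMT-05:00, matching the question's timestamps; the helper demo() is mine, not from the question's code):

#include <stdio.h>
#include <time.h>

// 2019-09-10 16:45:06 as broken-down fields (tm_year counts from 1900,
// tm_mon from 0); no timezone is attached until mktime() is called.
static const struct tm base = { .tm_year = 119, .tm_mon = 8, .tm_mday = 10,
                                .tm_hour = 16, .tm_min = 45, .tm_sec = 6 };

static void demo(int isdst)
{
  struct tm ts = base;  // fresh copy each time, since mktime() normalizes its argument
  ts.tm_isdst = isdst;
  printf("tm_isdst=%2d -> %ld\n", isdst, (long)mktime(&ts));
}

int main(void)
{
  demo(0);   // caller asserts: standard time
  demo(1);   // caller asserts: daylight saving time
  demo(-1);  // mktime() consults the timezone database
  return 0;
}

In such a zone this should print 1568155506 for tm_isdst=0 and 1568151906 for the other two cases, since DST was in effect on that date.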

On return from gmtime(&device_rawtime), ts.tm_isdst is set to zero, since the UTC time produced by gmtime() never has daylight saving in effect. So when mktime(&ts) is called, it converts the time structure with the information that DST is not in effect: 16:45:06 is treated as standard time (GMT-06:00) rather than daylight time (GMT-05:00), yielding 22:45:06 UTC instead of 21:45:06 UTC, i.e. a converted value 3600 seconds too high. To correctly account for DST, setting

ts.tm_isdst = -1

before calling mktime(&ts) is sufficient.
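
Putting it together, here is a minimal sketch of the corrected conversion (the helper name local_epoch_to_utc_epoch is illustrative, not from the question; the timestamps are the ones from the question):

#include <stdio.h>
#include <time.h>

// Reinterpret a "seconds since epoch" number whose broken-down fields actually
// encode local wall-clock time, returning a true UTC-based epoch value.
time_t local_epoch_to_utc_epoch(time_t device_rawtime)
{
  struct tm ts = *gmtime(&device_rawtime); // break down without applying any timezone
  ts.tm_isdst = -1;                        // let mktime() determine DST from the TZ database
  return mktime(&ts);                      // reinterpret the fields as localtime
}

int main(void)
{
  time_t device_rawtime = 1568133906; // device's localtime-based value
  time_t pc_rawtime     = 1568151907; // PC's UTC-based value
  printf("converted=%ld (pc was %ld)\n",
         (long)local_epoch_to_utc_epoch(device_rawtime), (long)pc_rawtime);
  return 0;
}

With the PC's timezone matching the question (GMT-05:00 with DST in effect), this should print converted=1568151906, which lines up with the PC timestamp taken about a second later.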

Armali