I wrote a small program to show local time relative to GMT (or UTC):
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    struct tm l;
    time_t stamp = 1534435540;
    const char *tz = getenv("TZ");

    // No TZ set
    printf("TZ=%s\n", tz ? tz : "<null>"); // getenv() returns NULL if TZ is unset
    gmtime_r(&stamp, &l);
    printf("UTC: %d:%d:%d - %d.%d.%d\n", l.tm_hour, l.tm_min, l.tm_sec, l.tm_mday, l.tm_mon+1, l.tm_year+1900);
    localtime_r(&stamp, &l);
    printf("Local: %d:%d:%d - %d.%d.%d\n\n", l.tm_hour, l.tm_min, l.tm_sec, l.tm_mday, l.tm_mon+1, l.tm_year+1900);

    // Positive TZ: east of Greenwich (e.g. China)
    setenv("TZ", "UTC+6:00", 1);
    printf("TZ=%s\n", getenv("TZ"));
    tzset();
    gmtime_r(&stamp, &l);
    printf("UTC: %d:%d:%d - %d.%d.%d\n", l.tm_hour, l.tm_min, l.tm_sec, l.tm_mday, l.tm_mon+1, l.tm_year+1900);
    localtime_r(&stamp, &l);
    printf("Local: %d:%d:%d - %d.%d.%d\n\n", l.tm_hour, l.tm_min, l.tm_sec, l.tm_mday, l.tm_mon+1, l.tm_year+1900);

    // Negative TZ: west of Greenwich (e.g. US/Canada)
    setenv("TZ", "UTC-6:00", 1);
    printf("TZ=%s\n", getenv("TZ"));
    tzset();
    gmtime_r(&stamp, &l);
    printf("UTC: %d:%d:%d - %d.%d.%d\n", l.tm_hour, l.tm_min, l.tm_sec, l.tm_mday, l.tm_mon+1, l.tm_year+1900);
    localtime_r(&stamp, &l);
    printf("Local: %d:%d:%d - %d.%d.%d\n\n", l.tm_hour, l.tm_min, l.tm_sec, l.tm_mday, l.tm_mon+1, l.tm_year+1900);

    return 0;
}
The output of this program is the following:
TZ=<null>
UTC: 16:5:40 - 16.8.2018
Local: 16:5:40 - 16.8.2018
TZ=UTC+6:00
UTC: 16:5:40 - 16.8.2018
Local: 10:5:40 - 16.8.2018
TZ=UTC-6:00
UTC: 16:5:40 - 16.8.2018
Local: 22:5:40 - 16.8.2018
It looks strange, doesn't it? According to Wikipedia:
For example, if the time being described is one hour ahead of UTC (such as the time in Berlin during the winter), the zone designator would be "+01:00", "+0100", or simply "+01"
So, by that convention, GMT+6 means I add 6 hours to GMT to get local time.
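To make that reading concrete, here is a minimal sketch of the ISO interpretation done by hand; the 6-hour shift below is plain arithmetic on the stamp, not a TZ setting:

#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t stamp = 1534435540;         // UTC 16:05:40, 16 Aug 2018
    time_t shifted = stamp + 6 * 3600; // ISO "+06:00": local = UTC + 6h
    struct tm t;
    gmtime_r(&shifted, &t);
    // Prints "ISO +06:00 local: 22:5:40" -- not the 10:5:40 that
    // TZ=UTC+6:00 produced above
    printf("ISO +06:00 local: %d:%d:%d\n", t.tm_hour, t.tm_min, t.tm_sec);
    return 0;
}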
But this article about the TZ variable says the opposite:
This is positive if the local time zone is west of the Prime Meridian and negative if it is east.
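Indeed, inspecting the timezone global confirms that reading (a minimal sketch; this assumes glibc/XSI, where timezone holds the offset in seconds west of Greenwich, i.e. UTC minus local time):

#define _XOPEN_SOURCE 700 // for the `timezone` global
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    setenv("TZ", "UTC+6:00", 1);
    tzset();
    // A positive value means the zone lies WEST of UTC: local = UTC - 6h.
    // Prints "timezone = 21600 s (6 h west of UTC)".
    printf("timezone = %ld s (%ld h west of UTC)\n", timezone, timezone / 3600);
    return 0;
}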
Am I missing something, or does Linux behave opposite to the standard?