
So I just learned about leap seconds, and at first I thought "oh, well just use the Unix timestamp". Then I read that it's based on a day being a specific number of seconds, with leap seconds inserted to keep it aligned with the sun. Da heck?!

So I guess that's no good. Is there a time format that is strictly based on the number of seconds since midnight, January 1st, 1970, or some similar anchor point in time, that doesn't try to sync up with the erratic rotation of our planet?

EDIT: Motivation: having a time value that is accurate regardless of time of day or time zone and that can't possibly be ambiguous. Frankly, I'm surprised to learn this isn't the de facto standard; it sounds like whoever decided to add leap seconds didn't consider that they were unnecessarily introducing niche bugs into software.

Sophie McCarrell
    Yes, there is, but you might be better off explaining your motivation. – Josh Lee May 03 '19 at 14:24
  • Having time that is accurate regardless of time of day or timezone and that can't possibly be ambiguous. I'll edit my question with the motive. – Sophie McCarrell May 03 '19 at 16:36
  • Here is a nice Wikipedia article on the various time measurement standards: https://en.wikipedia.org/wiki/Time_standard – Howard Hinnant May 03 '19 at 16:59
    Also, time zones and leap seconds are different concepts. – Josh Lee May 03 '19 at 17:15
  • @JasonMcCarrell - That's not what we mean by motive/context. Rather - are you developing an app that will display a date and time of day to an end-user? Or are you measuring the start/stop times of some event to calculate duration? Or are you doing something specialized like scientific timing applications or astronomy? Or something else? – Matt Johnson-Pint May 03 '19 at 20:38
  • As far as the "frankly I'm surprised..." comment, understand that leap seconds were established in 1972 - well before this was an issue for everyday computing. [Wikipedia's article on leap seconds](https://en.wikipedia.org/wiki/Leap_second) has a great overview of the history, as well as proposals to eliminate them - where problems in computing are cited as one reason. – Matt Johnson-Pint May 03 '19 at 20:45

2 Answers


The closest ones to my knowledge are astronomy clocks such as


I feel like I should add this. These clocks are rarely useful for most daily needs. The reason is that human activity depends heavily on the sunrise-sunset cycle (even the Daylight Saving cycle, which has lived on long past its usefulness), and beyond that, the typical user thinks of a "point in time" in terms of "year, month, day, hour, ..." and not "where is this relative to a fixed point".

So there may be a better answer that fits your needs if you add some context to your question.

Binh Tran
  • Thanks for the resources Binh. Regardless of the storage method, they'd all be converted to a human-readable format at some point. The difference in storage would simply mean a difference of seconds in niche situations, in which case I imagine a time excluding leap seconds would be preferable, to remove inconsistencies (in when and how leap seconds are processed into the time) and ambiguities (like having the same unix timestamp refer to two different points in time). – Sophie McCarrell May 03 '19 at 16:40
  • Oh, wouldn't TAI be the most accurate measure of "number of seconds since January first midnight 1970"? https://en.wikipedia.org/wiki/International_Atomic_Time – Sophie McCarrell May 03 '19 at 17:03

You're asking about time standards. In this area, it's important to understand context about how the standard is intended to be used.

Taking your question directly, the time standard you're describing is International Atomic Time (TAI). It is the standard that all of the world's timekeeping institutions (NIST, NPL, etc.) use to coordinate their clocks. It is super useful for this purpose, as this is what it was designed for. It does not have leap seconds.

However, TAI is rarely used directly in computing or business. The only time standard that matters for most of us is Coordinated Universal Time (UTC). This is what your computer's system time tracks, and we typically synchronize to this standard using NTP. It does have leap seconds.
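
As a quick illustration, here's a minimal Python sketch (standard library only, nothing hypothetical) of what your everyday code is actually reading: a UTC-aligned count of seconds since the Unix epoch, with no leap second bookkeeping exposed:

```python
import time
from datetime import datetime, timezone

# Seconds since the Unix epoch per the system clock. UTC-based, with
# no accounting for leap seconds (see the Unix timestamp discussion below).
ts = time.time()

# The same instant as a human-readable UTC datetime.
now_utc = datetime.now(timezone.utc)
print(ts, now_utc.isoformat())
```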

The PTP protocol can be used to synchronize a computer's clock to either TAI or UTC. This works because PTP's timestamps are based on TAI, while the protocol separately carries the current offset between UTC and TAI.
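
To make that offset concrete: TAI has been ahead of UTC by 37 seconds since the leap second at the end of 2016. A minimal sketch, assuming that fixed offset (real code would consult a leap second table, since the constant below is only valid from 2017-01-01 until the next leap second):

```python
from datetime import datetime, timedelta, timezone

# TAI - UTC = 37 s, valid from 2017-01-01 until the next leap second.
TAI_UTC_OFFSET = timedelta(seconds=37)

def utc_to_tai(utc_dt: datetime) -> datetime:
    """Shift a post-2017 UTC datetime onto the TAI scale.
    (The result still carries a UTC tzinfo label; Python's datetime
    has no notion of TAI, so read it as a TAI reading by convention.)"""
    return utc_dt + TAI_UTC_OFFSET

utc = datetime(2019, 5, 3, 14, 24, tzinfo=timezone.utc)
print(utc_to_tai(utc))  # 2019-05-03 14:24:37+00:00, the TAI reading
```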

Most systems don't display leap seconds. Instead, they are typically absorbed during synchronization. In other words, after a leap second occurs, a system might deviate from UTC by one second until its next synchronization. However, this behavior can vary across systems and platforms. Some may choose to display it directly. Some may "smear" out the leap second across the day.
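
For example, Google has publicly described smearing the leap second linearly over the 24 hours centered on it. The sketch below models only the offset calculation; the function name and window placement are my own, for illustration:

```python
SMEAR_WINDOW = 86400.0  # seconds: the 24 hours centered on the leap second

def smeared_offset(elapsed: float) -> float:
    """Fraction of the extra second a smeared clock has absorbed,
    `elapsed` seconds into the smear window (0.0 at start, 1.0 at end)."""
    return min(max(elapsed / SMEAR_WINDOW, 0.0), 1.0)

# Halfway through the window (at the leap second itself), a smeared
# clock reads 0.5 s behind an unsmeared UTC clock.
print(smeared_offset(43200.0))  # 0.5
print(smeared_offset(86400.0))  # 1.0: the whole second has been absorbed
```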

Unix Timestamps are interesting in that they both do and do not have leap seconds, depending on how you interpret the problem. They do have leap seconds in that the timestamp is aligned to UTC, so any timestamp you interpret is inclusive of all the leap seconds that occurred to date. But they don't have leap seconds in that there is no accounting for them in the calculation. The calculation is an exact number of seconds since the Unix epoch (1970-01-01T00:00:00Z). In other words, the Unix timestamp of a leap second itself (such as 2016-12-31T23:59:60Z) is ambiguous.
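
You can see the ambiguity fall out of the POSIX formula itself, which allots exactly 86,400 seconds to every day. A small sketch (the day count for 2016-12-31 is hand-computed):

```python
def posix_timestamp(days_since_epoch: int, h: int, m: int, s: int) -> int:
    # POSIX allots exactly 86,400 seconds to every day, leap second or not.
    return days_since_epoch * 86400 + h * 3600 + m * 60 + s

DAYS = 17166  # 2016-12-31 is 17,166 days after 1970-01-01

print(posix_timestamp(DAYS, 23, 59, 59))   # 1483228799
print(posix_timestamp(DAYS, 23, 59, 60))   # 1483228800: the leap second...
print(posix_timestamp(DAYS + 1, 0, 0, 0))  # 1483228800: ...collides with midnight
```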

Ultimately, in day-to-day computing, you probably shouldn't choose a standard other than UTC or be concerned about ambiguity of Unix Timestamp leap seconds unless you have very specific use cases where leap seconds become an issue.

Matt Johnson-Pint