The C standard (from which C++ inherits the definition of `time_t`) says only that `time_t` is an arithmetic type capable of representing times. It says nothing about how it does so. In principle, an implementation could define a `time_t` value as the number of seconds until some future date, or it could encode month, day, year, hours, minutes, and seconds in that order. Comparing `time_t` values is guaranteed to behave consistently (`t0 < t1 && t1 < t2` implies `t0 < t2`, for example), but the result of a comparison doesn't necessarily say anything about the order of the times that are represented.
To be 100% portable, you can check the result of `difftime()`.
But if you're willing to settle for about 99% portability, you should be safe in assuming that comparisons work as expected (that `t0 < t1` implies `t0` precedes `t1` in real time). I've never heard of an implementation where that doesn't work. If it does fail, you can complain to the implementers, but you can't say they've failed to conform to the standard.