The Boost Asio basic_waitable_timer class (also included in the C++ Networking TS) takes a template parameter WaitTraits, which allows customizing how duration and time_point values are converted into wait durations.
Specifically, a WaitTraits class is expected to provide two overloads of to_wait_duration. The overload that takes a time_point is expected to convert it into a duration that can sensibly be passed to an OS wait/sleep function.
The default implementation, wait_traits<Clock>, can be found here:
https://github.com/boostorg/asio/blob/develop/include/boost/asio/wait_traits.hpp
However, looking at the actual source of the default implementation, I can't make sense of what to_wait_duration is actually doing. Here is the code:
static typename Clock::duration to_wait_duration(
    const typename Clock::time_point& t)
{
  typename Clock::time_point now = Clock::now();
  if (now + (Clock::duration::max)() < t)
    return (Clock::duration::max)();
  if (now + (Clock::duration::min)() > t)
    return (Clock::duration::min)();
  return t - now;
}
I get that the idea here is to take a time_point and convert it to a duration that will likely be passed to some OS wait function. The obvious way to do that would be to just return t - Clock::now(). Of course, you might also want to guard against things like overflow.
But what this implementation seems to do makes no sense to me. Firstly, wouldn't the expression now + Clock::duration::max() always cause a signed integer overflow, and therefore undefined behavior? (Assuming that Clock::duration::rep is a signed integral type, which it is for the standard clocks.) Secondly, I can't tell what this function is trying to achieve. Why does it ever return Clock::duration::min(), which would usually be a negative value?
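To make the overflow concern concrete, here is a small standalone sketch (not Boost code; it assumes steady_clock's usual signed tick count and simply prints how little headroom is left below duration::max()):

#include <chrono>
#include <iostream>

int main()
{
  using Clock = std::chrono::steady_clock;

  const auto now   = Clock::now();
  const auto count = now.time_since_epoch().count();           // signed, normally positive
  const auto room  = (Clock::duration::max)().count() - count; // headroom left below max()

  // Whenever count > 0, now + Clock::duration::max() cannot be represented
  // in the duration's rep, which is the overflow being asked about.
  std::cout << "ticks since epoch: " << count << '\n'
            << "headroom below duration::max(): " << room << '\n';
}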
I would have thought that a to_wait_duration function would simply check whether t > Clock::now(), and if not, return Clock::duration::zero() (since a negative wait duration never makes sense). Basically, something like this:
static typename Clock::duration to_wait_duration(const typename Clock::time_point& t)
{
  const typename Clock::time_point now = Clock::now();
  return (t > now) ? (t - now) : Clock::duration::zero();
}
But what the actual Boost source code does is very confusing to me. What does Boost's default wait_traits implementation actually do?