Pointers in C++ do not have a defined total ordering unless they fall within a narrow set of criteria, such as all pointing into the same object or array ([expr.rel]/4, [defns.order.ptr]).
In fact, to even get a basic ordering guarantee for pointers to different objects, you can't use `operator<`; you must instead use `std::less` -- otherwise the result is unspecified.
From [comparisons.general]/2:

> For templates `less`, `greater`, `less_equal`, and `greater_equal`, the specializations for any pointer type yield a result consistent with the implementation-defined strict total order over pointers ([defns.order.ptr]).
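To make the distinction concrete, here is a minimal sketch (the variable names are my own) contrasting the comparisons:

```cpp
#include <functional>

int arr[2] = {};
int a = 0;
int b = 0;

int main()
{
    // Well-defined: both pointers point into the same array
    bool same_array = (&arr[0] < &arr[1]);

    // Unspecified result: 'a' and 'b' are unrelated complete objects
    bool unrelated = (&a < &b);

    // Well-defined: std::less must be consistent with the
    // implementation-defined strict total order over pointers
    bool total = std::less<const int*>{}(&a, &b);

    (void) same_array;
    (void) unrelated;
    (void) total;
}
```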
However, in C++ we also have the (optional) type `std::uintptr_t` -- which is defined to be an unsigned integer type large enough to store a pointer, with the property that it's capable of surviving a round-trip from `void*` to `std::uintptr_t` and back to `void*` without any loss of data.
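For illustration, a minimal sketch of that round-trip guarantee:

```cpp
#include <cassert>
#include <cstdint>

int main()
{
    int object = 42;
    void* before = &object;

    // Round-trip: pointer -> integer -> pointer. The standard guarantees
    // the resulting pointer compares equal to the original.
    auto integral = reinterpret_cast<std::uintptr_t>(before);
    auto* after   = reinterpret_cast<void*>(integral);

    assert(before == after);
}
```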
Unsigned integer types also have a well-defined total ordering -- which leads me to my question.
Is it at all reasonable to expect that `std::uintptr_t` values obtained from different sources have a well-defined ordering, within the definition of the C++ abstract machine?
I'm not asking whether this would work in practice, but whether it is even feasible to assume that this is well-defined behavior (my assumption is that it is not).
As a concrete example of a potential application of this, I am interested in whether something like the following is formally well-defined behavior:
```cpp
#include <cstdint>

template <typename T>
auto my_typeid() -> std::uintptr_t
{
    // Each 's_data' has a unique address, since it's static and part of
    // each unique template instantiation
    static const char s_data = 0;

    // Use this address for an ordering system, and for identity
    return reinterpret_cast<std::uintptr_t>(&s_data);
}
```
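For context, the intended use would be something like the following (hypothetical) sketch, where the returned identifiers act as ordered keys -- which is only meaningful if their ordering is well-defined:

```cpp
#include <cstdint>
#include <map>
#include <string>

// my_typeid as defined above
template <typename T>
auto my_typeid() -> std::uintptr_t
{
    static const char s_data = 0;
    return reinterpret_cast<std::uintptr_t>(&s_data);
}

int main()
{
    // The identifiers serve as keys in an ordered container
    std::map<std::uintptr_t, std::string> names;
    names[my_typeid<int>()]    = "int";
    names[my_typeid<double>()] = "double";
}
```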