tl;dr: `std::common_type_t<int&, int&>` is `int`. What can we do to get `int&` instead?

The reason for the decaying behavior is explained in this question. The problem with such an implementation is that it strips away perfectly legitimate references (that is, ones explicitly passed in by the user).

Indeed, we could write another metafunction that walks through the source types, figures out their constness and "referenceness", and modifies the result of `std::common_type` accordingly, but that just seems ugly. Can we do better?


For the curious ones, the motivation: I'm writing (yet another) visitor wrapper taking a set of lambdas and producing something that works with boost::variant (or std::variant for that matter). I want it to satisfy two requirements, among others:

  1. Allow different lambdas to return different but compatible types (like `Foo*` and `Bar*` for `struct Bar : Foo`).

  2. Allow returning non-const lvalue references from the lambdas.

To satisfy the former, I need to figure out the common type of the return values. To satisfy the latter, I need the common type to be a reference if all the original return types are references.

  • The link you reference at the top shows an implementation of `std::common_type` based on applying `decltype` to a ternary operator. That seems pretty easy to do and if it works, it avoids the `decay`. The article makes it sound like `common_type` does the decay solely for implementation ease. I'm going to guess that there are issues with trying to get this right for references in all situations so the intent is for `common_type` to only be used in situations where a non-reference type is the right answer. Given this, it might be better to write something that gets your use case exactly right. – Zalman Stern Dec 09 '17 at 22:01
  • I wonder if `int` for two `int&`s is ever the right answer. And, moreover, it's quite easy to get an `int` out of `int&` if the latter was returned by `common_type_t`, but, once the information is lost, the inverse is way harder. – 0xd34df00d Dec 09 '17 at 22:06
  • I don't have time to track down the whole history, but as a matter of your own productivity, I'd abandon the idea that the standard is wrong and there must be some way to fix that easily. I did give a pointer to one implementation approach that may be easy for you to implement. Part of the reason I suggest having your own implementation is I expect you're going to run into issues with exactly what you want the type inference to do as you go down this path. – Zalman Stern Dec 09 '17 at 22:20
  • If I understood your reply correctly, you're suggesting using something along the lines of `using T = decltype(true ? std::declval<T1>() : std::declval<T2>());`. The problem is that it adds an unnecessary rvalue ref (a quick [test](https://wandbox.org/permlink/UglKU9OglTPVz7ue) indeed shows that). – 0xd34df00d Dec 09 '17 at 22:27
  • @0xd34df00d you can also use your own version of `std::declval` which would not add references. – Quentin Dec 26 '18 at 16:20

0 Answers