I'm hoping that your colleague has written it for another reason, namely that it stops implicit conversions being made at the call site (strictly, only a non-const reference does that; a const reference will still happily bind to the temporary that an implicit conversion produces). It also prevents the parameter from being modified in the function body, which can help program stability, but that can equally be achieved by passing by const value, as the sketch below shows.
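Here's a minimal sketch (hypothetical function names) contrasting the two signatures; both keep the body from modifying the argument:

```cpp
#include <iostream>

// By const reference: no copy, but the argument may need to be
// materialised in memory so a reference can bind to it.
long double square_ref(const long double& x) { return x * x; }

// By const value: copying a scalar is trivial, and the parameter
// still cannot be modified inside the function body.
long double square_val(const long double x) { return x * x; }

int main() {
    std::cout << square_ref(3.0L) << ' ' << square_val(3.0L) << '\n';
}
```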
Overload resolution can also be a hazard here: a call that compiles happily today can silently bind to a different function, or become ambiguous, when a new overload is introduced later.
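For instance (hypothetical names, a sketch of the hazard rather than anyone's real code):

```cpp
#include <iostream>

// The original function: any arithmetic argument converts to long double.
void scale(const long double& x) { std::cout << "long double: " << x << '\n'; }

// A well-meaning colleague later adds:
void scale(int n) { std::cout << "int: " << n << '\n'; }

int main() {
    scale(3);      // previously converted to long double; now an exact match
                   // for the int overload, so the behaviour silently changes
    // scale(2.5); // no longer compiles: double -> long double and double -> int
                   // are both standard conversions, so the call is ambiguous
}
```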
If it's there as a micro-optimisation strategy then that's a silly thing to do: if you can micro-optimise, then so can the compiler. And writing `typedef long double Real;` is sillier still, as (i) you've exchanged a standard term for a proprietary one and (ii) `Real` implies there are no gaps in the representable numbers, which we know is not the case. If a compiler with optimisations turned up to maximum produces code that runs faster when the `long double`s are passed by reference, then consider switching your toolchain.
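Point (ii) is easy to demonstrate: adjacent `long double` values are separated by a measurable gap, so the type does not model the mathematical reals. A minimal illustration:

```cpp
#include <cmath>
#include <iostream>
#include <limits>

int main() {
    // The smallest representable long double strictly greater than 1.0:
    long double next = std::nextafter(1.0L, 2.0L);

    // The gap just above 1.0 is exactly the machine epsilon for the type.
    std::cout << "gap above 1.0: " << (next - 1.0L) << '\n';
    std::cout << "epsilon:       " << std::numeric_limits<long double>::epsilon() << '\n';
}
```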
(I have worked with a 128-bit `long double` and always pass it by value.)
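If you're curious what your own platform gives you, a quick check (note that `sizeof` reports storage, which may include padding beyond the bits actually used for precision, e.g. the 80-bit x87 format padded out to 16 bytes):

```cpp
#include <iostream>
#include <limits>

int main() {
    std::cout << "sizeof(long double): " << sizeof(long double) << " bytes\n";
    std::cout << "mantissa digits:     " << std::numeric_limits<long double>::digits << " bits\n";
}
```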