template<typename T, int V>
class AwesomeClass // [1]
{
public:
    AwesomeClass(T baseValue)
        : m_value(baseValue + V)
    {
    }

    T get() const
    {
        return m_value;
    }

private:
    T m_value;
};

#ifdef USE_ALIAS_TEMPLATES // [2]

template<typename T>
using AwesomeClassWithOne = AwesomeClass<T, 1>;

template<typename T>
using AwesomeClassWithTwo = AwesomeClass<T, 2>;

#else // [4]

template<typename T>
class AwesomeClassWithOne : public AwesomeClass<T, 1>
{
    using AwesomeClass<T, 1>::AwesomeClass;
};

template<typename T>
class AwesomeClassWithTwo : public AwesomeClass<T, 2>
{
    using AwesomeClass<T, 2>::AwesomeClass;
};

AwesomeClassWithOne(int) -> AwesomeClassWithOne<int>;
AwesomeClassWithTwo(int) -> AwesomeClassWithTwo<int>;

#endif

int main()
{
    AwesomeClassWithOne firstObj(20); // [3]
    AwesomeClassWithTwo secondObj(20);
    return firstObj.get() + secondObj.get();
}
I have a class, AwesomeClass [1], which takes two template parameters, typename T and int V. V is an internal detail that I don't want to expose to users of the class, so instead I want to provide aliases such as AwesomeClassWithOne and AwesomeClassWithTwo, which take only T as a template parameter and have V already bound to some value (1 and 2 respectively in my example).
Alias templates seemed to be appropriate for this [2]. However, as of C++17, class template argument deduction cannot be used with alias templates, so I can't write [3].
So I came up with an alternative [4]: I create a new derived class for each "alias" type I want, inherit all the constructors from the base class, and then add deduction guides so that deduction works again (base-class constructors don't seem to produce implicit deduction guides the way ordinary constructors do; related question).
Does this seem like an appropriate workaround? Can I expect to see any weird side-effects compared to the alias template solution?