When you say enum num : char, you express the fact that num is internally implemented in terms of a char, but its values can still be implicitly converted to an integer type, which is not necessarily char.
As the page you cite says:
Values of unscoped enumeration type are implicitly-convertible to
integral types.
See Why does a value of an enum with a fixed underlying type of char resolve to fct(int) instead of fct(char)? for an interesting discussion on problems in the C++ standard's wording regarding the combination of integral promotion and fixed underlying types.
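For example, here is a minimal sketch of the behavior (the enumerator value '2' is only assumed for illustration):

#include <iostream>

enum num : char { two = '2' };

int main()
{
    num n = two;
    std::cout << n; // typically prints 50 (the value promoted to an integer), not the character '2'
}

Mainstream compilers resolve this call to the int overload of operator<<, which is exactly the wording issue the linked question discusses.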
In any case, you can imagine this whole thing as a class with a private char member variable and a public int conversion operator:
#include <iostream>

// very similar to your enum:
class num {
private:
    char c;
public:
    num(char c) : c(c) {}
    operator int() const {
        return c;
    }
};

int main()
{
    num two { '2' };
    std::cout << two; // prints 50 (the integer value of '2')
}
To increase type safety and make the std::cout line a compilation error, just turn the enum into an enum class:
enum class num : char
This is again similar to the imagined class num above, but without the conversion operator.
When you feed an instance of num to std::cout, you are a client of num, and as a client you should not assume that the output format takes the internal char implementation into account.
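If a client really does want the underlying character, it has to say so explicitly. A small sketch (static_cast works in any standard; std::to_underlying requires C++23):

#include <iostream>
#include <utility> // std::to_underlying (C++23)

enum class num : char { two = '2' };

int main()
{
    num n = num::two;
    std::cout << static_cast<char>(n) << '\n';  // prints 2 (the character)
    std::cout << std::to_underlying(n) << '\n'; // also prints 2, since the underlying type is char
}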
To get more control over the output format, do what you would do for any other custom type and overload operator<< for std::ostream. Example:
#include <iostream>

enum class num : char {
    zero = '0',
    one = '1',
    two = '2',
    three = '3',
    four = '4',
    five = '5',
    six = '6'
};

std::ostream& operator<<(std::ostream& os, num const& n)
{
    switch (n)
    {
        case num::zero: os << "Zero"; break;
        case num::one: os << "One"; break;
        case num::two: os << "Two"; break;
        case num::three: os << "Three"; break;
        // and so on
    }
    return os;
}

int main()
{
    std::cout << num::two; // prints "Two"
}
Of course, the specific char values of the enumerators have now become pretty useless, so you may as well get rid of them completely:
enum class num : char {
    zero,
    one,
    two,
    three,
    four,
    five,
    six
};
This may strike you as strange, but keep in mind that an enum which represents nothing but generic numbers from zero to six is not a realistic use case.