In Visual Studio C++ I have defined a series of channelID constants with decimal values from 0 to 15. I have made them of type uint8_t for reasons having to do with the way they are used in the embedded context in which this code runs.
When hovering over one of these constants, I would like IntelliSense to show me the numeric value of the constant. Instead, it shows me a character representation: for some non-printing values it shows an escape sequence for the corresponding ASCII character, and for others it shows an escaped octal value.
const uint8_t channelID_Observations = 1; // '\001'
const uint8_t channelID_Events = 2; // '\002'
const uint8_t channelID_Wav = 3; // '\003'
const uint8_t channelID_FFT = 4; // '\004'
const uint8_t channelID_Details = 5; // '\005'
const uint8_t channelID_DebugData = 6; // '\006'
const uint8_t channelID_Plethysmography = 7; // '\a'
const uint8_t channelID_Oximetry = 8; // '\b'
const uint8_t channelID_Position = 9; // ' ' ** this is displayed as a space between single quotes
const uint8_t channelID_Excursion = 10; // '\n'
const uint8_t channelID_Motion = 11; // '\v'
const uint8_t channelID_Env = 12; // '\f'
const uint8_t channelID_Cmd = 13; // '\r'
const uint8_t channelID_AudioSnore = 14; // '\016'
const uint8_t channelID_AccelSnore = 15; // '\017'
Some of the escape codes are easy to recognize, and their hex or decimal equivalents are easy to remember ('\n' == newline == 0x0A), but others are more obscure. For example, decimal 7 is shown as '\a', which in some systems represents the ASCII BEL character.
Some of the representations are mystifying to me. For example, decimal 9 is an ASCII tab, which today is usually written as '\t', yet IntelliSense shows it as what looks like a space character between single quotes.
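For reference, this is my working list of the named ("simple") escape sequences that C and C++ define, with their decimal values on an ASCII-based execution character set (which is what Visual Studio uses). Anything without a named escape presumably falls back to an octal form such as '\016'. The static_assert is only a compile-time sanity check:

// Named escape sequences and their decimal values (ASCII):
//   '\a' == 7    alert / BEL
//   '\b' == 8    backspace
//   '\t' == 9    horizontal tab
//   '\n' == 10   line feed
//   '\v' == 11   vertical tab
//   '\f' == 12   form feed
//   '\r' == 13   carriage return
// Values with no named escape (1..6, 14, 15, ...) appear as octal escapes.
static_assert('\a' == 7 && '\t' == 9 && '\r' == 13,
              "named escapes map to the expected ASCII codes");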
Why is an 8-bit unsigned integer always treated as a character, no matter how I try to define it as a numeric value?
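My working assumption (possibly part of the answer) is that on the toolchains I use, uint8_t is not a distinct type but an alias for unsigned char, so anything that formats an unsigned char as a character will do the same for uint8_t. A minimal compile-time check of that assumption:

#include <cstdint>
#include <type_traits>

// If this compiles, uint8_t really is just unsigned char on this toolchain,
// which would explain why tooling renders it as a character.
static_assert(std::is_same<std::uint8_t, unsigned char>::value,
              "uint8_t is an alias for unsigned char here");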
Why are only some of these values shown as named escape sequences for their ASCII equivalents, while others get an octal representation?
What is the origin of the more obscure symbols? For example, '\a' for decimal 7 matches the ISO-defined C0 control set, which has a Unicode representation (see the Wikipedia article on C0 control codes), but by that logic '\t' should be shown for decimal 9.
Is there any way to make IntelliSense hover tips show me the numeric value of such constants rather than a character representation? A decoration? VS settings? Typedefs? #defines?
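For completeness, one idea I have considered but not yet verified against the hover behavior: keep the values 8 bits wide but give them a distinct (non-char) type via an enumeration with an explicit uint8_t underlying type. I do not know whether IntelliSense renders such enumerators numerically, so this is only a sketch:

#include <cstdint>

// Sketch only: same 1-byte storage, but a distinct type rather than a
// plain unsigned char. Whether the hover tip changes is exactly what
// I am asking about.
enum ChannelID : std::uint8_t {
    channelID_Observations = 1,
    channelID_Events       = 2,
    // ... remaining channel IDs ...
    channelID_AccelSnore   = 15,
};

static_assert(sizeof(ChannelID) == 1, "still one byte on the wire");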