
I want to know whether the standard guarantees that a variable of type `uint16_t`, when printed with `std::cout <<`, comes out as a number rather than a character. I read on some tutorial websites that printing `uint8_t` or `int8_t` is not guaranteed to produce either a character or a number, yet I can't find any clause in the standard to quote. Do I always need a cast to be sure of getting the desired output for all of these types?
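For example (a minimal sketch; the commented results are what I see on a typical implementation, which is exactly what I can't find a guarantee for):

```cpp
#include <cstdint>
#include <iostream>

int main() {
    std::uint8_t  a = 65;
    std::uint16_t b = 65;
    std::cout << a << '\n';  // prints 'A' here: uint8_t picks the character overload
    std::cout << b << '\n';  // prints 65 here, but is that guaranteed?
    std::cout << +a << '\n'; // unary + promotes to int, so this always prints 65
}
```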

  • I don't believe there's a guarantee. `int8_t` is very likely to be a typedef for `char` or `signed char` on most implementations. `uint16_t` is likely an `unsigned short` on most implementations - but the standard doesn't prohibit an implementation where `char` is 16 bits large. – Igor Tandetnik Feb 24 '19 at 13:49
  • @IgorTandetnik: "*the standard doesn't prohibit an implementation where char is 16 bits large.*" Actually, it does. The `int*_t` types are required to be *exactly* that size. And `char` is required to be the *smallest* possible type; a "byte" in C++ is defined such that `sizeof(char)` shall be 1. Therefore, it's impossible to implement `int16_t` as a `char`, since that would leave no smaller type for `int8_t`. – Nicol Bolas Feb 24 '19 at 14:30
  • @NicolBolas The implementation is not required to provide a type for every width. "**[cstdint.syn]/2** The header defines all types and macros the same as the C standard library header `<stdint.h>`." The C standard: "**7.20/4** ... Conversely, for each type described herein that the implementation does not provide, `<stdint.h>` shall not declare that typedef name..." – Igor Tandetnik Feb 24 '19 at 14:44
  • @NicolBolas only `int_leastN_t` and `int_fastN_t` are required. [Fixed-width types will not be available if the implementation can't provide them](https://stackoverflow.com/a/5254157/995714). If `CHAR_BIT > 8` on some platform then obviously `int8_t` won't exist there. – phuclv Sep 04 '19 at 14:43
  • @phuclv: I know they're not required, but if you're going to ask whether `uint16_t` is a typedef for an `unsigned char`, then *by definition* you're assuming that `uint16_t` is a thing your implementation provides. Otherwise the question is meaningless. – Nicol Bolas Sep 04 '19 at 15:12
  • @NicolBolas I'm talking about what you said, *Therefore, it's impossible to implement `int16_t` as a `char`, since that would leave no smaller type for `int8_t`*, which is wrong. If `CHAR_BIT == 16` (which is allowed by the standard) then `int8_t` doesn't exist and `uint16_t` can be typedef'ed as `char`, in which case `std::cout << uint16_t(x)` can print out a character (a compile-time check of this possibility is sketched after these comments). – phuclv Sep 04 '19 at 15:18
  • @phuclv: "*I'm taking about what you said*" Yes, but Igor already quoted the standard showing that I was incorrect. – Nicol Bolas Sep 04 '19 at 15:20
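A compile-time check of the scenario debated above (a sketch using C++17 `std::is_same_v`; it simply asserts what holds on whatever implementation compiles it):

```cpp
#include <cstdint>
#include <type_traits>

// Fails to compile on a (hypothetical) implementation where uint16_t is a
// typedef for a character type, as in the CHAR_BIT == 16 scenario above.
static_assert(!std::is_same_v<std::uint16_t, char> &&
              !std::is_same_v<std::uint16_t, signed char> &&
              !std::is_same_v<std::uint16_t, unsigned char>,
              "uint16_t is a character type here: operator<< would print a character");
```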

1 Answer

If `uint8_t` is defined by the implementation, then `uint16_t` cannot be a `char`. The reason is that `char` is defined to be the smallest addressable unit: `sizeof(char)` is explicitly 1.

Since these optional sized types are required to be exactly the specified width, `sizeof(uint16_t)` must be greater than `sizeof(uint8_t)`: if `uint8_t` exists, a byte is 8 bits (`CHAR_BIT == 8`), so an exact 16-bit type occupies two bytes. And since `sizeof(uint8_t)` cannot be smaller than `sizeof(char)`, `sizeof(uint16_t)` can't be 1. So it can't be a `char`.
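A minimal compile-time sketch of that argument (it only compiles on an implementation that actually provides the exact-width types):

```cpp
#include <climits>
#include <cstdint>

// uint8_t is exactly 8 bits with no padding and occupies at least one byte,
// so its existence forces CHAR_BIT == 8; an exact 16-bit type then takes
// two bytes and cannot be char (whose size is always 1).
static_assert(sizeof(std::uint8_t) == 1, "uint8_t is one byte by definition");
static_assert(CHAR_BIT == 8, "uint8_t can only exist when a byte is 8 bits");
static_assert(sizeof(std::uint16_t) == 2, "an exact 16-bit type is two 8-bit bytes");
```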

Of course, if `uint8_t` is not defined but `uint16_t` is defined, then all bets are off. But if your code already assumes that both `uint8_t` and `uint16_t` exist, then you can likewise assume that `uint16_t` isn't a `char`.

Nicol Bolas