Windows defines the wchar_t type to be 16 bits wide. However, in the UTF-16 encoding that Windows uses, characters outside the Basic Multilingual Plane are encoded as a surrogate pair, i.e. two 16-bit code units (4 bytes, or 32 bits). Does this mean that if I'm developing an application for Windows, the following statement:

wchar_t symbol = ... // Whatever

might hold only part of the actual character?
And what will happen if I do the same under *nix, where wchar_t is 32 bits wide?
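To make the question concrete, here is a small sketch of what I mean. It assumes a C++11-or-later compiler (for the \U escape) and uses U+1F600, a character outside the BMP, as the example; the character and the printing code are just my illustration, not anything required by the problem:

#include <cstdio>
#include <cwchar>

int main() {
    // U+1F600 lies outside the Basic Multilingual Plane, so UTF-16 needs a
    // surrogate pair (two 16-bit code units) to represent it.
    const wchar_t *symbol = L"\U0001F600";

    std::printf("sizeof(wchar_t)       = %zu bytes\n", sizeof(wchar_t));
    std::printf("wchar_t units in text = %zu\n", std::wcslen(symbol));
    // Expected: 2 units on Windows (16-bit wchar_t, surrogate pair),
    //           1 unit on typical *nix systems (32-bit wchar_t).
    return 0;
}

If I understand correctly, the two printed counts would differ between the platforms, and that difference is exactly what I'm asking about.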