Update: I have now been informed that the OP wants to compile this as C rather than C++, and I have therefore added the appropriate language tag to the question.
The types used in the answer below are also available from C11 onwards (declared in &lt;uchar.h&gt; in C), so the rest of this answer still applies.
Try changing your UTF16 typedef to char16_t; this works for me. For UTF32, either char32_t or wchar_t will work (wchar_t is 32 bits wide on Apple platforms, though note that it is only 16 bits wide on Windows).
Please note that when initialising variables from string literals, the prefix of each literal must match the type of the variable, like so (this code requires C11 / C++11 or later):
#include <uchar.h>   /* needed in C for char16_t / char32_t; C++ has them built in */
#include <wchar.h>   /* needed in C for wchar_t */

int main (void)
{
    const char16_t *s1 = u"s1 is char16_t";
    const char32_t *s2 = U"s2 is char32_t";
    const wchar_t *s3 = L"s3 is wchar_t";
    return 0;
}
And when I put a breakpoint on the return statement and print these variables in the console, I get this:
(lldb) p s1
(const char16_t *) $0 = 0x0000000100000f44 u"s1 is char16_t"
(lldb) p s2
(const char32_t *) $1 = 0x0000000100000f58 U"s2 is char32_t"
(lldb) p s3
(const wchar_t *) $2 = 0x0000000100000f7c L"s3 is wchar_t"
(lldb)
You can also see them by hovering the mouse over them, or in the variables view in Xcode's debug area.
The full gamut of string literals available in C11 / C++11 is described here (C) and here (C++).
To answer the question you pose in the comments: in C++, no header file is needed — char16_t and char32_t are built-in types. In C, however, they are declared in &lt;uchar.h&gt;, and wchar_t comes from &lt;wchar.h&gt; (or &lt;stddef.h&gt;). If your compiler is objecting to these, please check the C / C++ language dialect you are using in the build settings for your project in Xcode.