While trying to convert some existing code to support Unicode characters, this problem popped up. If I try to pass a Unicode character (in this case I'm using the euro symbol) into any of the *wprintf functions, it fails, but seemingly only in Xcode. The same code works fine in Visual Studio, and I was even able to get a friend to test it successfully with GCC on Linux. Here is the offending code:
wchar_t _teststring[10] = L"";
int _iRetVal = swprintf(_teststring, 10, L"A¥€");
wprintf(L"return: %d\n", _iRetVal);
// print values stored in string to check if anything got corrupted
for (int i = 0; i < wcslen(_teststring); ++i) {
    wprintf(L"%d: (%d)\n", i, _teststring[i]);
}
In Xcode the call to swprintf returns -1, while in Visual Studio it succeeds and goes on to print the correct values for each of the 3 characters (65, 165, 8364).
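To be concrete, the Visual Studio run prints:

return: 3
0: (65)
1: (165)
2: (8364)

whereas under Xcode the first line is return: -1.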
I have googled long and hard for solutions. One suggestion that has appeared a number of times is a call such as:
setlocale(LC_CTYPE, "UTF-8");
I have tried various combinations of arguments to this function with no success. On further investigation, it appears to return NULL if I try to set the locale to any value other than the default "C".
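In case it matters, here is roughly how I have been probing setlocale. The locale names shown (e.g. "en_US.UTF-8") are just some of the ones I pulled from search results, so they may not be the right ones for the Mac:

#include <locale.h>
#include <stdio.h>

int main(void) {
    /* Try a few candidate locale names and print what setlocale reports back.
       On my machine everything except the default "C" comes back NULL. */
    const char *candidates[] = { "C", "UTF-8", "en_US.UTF-8", "" };
    for (int i = 0; i < 4; ++i) {
        char *result = setlocale(LC_CTYPE, candidates[i]);
        printf("setlocale(LC_CTYPE, \"%s\") -> %s\n",
               candidates[i], result ? result : "(null)");
    }
    return 0;
}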
I'm at a loss as to what else I can try to solve this problem, and the fact that it works in other compilers/platforms just makes it all the more frustrating. Any help would be much appreciated!
EDIT: Just thought I would add that when the swprintf call fails, it sets an error code (92), which is defined as:
#define EILSEQ 92 /* Illegal byte sequence */
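For reference, this is how I'm reading that error code after the failed call, just by checking errno. The 92 / "Illegal byte sequence" output is what I see on my machine under Xcode:

#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <wchar.h>

int main(void) {
    wchar_t _teststring[10] = L"";

    errno = 0;
    int _iRetVal = swprintf(_teststring, 10, L"A¥€");
    if (_iRetVal < 0) {
        /* Under Xcode this reports errno 92, i.e. EILSEQ ("Illegal byte sequence"). */
        printf("swprintf failed: errno = %d (%s)\n", errno, strerror(errno));
    } else {
        printf("swprintf succeeded, wrote %d wide characters\n", _iRetVal);
    }
    return 0;
}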