
I'm trying to convert "á" to UTF-8. I can find that "á" in UTF-8 is "Ã¡". In Xcode I've written this:

const char *  litter = [@"á" UTF8String];
NSLog(@"%s",litter);

It always prints "√°", which is wrong! Indeed, if I write

const char *  litter = [@"áéíóú" UTF8String];
NSLog(@"%s",litter);

It prints: áéíóú

What am I doing wrong?

Thank you

Georgevik

1 Answer


You are not doing anything wrong. It's just that the %s specifier prints out the raw stream of bytes with no Unicode awareness. If you want to check whether litter contains the correct characters, convert it back to an NSString object and then NSLog that.

const char *  litter = [@"á" UTF8String];
NSLog(@"%@",[NSString stringWithUTF8String:litter]);
Kedar
  • But in that case, it prints: "á" not "Ã¡" – Georgevik Oct 15 '13 at 06:22
  • You need to distinguish between what you get and what you see. Your statement that "á" in UTF-8 is "Ã¡" is absolute nonsense. "á" in UTF-8 is C3 A1. You seem to expect seeing strings with code points U+00C3 and U+00A1. That's not going to happen. – gnasher729 Mar 11 '16 at 10:29