2

What is the best method to convert a C++ basic_string<wchar_t> object with UTF-16 encoding to an Objective-C NSString object?

Can I cast from wchar_t* to char* like so and still have stringWithCString use the string correctly?

    [NSString stringWithCString:(char*)wideCharBasicString.c_str() encoding:NSUTF16StringEncoding];

Thank you, Shane

Makoto
iEqualShane
  • A `wchar_t` on iOS is 32-bit wide IIRC. – kennytm Mar 31 '11 at 20:47
  • I just verified it is indeed 32-bit. How would I go about the same method with UTF32 encoding? – iEqualShane Mar 31 '11 at 21:25
  • So NSString *myNSString = [NSString stringWithFormat:@"%S",wideCharBasicString.c_str()]; seems to do the trick. stringWithCString was only giving me an empty NSString. Also, if anyone needs to use this like I did, the flag -fshort-wchar makes wchar_t 16-bit on iOS. – iEqualShane Apr 07 '11 at 23:37
  • 1
    To be very general, if you guarantee that your original string is UTF-16 (not just UCS-2), you could copy it into a `uint16_t[]`, then use `iconv` to create the desired encoding (if it differs from UTF-16) and then construct your NSString from there. – Kerrek SB Jun 14 '11 at 14:12

3 Answers

1

If the buffer is UTF-16 encoded, you could do this:

    NSData* data = [[[NSData alloc] initWithBytesNoCopy:buffer
                                                 length:length
                                           freeWhenDone:NO] autorelease];
    NSString* result = [[NSString alloc] initWithData:data
                                             encoding:NSUTF16LittleEndianStringEncoding];

stringWithCString looks like it barfs on multi-byte character buffers, stopping at the first NULL byte it finds.

[UPDATE]

I filed a bug with Apple, and this is apparently the expected behaviour: stringWithCString only supports 8-bit encodings and stops at the first zero byte.
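
Tying this back to the original basic_string<wchar_t>: since wchar_t is 32 bits wide on iOS (unless built with -fshort-wchar), the same NSData route works on the wide string's bytes directly if you use the UTF-32 encoding instead. A sketch along those lines, untested, assuming native little-endian byte order:

    // Assumes 32-bit wchar_t (the iOS default) on a little-endian device.
    std::basic_string<wchar_t> wide = L"...";
    NSData* data = [NSData dataWithBytes:wide.data()
                                  length:wide.size() * sizeof(wchar_t)];
    NSString* result = [[[NSString alloc] initWithData:data
                                              encoding:NSUTF32LittleEndianStringEncoding] autorelease];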

Matt Connolly
  • Okay, it's old, but I was in the same situation: I confirm stringWithCString won't work with UTF-16 on iOS 5.1 – gogoprog Jun 07 '13 at 13:17
0

I used the following to do the conversion:

    NSString *myNSString = [NSString stringWithFormat:@"%S", wideCharBasicString.c_str()];

Also, if anyone needs to use this like I did, the flag -fshort-wchar makes wchar_t 16-bit on iOS; that matters here, because the %S specifier expects a null-terminated array of 16-bit Unicode characters.

iEqualShane
-1

This is the least ugly way to do it (UTF-16):

    NSString* nsstr = [NSString stringWithCharacters:(const unichar*)str.c_str() length:str.length()];

Note that this assumes wchar_t is 16 bits wide (e.g. built with -fshort-wchar), since stringWithCharacters expects 16-bit unichar values.
Asylum