
I have this sample code:

NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
NSString *numberString = @"9.2";
NSNumber *number = [formatter numberFromString:numberString];
NSLog(@"String: %@, Number: %@", numberString, number);

The printed result is:

String: 9.2, Number: 9.199999999999999

I don't understand why this would happen. Am I missing a setting?

Cory Imdieke

1 Answer


According to the NSNumber class reference, an NSNumber can store any ordinary C numeric type. Since the value here is obtained by parsing a non-integer string, the two candidate types are float and double, and judging by the displayed value, the chosen representation is double.

Decimal fractions like 9.2 aren't exactly representable as doubles, so you get the closest representable double. That value is then displayed to the default precision, which is about 16 significant digits.
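To make this concrete, here is a minimal sketch, assuming the same default formatter setup as in the question: printing the parsed value at 17 significant digits reveals the closest double to 9.2, and capping the fraction digits when formatting back to a string displays "9.2" again.

NSNumberFormatter *formatter = [[NSNumberFormatter alloc] init];
NSNumber *number = [formatter numberFromString:@"9.2"];

// Show the nearest double to 9.2 with 17 significant digits.
NSLog(@"%.17g", [number doubleValue]);   // expected: 9.1999999999999993

// Limit the fraction digits when converting back to a string;
// the rounding error disappears from the displayed value.
[formatter setNumberStyle:NSNumberFormatterDecimalStyle];
[formatter setMaximumFractionDigits:1];
NSLog(@"%@", [formatter stringFromNumber:number]);   // expected: 9.2

Setting NSNumberFormatterDecimalStyle plus an explicit maximumFractionDigits is just one way to control the rounding for display; the stored double itself is unchanged.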

Daniel Fischer
  • Thanks for the additional info. I knew there were issues with very large or very small numbers, but didn't expect an issue like that for a seemingly average-sized number. Anyway, thanks for pointing me in the right direction. – Cory Imdieke Apr 20 '12 at 22:30
  • The size doesn't matter, most numbers cannot be exactly represented in either decimal representation (`NSDecimalNumber`) or base 2 (what `float` and `double` use). However, the inexactness is usually very small and doesn't matter (if it does, though, things get hairy), so for 99.99% of applications at least one of `NSNumber` and `NSDecimalNumber` does fine. – Daniel Fischer Apr 20 '12 at 22:39
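To illustrate the trade-off described in the comment above, a minimal sketch (example values only): an NSDecimalNumber keeps a base-10 mantissa and exponent, so a decimal string like @"9.2" round-trips exactly, while a double-backed NSNumber holds the nearest binary fraction.

NSDecimalNumber *decimal = [NSDecimalNumber decimalNumberWithString:@"9.2"];
NSNumber *binary = [NSNumber numberWithDouble:[@"9.2" doubleValue]];

NSLog(@"decimal: %@", decimal);                   // expected: 9.2
NSLog(@"double:  %.17g", [binary doubleValue]);   // expected: 9.1999999999999993

// Decimal arithmetic stays exact while the result fits the 38-digit
// mantissa, e.g. 9.2 * 10 is exactly 92.
NSDecimalNumber *ten = [NSDecimalNumber decimalNumberWithString:@"10"];
NSLog(@"product: %@", [decimal decimalNumberByMultiplyingBy:ten]);   // expected: 92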