I've got an NSNumberFormatter set up to format using significant digits, so it should only show as many decimal digits as are necessary to correctly display the value.
When it is used to format 7.0 it works exactly as expected and produces @"7", but when it is used to format 0.0 it produces @"0.0" instead of the expected @"0".
NSNumberFormatter *doubleValF = [[NSNumberFormatter alloc] init];
[doubleValF setNumberStyle:NSNumberFormatterDecimalStyle];
[doubleValF setRoundingMode:NSNumberFormatterRoundHalfUp];
// Fraction-digit limits (per the docs, these are ignored once usesSignificantDigits is YES)
doubleValF.maximumFractionDigits = 9;
doubleValF.minimumFractionDigits = 0;
// Significant-digit limits
doubleValF.minimumSignificantDigits = 0;
doubleValF.maximumSignificantDigits = 30;
[doubleValF setUsesSignificantDigits:YES];
double value1 = 0.0;
NSString *value1String = [doubleValF stringFromNumber:[NSNumber numberWithDouble:value1]];
double value2 = 7.0;
NSString *value2String = [doubleValF stringFromNumber:[NSNumber numberWithDouble:value2]];
NSLog(@"value1=%@", value1String);
NSLog(@"value2=%@", value2String);
When I run this code, I get the following output:
2012-12-15 15:57:03.425 StackOverflowTests[41701:11303] value1=0.0
2012-12-15 15:57:03.426 StackOverflowTests[41701:11303] value2=7
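For what it's worth, the only way I've found to force @"0" so far is to special-case zero with the formatter's zeroSymbol property, which overrides the formatted output for exact-zero values (a sketch of the workaround, not an explanation of the behavior):

```objectivec
// Workaround sketch: zeroSymbol replaces whatever the significant-digits
// path would otherwise produce for an exact zero.
doubleValF.zeroSymbol = @"0";
NSLog(@"value1=%@", [doubleValF stringFromNumber:[NSNumber numberWithDouble:0.0]]);
```

I'd still like to understand why the significant-digits configuration produces @"0.0" for zero in the first place.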