
I'm using Alamofire to call my web service:

ApiManager.manager.request(.GET, webServiceCallUrl, parameters: ["id": 123])
    .validate()
    .responseJSON { response in
        switch response.result {
        case .Success:
            print(response.result.value!)

            //...

        case .Failure:
            //...
        }
    }

My web service returns the following JSON:

{
    //...
    "InvoiceLines": [{
        "Amount": 0.94
    }]
}

Alamofire is treating this as a double instead of a decimal, so in the output console I get:

{
    //...
    InvoiceLines =     (
                {
            Amount = "0.9399999999999999";
        }
    );
}

This then causes rounding errors further down in my code.

I've used Fiddler on the server to inspect the web service's JSON response and confirmed that it is returning 0.94, so I can rule out the server as the issue and suspect the responseJSON serialization is what's causing it.

How can I get currency values to return as the correct NSDecimalNumber value?


Extra information after Jim's answer/comments:

var stringToTest = "{\"Amount\":0.94}"
var data = stringToTest.dataUsingEncoding(NSUTF8StringEncoding, allowLossyConversion: false)
var object = try! NSJSONSerialization.JSONObjectWithData(data!, options: []) as! NSDictionary

var a = object["Amount"]
print(a) //"Optional(0.9399999999999999)"

var b = NSDecimalNumber(decimal: (object["Amount"] as! NSNumber).decimalValue)
print(b) //"0.9399999999999999"

If I pass the value through as a string in the JSON, I can then get my desired result:

var stringToTest = "{\"Amount\":\"0.94\"}"
var data = stringToTest.dataUsingEncoding(NSUTF8StringEncoding, allowLossyConversion: false)
var object = try! NSJSONSerialization.JSONObjectWithData(data!, options: []) as! NSDictionary

var c = NSDecimalNumber(string: (object["Amount"] as! String))
print(c) //0.94

However, I don't want to have to implement changes to the API, so I need a solution that keeps the JSON in the same format. Is my only option to round the double each time? That feels like the wrong way to do things and could raise rounding problems in the future.
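
For reference, a minimal sketch of what that per-value rounding could look like, using NSDecimalNumberHandler to round the double-backed NSNumber to two decimal places (roundCurrency is just an illustrative helper name, not an existing API):

import Foundation

// Sketch only: round a double-backed NSNumber to two decimal places.
func roundCurrency(number: NSNumber) -> NSDecimalNumber {
    let handler = NSDecimalNumberHandler(
        roundingMode: .RoundPlain,
        scale: 2,                    // two decimal places for currency
        raiseOnExactness: false,
        raiseOnOverflow: false,
        raiseOnUnderflow: false,
        raiseOnDivideByZero: false)
    return NSDecimalNumber(double: number.doubleValue)
        .decimalNumberByRoundingAccordingToBehavior(handler)
}

print(roundCurrency(NSNumber(double: 0.9399999999999999))) // 0.94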

Joseph

1 Answer


Decimal values are themselves floating-point, and the JSON spec specifically allows high-exponent values which you might think of as "real" floating point:

http://www.ecma-international.org/publications/files/ECMA-ST/ECMA-404.pdf Section 8 "numbers"

What you're describing as "decimal" is decimal floating point, in which you have an integer and a base-10 exponent, i.e. 94 * 10^-2 in this case. Very few systems support this storage format; most use binary floating point instead, which is an integer and a base-2 exponent. Since 10 has 5 as a prime factor and 2 does not, tenths, hundredths, et cetera can't be represented exactly in binary floating point. This means the internal representation is always a little off, but comparisons and print() are aware of the precision and will work as you expect.

var bar = 0.94 // 0.939999999999999
bar == 0.94 // true
print(bar) // "0.94\n"
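
If you want to see how far off the stored value actually is, a quick playground check (simply asking String(format:) for more digits than the default description prints) makes it visible:

import Foundation

let value = 0.94
print(String(format: "%.20f", value)) // prints roughly 0.93999999999999994671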

Where you're hitting problems in other code will be down to compounding that precision problem. In decimal floating point, if you try to do 1/3 + 1/3 - 2/3 you get 0.333333 + 0.333333 - 0.666667 = -0.000001 != 0, and binary floating point has the same issue. This is because after one or more operations the accumulated error has grown beyond the tolerance used when converting from the original literals, so the slightly-off value is taken at face value.
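
For a quick binary floating-point illustration of the same compounding, the classic 0.1 + 0.2 check in a playground shows the error becoming visible:

let sum = 0.1 + 0.2   // each literal is rounded to the nearest double
print(sum == 0.3)     // false: the individual rounding errors have compounded
print(sum - 0.3)      // a tiny nonzero remainder, on the order of 1e-17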

If you're in this kind of situation with data that is actually fixed point (such as currency values in most cases), the safest approach is to multiply it up and then use only integer values until you have to display, as in:

var bar = 0.94 * 100   // 94, i.e. the price held as whole "cents"
var foo = bar * 5      // multiply the price by 5, still in whole cents
print(bar / 100)       // 0.94, dividing back down only for display

Alternatively, since double-precision floating point has a very high base precision, far more than the 1/100 needed, you can just round when performing comparisons or output.

var bar = 0.94
var foo = bar * 5 // multiply price by 5
print(round(bar*100) / 100)
Jim Driscoll
  • Thanks for the detailed explanation. I have an understanding of what's going on behind the scenes with the floating point. We use NSDecimalNumber within our app, which hasn't caused issues elsewhere; it's not an option to change the entire app to multiply out to integer values for the calculations. I could simply set the NSDecimalNumber to my desired format, however I've done some further testing and NSJSONSerialization seems to return fine for 0.93, 0.95, 0.96 and 0.97. It's just 0.94 that it shows as 0.939999999999999, so I'm trying to understand why 0.94 is behaving differently – Joseph Feb 19 '16 at 11:53
  • You really don't need to worry about the internal representation of the number unless it is actually imprecise enough to not match itself. Anyway, you can try all of this directly in the playground like: `var a = 0.93 // 0.93` `var b = 0.94 // 0.93999999999` `var c = 0.95 // 0.95` It'll just be that the internal var dump routine mishandles converting to text for some numbers; probably for that number the internal representation is 0.93999999...995 and there's a minor bug causing the var dump handler to round down instead of up. You can ignore it as long as print() works. – Jim Driscoll Feb 19 '16 at 12:04
  • The issue I'm having is that print() is giving the wrong values in the way I'm currently using it. I've added some extra code to my question as a result of frustrating myself in a playground for the last 2 hours. – Joseph Feb 19 '16 at 14:07
  • Multiply and divide. `print(NSDecimalNumber(double: round(0.94*100)).decimalNumberByDividingBy(100))` works fine, while `print(NSDecimalNumber(double: 0.94))` does not. Probably some type inference problems there by the looks of the result. Please note again that the internal dump of the former is inaccurate, but actual print is fine. – Jim Driscoll Feb 21 '16 at 13:16
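
Pulling that last comment together with the original request, here is a rough sketch of how the multiply/round/divide could be applied at the point where the value comes out of responseJSON. It reuses the ApiManager and webServiceCallUrl names from the question, assumes the InvoiceLines layout shown above, and elides error handling:

ApiManager.manager.request(.GET, webServiceCallUrl, parameters: ["id": 123])
    .validate()
    .responseJSON { response in
        guard let json = response.result.value as? [String: AnyObject],
            invoiceLines = json["InvoiceLines"] as? [[String: AnyObject]],
            firstLine = invoiceLines.first,
            amountDouble = firstLine["Amount"] as? Double else { return }

        // Multiply up, round, then divide back down as an NSDecimalNumber
        let amount = NSDecimalNumber(double: round(amountDouble * 100))
            .decimalNumberByDividingBy(NSDecimalNumber(integer: 100))
        print(amount) // 0.94
    }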