
In JavaScript:

This expression (2.0 - 1.1) returns 0.8999999999999999.

However, this one (4.0 - 1.1) returns 2.9.

Can anybody explain this inconsistency?

I understand that 1.1 can't be represented exactly in floating point.
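For reference, both results are easy to reproduce in any JavaScript console (Node.js or browser devtools):

    console.log(2.0 - 1.1); // 0.8999999999999999
    console.log(4.0 - 1.1); // 2.9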

1 Answer


The "inconsistency" has nothing to do with floating-point subtraction, and is instead a consequence of how Javascript displays numbers (it's also completely consistent).

As you say, 1.1 isn't exactly representable, so its value isn't actually 1.1. Instead it has the value:

1.100000000000000088817841970012523233890533447265625
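You can see more of this stored value than the default display shows by using toFixed, which prints a fixed number of decimal digits instead of the shortest round-tripping string (the output below is rounded to 20 places, not the full exact expansion quoted above):

    console.log((1.1).toFixed(20)); // "1.10000000000000008882"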

Subtracting this value from 2.0 is exact, and the result is:

0.899999999999999911182158029987476766109466552734375
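The same trick shows a longer (still rounded) view of this result:

    console.log((2.0 - 1.1).toFixed(20)); // "0.89999999999999991118"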

This number is not the closest representable number to 0.9 (that would be 0.90000000000000002220446049250313080847263336181640625), so if it were printed as 0.9, the value would not survive a round trip to decimal and back (being able to round-trip numbers through their decimal representation is a highly desirable property). Thus, it is instead printed with sufficient precision to specify the correct value:

0.8999999999999999
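You can check the round-trip property directly by parsing the printed strings back into numbers (a small console sketch):

    // The shortest printed form parses back to exactly the same double...
    console.log(Number("0.8999999999999999") === 2.0 - 1.1); // true
    // ...while the "nicer" string 0.9 parses to a different double.
    console.log(Number("0.9") === 2.0 - 1.1);                // false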

In your second example, we subtract from 4.0 and get:

2.899999999999999911182158029987476766109466552734375

which is the closest representable number to 2.9 (the next closest number is 2.9000000000000003552713678800500929355621337890625), so it can be printed as just 2.9 and still have the value survive a round-trip.
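The same checks confirm that this result is exactly the double you get from the literal 2.9, so the short string round-trips:

    console.log(4.0 - 1.1 === 2.9);           // true
    console.log(Number("2.9") === 4.0 - 1.1); // true
    console.log((4.0 - 1.1).toFixed(20));     // "2.89999999999999991118"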

Stephen Canon