var a = 2000000000000000000;
a = a - 1; //Doesn't work
a = a + 1; //Doesn't work
But
a = a * 10; //Works
a = a / 10; //Works
Can anyone explain why this happens?
You can try this example in browser console.
You're running into the limits of JavaScript's number precision.
All JavaScript numbers are IEEE 754 double-precision floats, so there is no separate integer type. A double can represent every integer exactly only up to 2^53 = 9007199254740992.
Your number (2×10^18) is far beyond that, so the representable doubles around it are spaced 256 apart. Adding or subtracting 1 produces a result that rounds back to the same double, which is why the value appears unchanged.
Multiplying or dividing by 10 moves the value to a different representable double, so those operations visibly "work" (and here they even happen to be exact, because 2×10^19 and 2×10^17 are both representable).
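A minimal demonstration you can paste into Node or a browser console:

```javascript
var a = 2000000000000000000;           // ~2e18, well above 2^53

// Adjacent doubles near 2e18 are 256 apart, so +/-1 rounds back to a itself.
console.log(a - 1 === a);              // true
console.log(a + 1 === a);              // true

// Multiplying/dividing by 10 lands on a different (also representable) double.
console.log(a * 10 === a);             // false
console.log(a / 10);                   // 200000000000000000

// The largest integer below which every integer is exactly representable:
console.log(Number.MAX_SAFE_INTEGER);  // 9007199254740991 (2^53 - 1)
```

If you need exact arithmetic on integers this large, use BigInt (`2000000000000000000n - 1n` gives the exact answer), at the cost of not mixing with regular numbers.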