console.log(parseInt(0.0000008))
// > 8
console.log(parseInt(0.000008))
// > 0
console.log(parseInt(0.0000008, 10))
// > 8
console.log(parseInt(0.000008, 10))
// > 0
The above code was run in Google Chrome Version 62.0.3202.94 (Official Build) (64-bit) on macOS Sierra Version 10.12.6.
As you can see, the behaviour does not depend on whether or not you specify the radix.
Note: I usually use ~~ instead of parseInt, as it seems safer.
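For comparison, a quick sketch of what ~~ gives me for the same values (run in the same Chrome build); both truncate to 0, since ~~ works on the number itself rather than its string representation:

console.log(~~0.0000008)
// > 0
console.log(~~0.000008)
// > 0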
Why am I getting these results?