How many bits does JavaScript use to represent a number?
- possible duplicate: http://stackoverflow.com/questions/2802957/number-of-bits-in-javascript-numbers – Cyclonecode Aug 10 '12 at 05:39
3 Answers
Generally JS implementations use 64-bit double-precision floating-point numbers. Bitwise operations are performed on 32-bit integers.
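For example, both points are easy to observe from the console (a quick sketch; the specific literals are just illustrative):

```js
// Numbers are 64-bit doubles, so integers are exact only up to 2^53
console.log(Math.pow(2, 53));     // 9007199254740992
console.log(Math.pow(2, 53) + 1); // 9007199254740992 -- the +1 is lost to rounding

// Bitwise operators first convert their operand to a 32-bit integer
console.log((Math.pow(2, 32) + 5) | 0); // 5 -- everything above bit 31 is discarded
```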

That depends on the specific implementation, not the language itself.
If you want to know what range of numbers is supported, then see section 8.5 (The Number Type) of the specification.
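For a quick look at those limits from code (note that Number.MAX_SAFE_INTEGER is the ES2015 name for 2^53 − 1, added after this answer was written):

```js
console.log(Number.MAX_VALUE);        // 1.7976931348623157e+308, largest finite Number
console.log(Number.MIN_VALUE);        // 5e-324, smallest positive Number
console.log(Number.MAX_SAFE_INTEGER); // 9007199254740991, i.e. 2^53 - 1
```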

From the referenced spec:
The Number type has exactly 18437736874454810627 (that is, 2^64 − 2^53 + 3) values, representing the double-precision 64-bit format IEEE 754 values as specified in the IEEE Standard for Binary Floating-Point Arithmetic, except that the 9007199254740990 (that is, 2^53 − 2) distinct "Not-a-Number" values of the IEEE Standard are represented in ECMAScript as a single special NaN value. (Note that the NaN value is produced by the program expression NaN.) In some implementations, external code might be able to detect a difference between various Not-a-Number values, but such behaviour is implementation-dependent; to ECMAScript code, all NaN values are indistinguishable from each other.
That said, be aware that when using the bit operators (&, ^, >>, <<, etc.) only the least significant 32 bits of each operand are used, and the result is converted to a signed 32-bit value.
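A short sketch of that signed 32-bit conversion:

```js
console.log(0xFFFFFFFF | 0);      // -1 -- all 32 bits set reads back as signed -1
console.log(Math.pow(2, 31) | 0); // -2147483648 -- overflow into the sign bit
console.log(Math.pow(2, 32) | 0); // 0 -- bit 32 falls outside the 32-bit window
```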

- It's worth pointing out that this wall of language-spec legalese entails that *all primitive JS values fit into 8 bytes*, i.e. one does not need any tagged unions to represent them, because there is plenty of space in those 9007199254740990 distinct NaNs to both tag and represent all other primitive values. – Andrey Tyukin Nov 27 '21 at 02:22
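A small sketch of that last point, showing that two different NaN bit patterns collapse into a single value as far as ECMAScript code is concerned (uses DataView and the ES2020 setBigUint64 method):

```js
const view = new DataView(new ArrayBuffer(8));

// Two quiet-NaN bit patterns that differ only in their payload bits
view.setBigUint64(0, 0x7FF8000000000001n);
const nan1 = view.getFloat64(0);
view.setBigUint64(0, 0x7FF8000000000002n);
const nan2 = view.getFloat64(0);

console.log(Number.isNaN(nan1), Number.isNaN(nan2)); // true true
console.log(Object.is(nan1, nan2));                  // true -- indistinguishable to JS code
```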