
My application uses a Java double to represent floating point numbers. A double is limited to about 17 significant digits. We are considering refactoring the code to use BigDecimal, but I couldn't find anything in the documentation about how many significant digits BigDecimal allows.
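To illustrate the kind of precision loss I mean (the values below are just made-up examples, not from our codebase):

```java
import java.math.BigDecimal;

public class PrecisionCheck {
    public static void main(String[] args) {
        // A literal with 30 significant digits: double silently rounds it.
        double d = 1.23456789012345678901234567890;
        System.out.println(d);              // printed value keeps only ~17 significant digits

        // The same digits kept exactly via BigDecimal's String constructor.
        BigDecimal bd = new BigDecimal("1.23456789012345678901234567890");
        System.out.println(bd);             // prints all 30 digits
        System.out.println(bd.precision()); // 30
    }
}
```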

Any information regarding this will be very helpful, thanks in advance !

Oren Levi

2 Answers


The answer is "probably more digits than you are ever likely to need" ... but strictly speaking there are a couple of limits.

  1. The size of your JVM's heap could limit the size of a BigDecimal object that can be created. (You might get an OutOfMemoryError ...)

  2. For any implementation of BigDecimal, there is likely to be an implementation-specific limit due to the class's internal representation.

For the implementation in Java 8, a BigDecimal number's representation consists of a BigInteger to represent the digits, and an int to represent the scale factor. The javadoc for BigInteger states:

BigInteger must support values in the range -2^Integer.MAX_VALUE (exclusive) to +2^Integer.MAX_VALUE (exclusive) and may support values outside of that range.

So the theoretical maximum number of decimal digits of precision for a BigDecimal in Java 8 is at least log10(2^(2^31 - 1)), which is roughly 646 million digits.
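For a rough feel of the scale, here is a small sketch (the digit counts below are arbitrary choices, not limits):

```java
import java.math.BigDecimal;
import java.math.MathContext;
import java.math.RoundingMode;

public class BigDecimalPrecisionDemo {
    public static void main(String[] args) {
        // Build a value with 10,000 significant digits.
        StringBuilder sb = new StringBuilder("1.");
        for (int i = 0; i < 9_999; i++) {
            sb.append((char) ('1' + (i % 9)));
        }
        BigDecimal huge = new BigDecimal(sb.toString());
        System.out.println(huge.precision());   // 10000

        // Arithmetic at a chosen precision via MathContext.
        MathContext mc = new MathContext(1_000, RoundingMode.HALF_UP);
        BigDecimal third = BigDecimal.ONE.divide(new BigDecimal(3), mc);
        System.out.println(third.precision());  // 1000
    }
}
```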


Any information regarding this will be very helpful.

The definitive source of information is the source code of BigDecimal and the classes that it depends on.

Stephen C

From the JavaDoc, BigDecimal is described as "Immutable, arbitrary-precision signed decimal numbers."

So the number of significant digits is arbitrary, limited in practice only by the memory available.
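As a quick sketch of what that means in practice (the operands here are arbitrary):

```java
import java.math.BigDecimal;

public class ArbitraryPrecision {
    public static void main(String[] args) {
        // Multiplying two exact values keeps every digit of the result;
        // nothing is rounded unless you ask for it via a MathContext.
        BigDecimal a = new BigDecimal("1234567890.123456789012345678901234567890");
        BigDecimal b = new BigDecimal("9876543210.987654321098765432109876543210");
        BigDecimal product = a.multiply(b);
        System.out.println(product);
        System.out.println("digits: " + product.precision());
    }
}
```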

Burkhard