
I learned that precision is the total number of digits in a value and scale is the number of digits after the decimal point. So, as far as I understand, the number 1234.56 would have precision = 6 and scale = 2.
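
For example, this can be checked directly with BigDecimal's precision() and scale() methods (a small snippet of my own for illustration; it runs as-is in jshell):

    import java.math.BigDecimal;

    BigDecimal bd = new BigDecimal("1234.56");
    System.out.println(bd.precision()); // prints 6 (total number of digits)
    System.out.println(bd.scale());     // prints 2 (digits after the decimal point)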

I also saw that when you define a precision with a MathContext and then create a BigDecimal with more digits than that precision, the value is rounded and printed in scientific notation, e.g.:

    MathContext mc = new MathContext(6);
    BigDecimal bd = new BigDecimal("1234567", mc);
    System.out.println(bd);
    // bd is printed as 1.23457E+6

But when I add a setScale when creating the BigDecimal, it is printed as a plain number, no matter what precision is defined in the MathContext, e.g.:

    MathContext mc = new MathContext(6);
    BigDecimal bd = new BigDecimal("1234567", mc).setScale(1);
    System.out.println(bd);
    // bd is printed as 12345670.0

Why does it show the BigDecimal as 12345670.0, with a precision of 9? Is it even worth worrying about? I can imagine it may just be some unexpected effect without any rules/logic behind it, but I am curious about what I may be missing.

  • For me the second snippet prints `1234570.0`. It lost some precision; it was rounded up. Setting the scale basically forces it to display decimal places, from what I can tell. – Amongalen Jun 22 '20 at 12:22
  • `new BigDecimal("1234567", mc)` creates a number with 6 significant digits (which could be displayed as 1.23457E+6 or 1234570). This is a BigDecimal with precision = 6 and scale = -1. Such a negative scale means that the unscaled value 123457 is multiplied by 10^1, so the last digit before the decimal point is an implied 0. Then, when you change the scale of this BigDecimal from -1 to 1, it now represents a value with one digit on the right side of the decimal point. Since that digit is zero, the value becomes 1234570.0. – Pierre Demeestere Nov 15 '22 at 21:25
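
The mechanics described in that last comment can be checked by printing precision() and scale() at each step. The following is a small sketch of my own (not from the original post); the values in the comments are what the default HALF_UP rounding of MathContext(6) should produce, and they match the 1234570.0 reported in the first comment:

    import java.math.BigDecimal;
    import java.math.MathContext;

    MathContext mc = new MathContext(6);       // precision 6, default HALF_UP rounding
    BigDecimal bd = new BigDecimal("1234567", mc);
    System.out.println(bd);                    // 1.23457E+6
    System.out.println(bd.precision());        // 6
    System.out.println(bd.scale());            // -1 (unscaled value 123457 times 10^1)

    BigDecimal rescaled = bd.setScale(1);      // exact: increasing the scale needs no rounding
    System.out.println(rescaled);              // 1234570.0
    System.out.println(rescaled.precision());  // 8
    System.out.println(rescaled.scale());      // 1

In other words, setScale(1) does not restore the digits dropped by the MathContext; it only re-expresses the already-rounded value with one digit after the decimal point, which is why the plain form 1234570.0 appears and the precision grows to 8.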

0 Answers