I know that certain numbers will show slight variations from their original values. For example, 0.1 + 0.2 evaluates to 0.30000000000000004.
But if I do Math.round(0.30000000000000004 * 100) / 100, I get the correct answer: 0.3.
I ran a JavaScript test and found that the results are accurate at least up to 1e+10.
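For reference, this is a sketch of the kind of brute-force check I ran (the helper name round2 is mine): accumulate two-decimal values as floats while tracking the exact total in integer cents, and verify that rounding recovers the exact value at every step.

```javascript
// round2 is my helper name: round a number to 2 decimal places.
function round2(x) {
  return Math.round(x * 100) / 100;
}

let ok = true;
let sum = 0;   // floating-point accumulation of i/100
let cents = 0; // exact running total in integer cents
for (let i = 1; i <= 100000; i++) {
  sum += i / 100;
  cents += i;
  // Compare the rounded float sum against the exact value.
  if (round2(sum) !== cents / 100) {
    ok = false;
    break;
  }
}
console.log(ok); // true for this range
```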
Are there any caveats to doing this?
If I use Math.round(result * 100) / 100 after every calculation, can I be sure the results will be accurate? The only calculations I plan to make are addition and multiplication, and all input numbers will have only 2 decimal places (enforced by Math.round(n * 100) / 100).
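Concretely, the workflow I have in mind looks like this sketch (round2 and the sample values are just illustrative):

```javascript
// round2 is my helper: keep every intermediate result to 2 decimals.
function round2(x) {
  return Math.round(x * 100) / 100;
}

// Illustrative values: sum some two-decimal amounts, then multiply.
const prices = [1.15, 2.3, 0.1];
let total = 0;
for (const p of prices) {
  total = round2(total + p); // round after each addition
}
total = round2(total * 1.07); // round after the multiplication too
console.log(total); // 3.8
```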
I don't need the numbers to be accurate above about $1000.
Can I be sure my results will be accurate to the nearest cent?