
I'm using a database to store some money information in an application I'm developing. Although every row of the table stores a decimal number with at most two decimal places, `SELECT SUM(value) FROM income` returns a number with more than ten decimal places (15327.880000000005).

I discovered that SQL (or at least SQLite) implicitly converts datatypes in queries, so the values are probably being converted to REAL, and apparently that's where my problem is coming from.

I've tried changing the column type from DECIMAL to DECIMAL(10,2), NUMERIC, and even INTEGER, but the database seems to ignore the change and gives me the same results with many decimal places.

Since I'm dealing with money, a lack of precision is not really welcome, so how can I stop the queries from returning imprecise results?
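The behavior is easy to reproduce and to work around with `ROUND()`. A minimal sketch (assuming Python's built-in `sqlite3` module and the table/column names from the question; the inserted amounts are made up for illustration):

```python
import sqlite3

# In-memory database mirroring the "income" table from the question.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE income (value DECIMAL(10,2))")

# Many two-decimal amounts; 0.10 has no exact binary (double) representation,
# so the accumulated SUM can pick up floating-point drift.
con.executemany("INSERT INTO income (value) VALUES (?)", [(0.10,)] * 10000)

(raw,) = con.execute("SELECT SUM(value) FROM income").fetchone()
(rounded,) = con.execute("SELECT ROUND(SUM(value), 2) FROM income").fetchone()

print(raw)      # may show drift past two decimal places
print(rounded)  # 1000.0
con.close()
```

Rounding in the query only masks the underlying representation issue, but it is enough when all inputs genuinely have at most two decimal places.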

  • It's just normal IEEE 754 floating point behavior. Many numbers cannot be represented exactly and the closest approximation is used. Look into `round()` or `printf()` if you don't want full precision in the results. – Shawn Apr 04 '19 at 01:08
  • The classic q&a on the topic: https://stackoverflow.com/questions/588004/is-floating-point-math-broken – Shawn Apr 04 '19 at 01:11
  • Also read https://www.sqlite.org/datatype3.html – Shawn Apr 04 '19 at 01:15
  • I saw those links before and I get it, but is there no way to solve this other than fixing the final result? With many more rows than in my case, this could make wrong values appear. I understand that's how floating point works, but it really seems wrong to me that the database gives me imprecise data when it should keep it consistent. – Wallace Baleroni Apr 04 '19 at 17:08
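The usual way to avoid the drift entirely, rather than rounding it away, is to store money as an integer number of cents: SQLite integer arithmetic is exact, so `SUM()` never loses precision. A sketch assuming Python's `sqlite3` and `decimal` modules (the column name `value_cents` and the sample amounts are invented for illustration):

```python
import sqlite3
from decimal import Decimal

con = sqlite3.connect(":memory:")
# INTEGER column: amounts are stored as whole cents, never as REAL.
con.execute("CREATE TABLE income (value_cents INTEGER)")

# Parse decimal strings with Decimal so 19.99 becomes exactly 1999 cents.
amounts = ["19.99", "0.01", "1532.50"]
con.executemany("INSERT INTO income (value_cents) VALUES (?)",
                [(int(Decimal(a) * 100),) for a in amounts])

# Integer SUM is exact; convert back to a decimal amount only for display.
(total_cents,) = con.execute("SELECT SUM(value_cents) FROM income").fetchone()
print(Decimal(total_cents) / 100)  # 1552.5
con.close()
```

This sidesteps the affinity problem from the question: declaring the column DECIMAL(10,2) cannot help, because SQLite has no fixed-point decimal type and stores such values as REAL, while an INTEGER column keeps the data exact end to end.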

0 Answers