I'm using a database to store monetary values in an application I'm developing. Although every row of the table stores a decimal number with at most two decimal places, a SELECT SUM(value) FROM income
returns a number with more than ten decimal places (15327.880000000005).
I discovered that SQL (or at least SQLite) implicitly converts datatypes in queries, so it's probably converting to REAL, and apparently that's where my problem is coming from.
I've tried changing the column type from DECIMAL to DECIMAL(10,2), NUMERIC and even INTEGER, but the database seems to ignore it and gives me the same results with many decimal places.
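To confirm what SQLite actually stores, here's a minimal check using Python's sqlite3 and typeof() (the table names t_decimal and t_integer are just for the demo, not my real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A column declared DECIMAL(10,2) gets NUMERIC affinity in SQLite,
# and a fractional value like 12.34 ends up stored as a binary REAL.
conn.execute("CREATE TABLE t_decimal (value DECIMAL(10,2))")
conn.execute("INSERT INTO t_decimal VALUES (12.34)")
stored_as = conn.execute("SELECT typeof(value) FROM t_decimal").fetchone()[0]
print(stored_as)  # prints "real", not "decimal"

# Even declaring the column INTEGER doesn't help: 12.34 can't be
# converted to an integer without loss, so it's stored as REAL anyway.
conn.execute("CREATE TABLE t_integer (value INTEGER)")
conn.execute("INSERT INTO t_integer VALUES (12.34)")
print(conn.execute("SELECT typeof(value) FROM t_integer").fetchone()[0])  # "real"
```

So whatever type name I declare, the fractional values are kept as binary floating point, which would explain the error accumulating in SUM().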
As I'm dealing with money, this lack of precision is not really welcome, so how can I stop the queries from returning imprecise results?