Possible Duplicate:
T-SQL Decimal Division Accuracy
In SQL Server 2005, if I run the following query:
select cast(1 as decimal(38,18))/cast(150 as decimal(38,18))
it returns 0.006666 (6 decimal places).
However, if I run:
select cast(1 as decimal(24,18))/cast(150 as decimal(24,18))
it returns 0.00666666666666 (14 decimal places).
Could anyone please explain these results?
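
For reference, here is a small query (a sketch using SQL_VARIANT_PROPERTY, which I believe is available in SQL Server 2005) that reports the precision and scale the server infers for each division result, so the difference between the two cases can be seen directly:

-- Inspect the inferred precision and scale of each division expression
SELECT
    SQL_VARIANT_PROPERTY(cast(1 as decimal(38,18)) / cast(150 as decimal(38,18)), 'Precision') AS precision_38,
    SQL_VARIANT_PROPERTY(cast(1 as decimal(38,18)) / cast(150 as decimal(38,18)), 'Scale')     AS scale_38,
    SQL_VARIANT_PROPERTY(cast(1 as decimal(24,18)) / cast(150 as decimal(24,18)), 'Precision') AS precision_24,
    SQL_VARIANT_PROPERTY(cast(1 as decimal(24,18)) / cast(150 as decimal(24,18)), 'Scale')     AS scale_24;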