I have a dataset where every row's percentage is allocated across three categories. I would like to add an "Unclassified" category, which simply accounts for the leftover % that is not allocated to any of the three provided categories.
Categories A, B, and C are all stored as float.
SELECT
    [Category A],
    [Category B],
    [Category C],
    [Category A] + [Category B] + [Category C] AS [CatTotal],
    1 - ([Category A] + [Category B] + [Category C]) AS [Unclassified]
FROM Dataset
Yields
| Category A | Category B | Category C | Category Total | Unclassified (1 - Category Total) |
| --- | --- | --- | --- | --- |
| 0 | 0.333333333333333 | 0.666666666666667 | 1 | 2.22044604925031E-16 |
What's strange is that if I recreate the exact values of Category A, B, and C in a temp table, also stored as float, and then run the same query, I get 0 as expected. What is happening here?
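For reference, the temp-table test looked something like this (a rough sketch; the values are re-entered as the literals displayed above, and #Dataset is just a stand-in name):

CREATE TABLE #Dataset (
    [Category A] float,
    [Category B] float,
    [Category C] float
);

-- Re-enter the values exactly as they display in the original dataset
INSERT INTO #Dataset ([Category A], [Category B], [Category C])
VALUES (0, 0.333333333333333, 0.666666666666667);

-- Same calculation as above, but here [Unclassified] comes back as 0
SELECT
    [Category A] + [Category B] + [Category C] AS [CatTotal],
    1 - ([Category A] + [Category B] + [Category C]) AS [Unclassified]
FROM #Dataset;

DROP TABLE #Dataset;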