We are seeing some strange behaviour with decimals being displayed to the console. This is best described by the following code:
string num1AsString = "1.0000";
decimal num1AsDecimal = decimal.Parse(num1AsString);
string num2AsString = "1";
decimal num2AsDecimal = decimal.Parse(num2AsString);
Console.WriteLine(num1AsDecimal);
Console.WriteLine(num2AsDecimal);
The output to the Console is:
1.0000
1
However, num1AsDecimal and num2AsDecimal both show up in the debugger as just 1. How is num1AsDecimal preserving the .0000? It seems to be somehow holding onto its string representation. Is there some kind of boxing going on here?
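For what it's worth, here is a minimal sketch of how I tried to inspect the two values with decimal.GetBits (the bit layout I assume below, with the scale packed into bits 16-23 of the fourth element, is taken from the documentation, so treat it as an assumption rather than part of the original repro):

int[] bits1 = decimal.GetBits(num1AsDecimal);
int[] bits2 = decimal.GetBits(num2AsDecimal);
// Low 32 bits of the 96-bit integer part.
Console.WriteLine(bits1[0]); // 10000
Console.WriteLine(bits2[0]); // 1
// Assumed: the scale is stored in bits 16-23 of the fourth element.
Console.WriteLine((bits1[3] >> 16) & 0xFF); // 4
Console.WriteLine((bits2[3] >> 16) & 0xFF); // 0

So the two values appear to carry different internal data even though they compare equal.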
Billy Stack