
We are finding some strange behaviour with decimals being displayed to the console. This is best described by the following code:

string num1AsString = "1.0000";
decimal num1AsDecimal = decimal.Parse(num1AsString);

string num2AsString = "1";
decimal num2AsDecimal = decimal.Parse(num2AsString);

Console.WriteLine(num1AsDecimal);
Console.WriteLine(num2AsDecimal);

The output to the Console is:

1.0000
1

However, both num1AsDecimal and num2AsDecimal show up in the debugger as just 1. How is num1AsDecimal somehow preserving the .0000? It seems to be holding onto its string representation somehow. Is there some kind of boxing going on here?

Billy Stack


3 Answers


Here is a nice explanation from Jon Skeet (link); see the "Keeping zeros" section.


We are finding some strange behaviour with the decimals being displayed to the console.

It's not strange at all. It's documented:

The scaling factor also preserves any trailing zeroes in a Decimal number. Trailing zeroes do not affect the value of a Decimal number in arithmetic or comparison operations. However, trailing zeroes can be revealed by the ToString method if an appropriate format string is applied.

In other words, it's the debugger behaviour which is slightly strange here, not ToString(). In the debugger if you watch num1AsDecimal.ToString(), that should preserve the trailing zeroes too.
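
As a quick illustration (a minimal sketch): the two values compare equal, and decimal.GetBits exposes the scale factor the documentation refers to, stored in bits 16-23 of the fourth element.

decimal a = decimal.Parse("1.0000");
decimal b = decimal.Parse("1");

Console.WriteLine(a == b);   // True - trailing zeros don't affect comparison
Console.WriteLine(a);        // 1.0000
Console.WriteLine(b);        // 1

// Extract the scale factor (the power of 10 the integer value is divided by)
// from the fourth element returned by decimal.GetBits.
Console.WriteLine((decimal.GetBits(a)[3] >> 16) & 0xFF);   // 4
Console.WriteLine((decimal.GetBits(b)[3] >> 16) & 0xFF);   // 0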

Jon Skeet
  • Thanks for the answer. In C# 1 the trailing zeros would not have been displayed. I can't understand why they changed the decimal's default display format behaviour. After all, we have ways to easily display decimals padded with the desired number of decimal places if required. – bstack May 17 '12 at 10:44
  • @bstack: That's assuming you *know* the number of decimal places to use. This is preserving more information about the value - how many decimal places you should consider to be significant. I believe the change was made to bring it in line with international standards - and presumably because it was useful... – Jon Skeet May 17 '12 at 11:08

A footnote on the MSDN page for Decimal.Parse says:

The decimal result of decimal.Parse will have the same number of significant digits as the string it was parsed from. That is, "3.43" and "3.4300" while parsing to the same numeric value, result in different decimal representations. This is (somewhat) accounted for in the description of the binary representation of a decimal.
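
To see that behaviour directly, here is a minimal sketch using the values from the footnote; a standard format string can override the stored scale when a fixed display is wanted.

decimal x = decimal.Parse("3.43");
decimal y = decimal.Parse("3.4300");

Console.WriteLine(x == y);              // True - same numeric value
Console.WriteLine(x);                   // 3.43
Console.WriteLine(y);                   // 3.4300

// A format string controls the displayed precision regardless of the
// scale the value was parsed with.
Console.WriteLine(y.ToString("0.##"));  // 3.43
Console.WriteLine(x.ToString("F4"));    // 3.4300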

Tudor