
Just out of curiosity: why can I assign 0.0 to a variable that is of an enumeration type, but not 1.0? Have a look at the following code:

public enum Foo
{
    Bar,
    Baz
}

class Program
{
    static void Main()
    {
        Foo value1 = 0.0;
        Foo value2 = 1.0;   // This line does not compile
        Foo value3 = 4.2;   // This line does not compile
    }
}

I thought conversions between numeric types and enumeration values were only allowed via casts? That is, I could write `Foo value2 = (Foo) 1.0;` so that the second assignment in `Main` compiles. Why is there an exception for the value 0.0 in C#?
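For reference, the explicit cast forms do compile (a quick sketch using the same `Foo` enum):

```csharp
Foo value2 = (Foo) 1.0;   // compiles: explicit conversion, becomes (Foo) 1
Foo value3 = (Foo) 4.2;   // compiles as well; the fractional part is truncated
```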

feO2x
  • For me it's strange that you **can** assign the double literal 0.0 to a custom enum, not that you **can't** assign the `1.0` literal to a custom enum. – Ilya Ivanov May 20 '14 at 14:26
  • I suspect the compiler is treating it as `0` instead. I had a similar question once and Rawling posted a great answer [here](http://stackoverflow.com/a/11086374/1193596). – Amicable May 20 '14 at 14:26
  • [IdeOne](http://ideone.com/G3XGPV) does not compile it. – 001 May 20 '14 at 14:28
  • Related question: [Strange (possibly wrong?) C# compiler behavior with method overloading and enums](http://stackoverflow.com/questions/3153841/strange-possibly-wrong-c-sharp-compiler-behavior-with-method-overloading-and) – CodesInChaos May 20 '14 at 18:17

4 Answers


Jon's answer is correct. I would add to it the following points.

  • I caused this silly and embarrassing bug. Many apologies.

  • The bug was caused by me misunderstanding the semantics of an "expression is zero" predicate in the compiler; I believed it was checking only for integer zero equality, when in fact it was checking for more along the lines of "is this the default value of this type?" In fact, in an earlier version of the bug it was actually possible to assign the default value of any type to an enum! It is now only default values of numbers. (Lesson: Name your helper predicates carefully.)

  • The behaviour I was attempting to implement that I messed up was in fact a workaround for a slightly different bug. You can read the whole terrible story here: https://learn.microsoft.com/en-us/archive/blogs/ericlippert/the-root-of-all-evil-part-one and https://learn.microsoft.com/en-us/archive/blogs/ericlippert/the-root-of-all-evil-part-two (Lesson: It is very easy to introduce new worse bugs while fixing old ones.)

  • The C# team decided to enshrine this buggy behaviour rather than fixing it because the risk of breaking existing code for no compelling benefit was too high. (Lesson: get it right the first time!)

  • The code I wrote in Roslyn to preserve this behaviour can be found in the method IsConstantNumericZero in https://github.com/dotnet/roslyn/blob/master/src/Compilers/CSharp/Portable/Binder/Semantics/Conversions/ConversionsBase.cs -- see it for more details of what exactly the Roslyn behaviour is. I wrote almost all the code in the Conversions directory; I encourage you to read all of it as there are many interesting facts about how C# diverges from the specification in the comments. I decorated each with SPEC VIOLATION to make them easy to find.

One more point of interest: C# also allows any enum value to be used in an enum initializer regardless of its zeroness:

enum E { A = 1 }
enum F { B = E.A }  // ???

The spec is somewhat vague as to whether this should be legal or not, but again, as this has been in the compiler for a long time, the new compilers are likely to maintain the behaviour.

Eric Lippert
    This is really cool, I finally get to see the code you wrote. It's awesome that Roslyn source code is open source. Now I completely understand that there exist valid reasons (technical/legal) to not provide change history but it would have been super awesome to see the change history to see how the code evolved. – SolutionYogi May 20 '14 at 22:26
  • `The C# team decided to enshrine this buggy behaviour rather than fixing it because the risk of breaking existing code for no compelling benefit was too high.` I don't think there'd be many people that rely on this behavior, and it's one of these oddities that may have been better to just fix. However, it doesn't really do much harm either (except for projects implementing the spec). – Aidiakapi May 21 '14 at 17:04
    @Aidiakapi: Indeed, the number of people affected should be small; it is not zero. The C# team takes breaking changes very seriously. It's easy for *you* to say that it's better to make the fix; you don't have to deal with irate customers who call your vice president to complain that your trivial change that adds no benefit whatsoever delayed their system integration by a day. – Eric Lippert May 21 '14 at 18:40
    It gets worse. All such breaking changes will (ideally) be listed on Microsoft's Framework migration guide. The longer this list is, the more hesitant users are to migrate their application. So, even a minor breaking change causes: 1. A small number of applications to break. 2. A small number of users to refuse to upgrade (even if the issue doesn't affect them). 3. A small number of users to waste resources evaluating if the breaking change affects them. 4. Users from #1, #2, and #3 to complain to everyone else. – Brian May 21 '14 at 20:04
  • @EricLippert If "The spec is somewhat vague", wouldn't it make sense to update the spec? (Genuine question!) – James May 24 '14 at 20:17
  • @James: The vast majority of [Eric's blog post](http://blogs.msdn.com/b/ericlippert/archive/2006/03/29/the-root-of-all-evil-part-two.aspx) (already linked in Eric's answer) is devoted to discussing the pros and cons of updating the spec. – Brian Jun 03 '14 at 18:47

It's a bug that you can use 0.0. The compiler implicitly treats all constant expressions with a value of zero as just 0.

Now, it's correct for the compiler to allow an implicit conversion from a constant int expression of 0 to your enum as per section 6.1.3 of the C# 5 specification:

An implicit enumeration conversion permits the decimal-integer-literal 0 to be converted to any enum-type and to any nullable-type whose underlying type is an enum-type. In the latter case the conversion is evaluated by converting to the underlying enum-type and wrapping the result (§4.1.10).
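So, per that rule, both the plain and nullable forms accept the literal 0 (using the `Foo` enum from the question):

```csharp
Foo value = 0;    // implicit enumeration conversion from the decimal-integer-literal 0
Foo? maybe = 0;   // also permitted for the nullable form, per the quoted paragraph
```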

I've spoken with the C# team about this before: they'd have liked to remove the accidental conversion from 0.0 (and indeed 0.0m and 0.0f) to enum values, but unfortunately I gather it broke too much code - even though it should never have been allowed in the first place.

The Mono mcs compiler prohibits all of these floating point conversions, although it does allow:

const int Zero = 0;
...

SomeEnum x = Zero;

despite the fact that Zero is a constant expression, not a decimal-integer-literal.

I wouldn't be surprised to see the C# specification change in the future to allow any integer constant expression with a value of 0 (i.e. to mimic mcs), but I wouldn't expect the floating point conversions to ever officially be correct. (I've been wrong before about predicting the future of C#, of course...)
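Pulling the cases together (a sketch of the Microsoft compiler's behaviour, with `SomeEnum` as before):

```csharp
SomeEnum a = 0;       // allowed by the spec: decimal-integer-literal 0
SomeEnum b = 1 - 1;   // also compiles, although the spec only mentions the literal 0
SomeEnum c = 0.0;     // compiles only because of the bug described above
SomeEnum d = 1.0;     // does not compile: no implicit conversion
```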

Jon Skeet
    Per the spec, it's only meant to be the *literal* 0. So it should reject `1-1` - a constant `int` expression with a value of `0`. But as you observe, the compiler is out of line with the spec here. – Damien_The_Unbeliever May 20 '14 at 14:30
    `it broke too much code` - it's really hard to imagine any reasons to write such code. – Ilya Ivanov May 20 '14 at 14:32
  • @Damien_The_Unbeliever: Yes, I was just coming up with an example of that sort of thing (using a `const`, but it's the same sort of thing). – Jon Skeet May 20 '14 at 14:33
  • What I find interesting is that if the values are implicit, or the enum has an explicit 0 value, then the variable equates to that value. *But* if you set the values explicitly and have no zero value, the assignment works, yet the variable never evaluates to one of the enum values. – Obsidian Phoenix May 20 '14 at 14:34
    @ObsidianPhoenix: I'm not sure what you mean. It's exactly equivalent to: `SomeEnum x = (SomeEnum) 0;`. That's the case whether there's a named zero value or not. – Jon Skeet May 20 '14 at 14:35
  • If you use `enum Test { Foo = 0, Bar = 1 }` and assign `Test v1 = 0`, v1 will equate to `Foo`. However, if it's `{Foo = 1, Bar = 2}`, although the assignment works, you can never get a true from `v1 == Test.Foo` or `v1 == Test.Bar` – Obsidian Phoenix May 20 '14 at 14:36
    @ObsidianPhoenix: Well no, because the value of `Test.Foo` is 1, not 0... again, that's exactly the same as if you wrote `Test v1 = (Test) 0;` - and that behaviour holds for *any* value which isn't a named value in the enum. – Jon Skeet May 20 '14 at 14:38
  • @ObsidianPhoenix - you're always allowed to cast any value of the base type to the enum type without error - that's explicitly mentioned in the spec also – Damien_The_Unbeliever May 20 '14 at 14:42
  • @JonSkeet ah, never mind. For some reason I thought it actively errored on any other number that wasn't in the enum - but not 0. – Obsidian Phoenix May 20 '14 at 14:45
    @JonSkeet is it going to be fixed in Roslyn? – Max May 20 '14 at 15:22
  • @Max: I doubt it. Certainly the current build still allows those conversions. – Jon Skeet May 20 '14 at 15:31
    @Max: See my answer for details. – Eric Lippert May 20 '14 at 20:39
  • that's why I love stackoverflow – Max May 21 '14 at 07:34
  • The same bug is there with wide integers (not only floating point types). Given that `DayOfWeek` has `int` as its underlying type, this should not be legal from reading the spec: `const ulong w = 0ul; DayOfWeek d = w;` But it is allowed (and gives `Sunday` of course). There is no implicit conversion (not even in the constant-expression case) from `ulong` down to `int`. And of course `w` is not a _decimal-integer-literal_. – Jeppe Stig Nielsen May 21 '14 at 12:18

Enumerations in C# are by definition integral values. For consistency, C# shouldn't accept either of these assignments, but 0.0 is silently treated as the integral 0. This is probably a holdover from C, where the literal 0 was treated specially and could essentially take any given type – integer, floating point number, null pointer … you name it.

Konrad Rudolph

enum is really intended (in all languages that support it) to be a way to work with meaningful, unique labels rather than numeric values. Thus, in your example, you should only be using Bar and Baz when dealing with a Foo enumerated data type. You should never use (compare to, or assign) an integer, even though many compilers will let you get away with it (enums are usually integers internally), and in this case 0.0 is carelessly treated as a 0 by the compiler.

Conceptually, it should be all right to add an integer n to an enumerated value, to get n values further down the line, or to take val2-val1 to see how far apart they are, but unless the language specification explicitly allows this, I'd avoid it. (Think of an enumerated value as being like a C pointer, in the ways you can use it.) There's no reason enums couldn't be implemented with floating point numbers, and a fixed increment between them, but I haven't heard of this being done in any language.
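For what it's worth, C# does explicitly allow some of this: the specification defines addition of an integer to an enum value (yielding the enum type) and subtraction of two enum values (yielding the underlying type). A small sketch:

```csharp
enum Day { Mon, Tue, Wed, Thu, Fri }

Day d = Day.Mon + 2;          // enum + int is defined and yields the enum type (here Day.Wed)
int gap = Day.Fri - Day.Mon;  // enum - enum yields the underlying type (here 4)
```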

Phil Perry
  • I know that I shouldn't use enums that way in C# - but I found this brain teaser and wanted to know why 0.0 works but 1.0 does not. I knew it had to be something in the C# compiler because you can see that the IL code for `Foo v1 = 0.0;` is the same as for `Foo v2 = Foo.Bar;`. – feO2x May 21 '14 at 09:29