
If you have code like this:

    class Program
    {
        static void Main(string[] args)
        {
            Test test = new Test();
            test.PropOne = 123;
            test.PropTwo = "testing";

            ModifyTestClassWithRef(ref test);
        }

        public static void ModifyTestClassWithRef(ref Test test)
        {
            test.PropOne++;
            test.PropTwo += "_abc";
        }
    }

    public class Test
    {
        public int PropOne { get; set; }
        public string PropTwo { get; set; }
    }

Why does the compiler not provide a warning that `ref` is unnecessary, since it is unnecessary in this particular situation?

David Klempfner
  • The `ref` is not "unnecessary"; it changes the behavior: http://stackoverflow.com/questions/961717/what-is-the-use-of-ref-for-reference-type-variables-in-c and also http://www.yoda.arachsys.com/csharp/parameters.html – Eric J. Mar 09 '16 at 05:18
  • I'm asking about the code in question. In this situation "ref" is certainly unnecessary. Why does the compiler not provide a warning since I am not changing the actual reference within the method? – David Klempfner Mar 09 '16 at 05:21
  • Eric Lippert has a wonderful blog post about that somewhere (googled, couldn't find it). The core answer is that someone would have to specify, code, QA, and document that feature. That potential requirement competes with hundreds of other requirements. – Eric J. Mar 09 '16 at 05:22
  • Compiler features are not all implemented by default, to be removed only when someone comes up with a reason not to implement them. It is the reverse: all features are unimplemented until there is compelling evidence that a feature is worth implementing, over every single other possible feature. This one didn't make the cut for the same reason hundreds of thousands of other features didn't: another feature was considered more valuable. – Servy Mar 09 '16 at 05:24
  • There's a good likelihood of false positives with this warning. For example, an implementation of an interface method that accepts `ref` is forced to declare it with `ref` even if it never changes the ref. Or maybe you want to reserve the right to change the `ref` in a future version of the method, although you don't intend to take advantage of it right now. Or a derived class is going to override it, and the derived class will change the ref. Or the method is invoked via reflection, and the reflected invoke passes a ref. A high likelihood of false positives is bad news for a compiler warning. – Raymond Chen Mar 09 '16 at 05:26
  • Why don't you post these comments as answers so I can select as the accepted answer? – David Klempfner Mar 09 '16 at 05:29
  • Whoever voted to close obviously has not read my question properly. "ref" is necessary, but NOT in every situation. I have provided a concrete example of where it is NOT necessary. – David Klempfner Mar 09 '16 at 05:39
    I wrote an article describing the conditions under which a warning might be considered. Your proposed warning does not meet a couple of the criteria. https://blogs.msdn.microsoft.com/ericlippert/2011/03/03/danger-will-robinson/ – Eric Lippert Mar 09 '16 at 06:07
    In particular: a warning should be for code that is almost certainly *wrong*. That the code actually does something *different* than the developer of the code intended it to do. In your example, what is the *harm* caused by the unnecessary ref? If there's no harm, then why should there be a warning? – Eric Lippert Mar 09 '16 at 06:09
  • The actual result is not affected. However the "harm" could be that a beginner programmer reading this code might think that ref is necessary to have changes to an object's properties persist outside the method. – David Klempfner Mar 09 '16 at 06:11
  • But most important of all: the compiler team does not have to provide a justification for *not* doing a feature. Rather, features must be justified by their compelling benefits. I'm not seeing the compelling benefit of your proposed feature; surely the compiler team could spend their time on more valuable feature work, bug fixing, and so on. – Eric Lippert Mar 09 '16 at 06:11
  • Why did I get down voted? What is wrong with my question? – David Klempfner Mar 09 '16 at 06:33
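The distinction the comments draw only shows up when the method reassigns the parameter itself: without `ref`, the reassignment rebinds a local copy of the reference, while with `ref` it replaces the caller's variable. A minimal sketch of that difference (the `Box` class and method names here are illustrative, not from the question):

    using System;

    public class Box
    {
        public int Value { get; set; }
    }

    public static class RefDemo
    {
        // Without ref: 'box' is a copy of the caller's reference, so
        // reassigning it only rebinds the local; the caller is untouched.
        public static void ReplaceWithoutRef(Box box)
        {
            box = new Box { Value = 999 };
        }

        // With ref: 'box' is an alias for the caller's variable, so
        // reassigning it changes what the caller sees.
        public static void ReplaceWithRef(ref Box box)
        {
            box = new Box { Value = 999 };
        }

        public static void Main()
        {
            Box b = new Box { Value = 1 };

            ReplaceWithoutRef(b);
            Console.WriteLine(b.Value); // prints 1

            ReplaceWithRef(ref b);
            Console.WriteLine(b.Value); // prints 999
        }
    }

Mutating members, as `ModifyTestClassWithRef` in the question does, persists either way, which is exactly why the `ref` there has no observable effect.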

1 Answer


There shouldn't be a warning for the issue you raise. A lot of developers go to the trouble of understanding and clearing all their warnings, and some teams even elevate their warnings to errors. If an interface or abstract base class demands that a method take a `ref` parameter, and a particular implementation never reassigns it, the implementer would get the warning and would have to suppress it explicitly. I'm certain that would be unsatisfactory in a lot of shops.

An `out` parameter, on the other hand, carries an explicit guarantee that the implementer sets the parameter - the code won't even compile without assigning it. If that's the behavior you want, then an `out` parameter is your friend.
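That compile-time guarantee can be seen in a standard try-pattern method: every return path must assign the `out` parameter first. A small sketch (the `TryHalve` name is made up for illustration):

    using System;

    public static class OutDemo
    {
        // The compiler enforces definite assignment of 'result' on every
        // path; deleting the 'result = 0;' line below is error CS0177.
        public static bool TryHalve(int value, out int result)
        {
            if (value % 2 == 0)
            {
                result = value / 2;
                return true;
            }
            result = 0;
            return false;
        }

        public static void Main()
        {
            if (TryHalve(10, out int half))
                Console.WriteLine(half); // prints 5
        }
    }

No analogous guarantee exists for `ref`, which is part of why "the ref is never reassigned" is hard for the compiler to treat as suspicious.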

Clay