
I am currently experimenting with the nullable reference types feature of C# 8.0 (on .NET 5 / C# 9.0), and I noticed that the compiler correctly tracks the nullability of out parameters in methods like TryParse and TryGetValue. For example:

var dict = new Dictionary<int, SomeClass>();
if (!dict.TryGetValue(key, out SomeClass? result))
    dontUseValue(result);  // C# annotates this with: 'result' may be null here
else
    useValue(result);      // C# annotates this with: 'result' is not null here

While this is of course true for this specific class (and the Try pattern in general), how does C# infer that the result is non-null only when the return value is true?

For example, if I were to create my own class with a TryGetValue method with custom behaviour, how could I tell the compiler when the out parameter will be non-null?

class DictionaryLike<TRes> where TRes : class, new() {
    public bool TryGetValue(int key, out TRes? result) {
        if (predicate(key)) { result = null; return true; }  // predicate: some arbitrary condition
        result = new TRes(); return false;
        // How do I tell C# that the behaviour is inverted?
    }
}

Sorry if this is a stupid question, but I genuinely can't figure out how the Try pattern works here.

Michael Chen
    Because it's annotated with `[NotNullWhen(returnValue: true)]`, or in your case, should be with `false`, see duplicate. – CodeCaster Feb 10 '21 at 10:49
  • The attributes CodeCaster mentions are documented here: https://learn.microsoft.com/en-us/dotnet/csharp/language-reference/attributes/nullable-analysis – Heinzi Feb 10 '21 at 10:50
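
Based on the [NotNullWhen] attribute mentioned in the comments, a minimal sketch of how the inverted behaviour could be annotated might look like the following (the Predicate helper is a placeholder for the condition the question leaves unspecified):

using System.Diagnostics.CodeAnalysis;

class DictionaryLike<TRes> where TRes : class, new() {
    // [NotNullWhen(false)] tells the nullable flow analysis that 'result'
    // is guaranteed to be non-null whenever this method returns false.
    public bool TryGetValue(int key, [NotNullWhen(false)] out TRes? result) {
        if (Predicate(key)) { result = null; return true; }
        result = new TRes();
        return false;
    }

    // Placeholder for the question's unspecified condition.
    private static bool Predicate(int key) => key < 0;
}

With that annotation, callers get the inverted analysis: inside `if (!obj.TryGetValue(key, out var result)) { ... }` the compiler treats `result` as not null, because the method returned false.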

0 Answers