I am currently experimenting with the nullable reference type feature (introduced in C# 8.0) on .NET 5 / C# 9.0, and I noticed that the compiler correctly tracks the nullability of out parameters in methods like TryParse and TryGetValue. For example:
var dict = new Dictionary<int, SomeClass>();
int key = 42;
if (!dict.TryGetValue(key, out SomeClass? result)) {
    dontUseValue(result); // C# annotates this with: 'result' may be null here
    return;               // early exit; without it the next line would still warn
}
useValue(result); // C# annotates this with: 'result' is not null here
While this is of course true for this specific class (and for the try-get pattern in general), how does C# infer that the result is non-null only when the return value is true?
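Peeking at the declaration my IDE shows for Dictionary<TKey, TValue>.TryGetValue on .NET 5, the out parameter seems to carry an attribute from System.Diagnostics.CodeAnalysis; paraphrased from memory (so the exact shape may differ), it looks roughly like this:

    public bool TryGetValue(TKey key, [MaybeNullWhen(false)] out TValue value);
    // my reading: 'value' may be null only on the code path where the call returned false

I assume this attribute is what drives the flow analysis, but I don't know how far it can be bent.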
For example, if I were to create my own class with a TryGetValue method that behaves differently, how could I tell the compiler under which conditions the out parameter will be non-null?
class DictionaryLike<TRes> where TRes : class, new() {
    private bool predicate(int key) => key > 0; // placeholder for my actual condition

    public bool TryGetValue(int key, out TRes? result) {
        // Note the inverted behaviour: true => 'result' is null, false => 'result' is non-null.
        // How do I tell C# that the behaviour is inverted?
        if (predicate(key)) { result = null; return true; }
        result = new TRes(); return false;
    }
}
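My guess is that the answer again involves the attributes in System.Diagnostics.CodeAnalysis, but I'm not sure which one (if any) can express this inverted contract. Purely as an untested sketch (the attribute choice here is just my assumption), I imagined something like:

    using System.Diagnostics.CodeAnalysis;

    class DictionaryLike<TRes> where TRes : class, new() {
        private bool predicate(int key) => key > 0; // same placeholder condition as above

        // Guess: [NotNullWhen(false)] would promise callers that 'result'
        // is non-null exactly when the method returns false.
        public bool TryGetValue(int key, [NotNullWhen(false)] out TRes? result) {
            if (predicate(key)) { result = null; return true; }
            result = new TRes(); return false;
        }
    }

Is that the right mechanism, or does the compiler special-case the Try pattern somehow?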
Sorry if this is a stupid question, but I genuinely can't figure out how the Try pattern gets this information across to the compiler.