Perhaps my Google-Fu has failed me, but I haven't been able to determine whether a nullable that is null in .NET is guaranteed to compare as less than anything else.
I've got some code similar to this:

MyClass findLatest(List<MyClass> items) {
    DateTime? latest_tstamp = null;
    MyClass latest_item = null;
    foreach (var item in items) {
        if (latest_tstamp < item.tstamp) {
            latest_tstamp = item.tstamp;
            latest_item = item;
        }
    }
    return latest_item;
}
It has seemed to work in the few limited cases I've tried (item.tstamp is declared as DateTime? tstamp as well, of course).
Is this guaranteed behavior?
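To put it another way, the comparison I'm unsure about boils down to something like this (latest and current are just illustrative names; both are DateTime?, matching the code above):

    DateTime? latest = null;
    DateTime? current = DateTime.Now;

    // Is this guaranteed to be false while latest is null,
    // or can null ever be treated as "less than" an actual value?
    bool shouldReplace = latest < current;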
Conclusion(?)
Based on the answers (and Jon Skeet's [answer on another question]), I've gone with the following check:
if (item.tstamp != null &&
    (latest_tstamp == null || latest_tstamp < item.tstamp)) {
    // do stuff
}
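For anyone landing here later, here's a minimal sketch (variable names made up for illustration) of why both null checks are needed: in C#, the lifted <, >, <=, and >= operators return false whenever either operand is null, which is why the extra latest_tstamp == null test is there.

    DateTime? none = null;
    DateTime? some = new DateTime(2024, 1, 1);

    Console.WriteLine(none < some);   // False: lifted < is false if either operand is null
    Console.WriteLine(some > none);   // False, for the same reason
    Console.WriteLine(none <= some);  // Also False
    Console.WriteLine(none == null);  // True: only == and != treat null specially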