From Do redundant casts get optimized? I can see that the compiler doesn't optimize an unnecessary downcast (i.e. castclass) away. But now I am interested in a simpler case: does the compiler optimize an unnecessary upcast away? This question only concerns reference types, not boxing.
It seems to me that an upcast doesn't produce any IL, and hence a redundant explicit upcast doesn't cost anything at all. Or, because the IL instructions are typeless, is there still a performance cost for a redundant explicit upcast behind the scenes?

Or would an upcast produce IL instructions in some cases?
class Foo
{
}
class Bar : Foo
{
}
bool Test(object x)
{
return x == null;
}
void Main()
{
var x = new Bar();
Console.Write(Test((Foo)x)); // redundant explicit upcast to Foo, then implicit upcast to object
var y = new Bar();
Console.Write(Test(y)); // implicit upcast to object only
}
This is the IL generated for Main:

IL_0000: newobj UserQuery+Bar..ctor
IL_0005: stloc.0 // x
IL_0006: ldarg.0
IL_0007: ldloc.0 // x
IL_0008: call UserQuery.Test
IL_000D: call System.Console.Write
IL_0012: newobj UserQuery+Bar..ctor
IL_0017: stloc.1 // y
IL_0018: ldarg.0
IL_0019: ldloc.1 // y
IL_001A: call UserQuery.Test
IL_001F: call System.Console.Write
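For comparison, here is a minimal sketch (reusing the Foo/Bar hierarchy above; the CastDemo, Up and Down names are just mine for illustration) of the asymmetry I'm asking about: the explicit downcast is kept as a runtime check, while the explicit upcast should compile to nothing.

class CastDemo
{
    // Upcast Bar -> Foo: the C# compiler emits no conversion instruction;
    // the reference is passed through unchanged.
    static Foo Up(Bar b) => (Foo)b;

    // Downcast Foo -> Bar: the compiler emits castclass, which is a runtime
    // type check and throws InvalidCastException if the object isn't a Bar.
    static Bar Down(Foo f) => (Bar)f;
}

Disassembling the build (e.g. with ildasm) should show only ldarg.0 / ret in Up, and an extra castclass instruction in Down.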