Do redundant casts get optimized?

Asked by 死守一世寂寞 · 2020-12-30 18:35

I am updating some old code and have found several instances where the same object is cast repeatedly, each time one of its properties or methods needs to be called. Do these redundant casts get optimized away?

4 Answers
  •  感情败类
    2020-12-30 19:34

    The cast is not optimized away in the IL, in either debug or release builds.

    A simple C# test:

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Text;
    
    namespace RedundantCastTest
    {
        class Program
        {
            static object get()
            { return "asdf"; }
    
            static void Main(string[] args)
            {
                object obj = get();
                if ((string)obj == "asdf")
                    Console.WriteLine("Equal: {0}, len: {1}", obj, ((string)obj).Length);
            }
        }
    }
    

    Corresponding IL (note the multiple castclass instructions):

    .method private hidebysig static void Main(string[] args) cil managed
    {
        .entrypoint
        .maxstack 3
        .locals init (
            [0] object obj,
            [1] bool CS$4$0000)
        L_0000: nop 
        L_0001: call object RedundantCastTest.Program::get()
        L_0006: stloc.0 
        L_0007: ldloc.0 
        L_0008: castclass string
        L_000d: ldstr "asdf"
        L_0012: call bool [mscorlib]System.String::op_Equality(string, string)
        L_0017: ldc.i4.0 
        L_0018: ceq 
        L_001a: stloc.1 
        L_001b: ldloc.1 
        L_001c: brtrue.s L_003a
        L_001e: ldstr "Equal: {0}, len: {1}"
        L_0023: ldloc.0 
        L_0024: ldloc.0 
        L_0025: castclass string
        L_002a: callvirt instance int32 [mscorlib]System.String::get_Length()
        L_002f: box int32
        L_0034: call void [mscorlib]System.Console::WriteLine(string, object, object)
        L_0039: nop 
        L_003a: ret 
    }
    

    Neither is it optimized from the IL in the release build:

    .method private hidebysig static void Main(string[] args) cil managed
    {
        .entrypoint
        .maxstack 3
        .locals init (
            [0] object obj)
        L_0000: call object RedundantCastTest.Program::get()
        L_0005: stloc.0 
        L_0006: ldloc.0 
        L_0007: castclass string
        L_000c: ldstr "asdf"
        L_0011: call bool [mscorlib]System.String::op_Equality(string, string)
        L_0016: brfalse.s L_0033
        L_0018: ldstr "Equal: {0}, len: {1}"
        L_001d: ldloc.0 
        L_001e: ldloc.0 
        L_001f: castclass string
        L_0024: callvirt instance int32 [mscorlib]System.String::get_Length()
        L_0029: box int32
        L_002e: call void [mscorlib]System.Console::WriteLine(string, object, object)
        L_0033: ret 
    }
    

    Neither case means that the casts survive when native code is generated; to know for sure you'd need to look at the actual machine assembly, e.g. by running ngen and disassembling the result. I'd be greatly surprised if the redundant cast weren't optimized away there.

    Regardless, I'll cite The Pragmatic Programmer and the broken windows theory: when you see a broken window, fix it.
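
    In that spirit, a minimal sketch of the obvious fix (my own rewrite of the test program above, not code from the question): cast once into a typed local and reuse it, so the IL contains a single castclass instead of one per use site.

    ```csharp
    using System;

    namespace RedundantCastTest
    {
        class Program
        {
            static object get()
            { return "asdf"; }

            static void Main(string[] args)
            {
                object obj = get();

                // Cast once and keep the typed reference in a local.
                // The IL now emits a single castclass; every later use
                // of 's' is an ordinary ldloc with no type check.
                string s = (string)obj;
                if (s == "asdf")
                    Console.WriteLine("Equal: {0}, len: {1}", s, s.Length);
            }
        }
    }
    ```

    Beyond the (probably negligible) cost, the single cast is also safer to maintain: if the target type ever changes, there is one cast to update rather than several scattered ones.
    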
