
Sometimes it is necessary to use awkward or implausible constructs in one's code in order to lend a helping hand to underpowered JIT compilers, since the powers that be in their wisdom decided that programs should be jitted before each use instead of being run through an optimising compiler once at build time or at installation time.

Are there any existing conventions for tagging these awkward/implausible constructs, analogous to the /*FALLTHRU*/ thing that hearkens back to lint?

These constructs have to be tagged for similar reasons that fall-through cases in switch statements (in C/C++) need to be tagged with /*FALLTHRU*/: less for the benefit of automated tools and more to protect them from well-meaning but short-sighted maintenance programmers (which could be ourselves in a few weeks, months, or years' time). Another benefit of tagging would be tuning one's built-in performance sense over time by seeing enough properly tagged code. Adding warnings in prose would create more noise, and it wouldn't be quite as effective.

A classic example that requires tagging would be:

class Foo
{
    int[] m_a;

    public void f ()
    {
        var a = m_a;
        int n = a.Length;

        // ...

        for (int i = 0; i < a.Length; ++i)
            a[i] = ...
    }
}

Some of the MS jitters are able to create and manage the local shadow variable for the member array on their own, but none does so reliably all of the time. Also, at least some of them eliminate the bounds check in the loop body only if the loop condition tests against the .Length member of the very array accessed in the loop body, not if it tests against a cached value (n in the example). See also the topic Array bounds check efficiency in .net 4 and above.

Hence the shadow variable a needs to be tagged as 'necessity proven by benchmark, not premature optimisation based on prejudice', and the use of a.Length instead of n needs to be tagged as 'don't touch this lest the wrath of $me be upon you'.
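For lack of an established convention, here is a minimal sketch of one possible home-grown tagging scheme applied to the example above. The PERF[...] tag names are invented for illustration (no standard exists, hence the question), and the array initialisation plus loop body are filled in only so the snippet stands alone:

```csharp
using System;

class Foo
{
    int[] m_a = new int[16];   // initialised here only so the sketch runs standalone

    public void f ()
    {
        // PERF[shadow-copy]: local copy of m_a enables bounds-check elimination;
        // necessity proven by benchmark, not premature optimisation.
        var a = m_a;
        int n = a.Length;

        // ...

        // PERF[length-test]: the loop MUST test against a.Length, not n, or the
        // jitter keeps the per-iteration bounds check. Don't touch.
        for (int i = 0; i < a.Length; ++i)
            a[i] = i;
    }
}
```

A grep for `PERF[` would then surface every tagged spot before a performance-sensitive refactoring, much as lint once scanned for /*FALLTHRU*/.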

The reason I'm sensitive to this is that I can often squeeze enough performance out of tuned safe/managed C# so that I don't have to allow unsafe code or fart around with DLLs written in C++. However, these tunings are somewhat fragile. Also, creeping losses of 5% performance here and 10% there can be difficult to detect, but in my image processing code they can add up to many minutes or even hours over a workday. The difficulty of detecting such regressions is made worse by the fact that the quality of the input images has a tendency to drift, and it can vary considerably from one client to the next and also from one day to the next for the same client.

Hence the explicit tagging as an insurance measure against unnecessary performance loss (there are other measures but they aren't germane to the tagging question).

P.S.: there are other areas where tagging is required, such as the use of signed integers instead of unsigned ones because newer (!) jitters cannot turn unsigned division by constants into multiplication by the reciprocal as they do for signed integers. Some tight loops can become several times as fast if signed integers are used, even if logic would demand unsigned integers and even though signed semantics in general tend to carry a certain overhead compared to unsigned math.
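A minimal sketch of the signed-vs-unsigned pairing described above; the function names and the constant 10 are chosen arbitrarily for illustration, and whether the jitter actually strength-reduces either division must be verified against a disassembly or a benchmark on the concrete runtime, which this snippet does not attempt:

```csharp
using System;

static class DivDemo
{
    // Signed division by a constant: the jitter can replace the div with a
    // multiply-by-reciprocal sequence, per the observation in the text.
    public static int SumOfTenths(int[] data)
    {
        int sum = 0;
        var a = data;                       // PERF[shadow-copy] as in the earlier example
        for (int i = 0; i < a.Length; ++i)
            sum += a[i] / 10;
        return sum;
    }

    // Unsigned division by the same constant: per the text, newer jitters may
    // emit an actual division instruction here. In real code the deliberate
    // choice of int over uint is exactly the kind of thing that needs a tag.
    public static uint SumOfTenthsUnsigned(uint[] data)
    {
        uint sum = 0;
        var a = data;
        for (int i = 0; i < a.Length; ++i)
            sum += a[i] / 10u;
        return sum;
    }
}
```

Both functions compute the same values; only the generated machine code (and hence the throughput of a tight loop) differs between the two.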

DarthGizka
  • *"... instead being run through an optimising compiler once at build time or at installation time"*. You can do that: just add an [`ngen.exe install`](https://msdn.microsoft.com/en-us/library/6t9t5wcf(v=vs.110).aspx) step to the list of actions your installer performs, to generate the jitted assemblies at install time. – Scott Chamberlain Sep 25 '16 at 15:18
  • @Scott: `ngen` jits ahead of time, but it cannot replace a real compiler of the calibre of `gcc` or Microsoft's own VC++. A couple of the optimisation-time budget cuts may not apply when ngen'ing compared to on-demand jitting, but on the whole the picture doesn't change. Most notably, you don't get to the point where the source code becomes more of a logical description (a contract with the compiler) than a compact physical description of the machine code to be generated, as with dumb compilers. g++ and VC++ have been there for quite some time already; C# still needs a lot of hand-holding. – DarthGizka Sep 25 '16 at 15:58
  • [.Net Native](http://blogs.microsoft.co.il/sasha/2014/04/28/net-native-performance-internals/) looks promising - it's something to watch. Currently it appears to be limited to UWP, though. – DarthGizka Sep 25 '16 at 16:40

0 Answers