I recently profiled some code and found that the largest share of CPU time was being consumed by calls to BitConverter such as:
return BitConverter.ToInt16(new byte[] { byte1, byte2 }, 0);
When I switched to something like:
return (short)(byte1 << 8 | byte2);
I noticed a huge improvement in performance.
My question is: why is BitConverter so much slower? I would have assumed that it essentially does the same kind of bit shifting internally.
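For context, here is a minimal sketch of the kind of micro-benchmark I ran to compare the two. The method names, byte values, and iteration count are illustrative, not my actual profiling harness:

// Minimal benchmark sketch comparing the two conversions.
// (Names and constants here are for illustration only.)
using System;
using System.Diagnostics;

class BitConverterBenchmark
{
    // Allocates a temporary byte[] on every call, then parses it.
    static short ViaBitConverter(byte byte1, byte byte2)
    {
        return BitConverter.ToInt16(new byte[] { byte1, byte2 }, 0);
    }

    // Pure register arithmetic: no allocation involved.
    // Note: on little-endian hardware this returns the opposite byte
    // order from the BitConverter version (0x1234 vs. 0x3412 here),
    // which doesn't matter for a timing comparison.
    static short ViaShifting(byte byte1, byte byte2)
    {
        return (short)(byte1 << 8 | byte2);
    }

    static void Main()
    {
        const int iterations = 1_000_000;
        short sink = 0; // accumulate so the JIT can't discard the calls

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            sink += ViaBitConverter(0x12, 0x34);
        sw.Stop();
        Console.WriteLine($"BitConverter: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        for (int i = 0; i < iterations; i++)
            sink += ViaShifting(0x12, 0x34);
        sw.Stop();
        Console.WriteLine($"Bit shifting: {sw.ElapsedMilliseconds} ms");

        Console.WriteLine(sink); // keep the results observable
    }
}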