Good morning, afternoon or night,
Forewarning: I know this may sound like I am trying to do premature optimization, but I assure you I am not. That being said, please answer this question to the best of your ability, if you would, without pointing me to links about "premature optimization".
Is it possible to notice any difference in the performance of massively repeated operations when the constants involved are written in base ten versus base sixteen? I see people writing
int b = a >> 0x1f;
a lot instead of
int b = a >> 31;
which suggests either that they come from a C/C++ background or that they think the operation performs faster (and I am not talking only about bitwise operations). Aren't all constants compiled down to the same internal representation by the C# compiler, regardless of the base they are written in? Or can there be any advantage to using base 16 over base 10 when writing code?
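To be concrete about what I mean by "massively repeated operations", here is a rough sketch of the kind of measurement I have in mind. The class and method names are just my own illustration, and the Stopwatch loop is only meant to show the scale of repetition I am talking about, not a rigorous benchmark:

using System;
using System.Diagnostics;

class ShiftConstantTest
{
    // Same logical operation; only the notation of the shift constant differs.
    static int ShiftHex(int a)
    {
        return a >> 0x1f;
    }

    static int ShiftDecimal(int a)
    {
        return a >> 31;
    }

    static void Main()
    {
        const int iterations = 100000000;
        int sink = 0;

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            sink += ShiftHex(i);
        }
        sw.Stop();
        Console.WriteLine("hex literal:     " + sw.ElapsedMilliseconds + " ms");

        sw.Restart();
        for (int i = 0; i < iterations; i++)
        {
            sink += ShiftDecimal(i);
        }
        sw.Stop();
        Console.WriteLine("decimal literal: " + sw.ElapsedMilliseconds + " ms");

        // Print the accumulator so the loops cannot be discarded entirely.
        Console.WriteLine(sink);
    }
}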
Thank you very much,