I am wondering about the performance of different primitive types, primarily in C#. I realize this is not strictly a language-specific concept, since it is the hardware, not the language, that determines how efficiently each type is handled.
I have read the following two questions:
Nevertheless, I need a few clarifications.
I know that on a 32-bit machine an `int` is faster than both a `short` and a `byte`, since `int` matches the platform's native word size. However, what happens on 64-bit systems? Is it better, performance-wise, to use a `long` instead of an `int`?
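To make the comparison concrete, here is a rough sketch of the kind of measurement I have in mind. It just wraps tight loops in a `Stopwatch`; the class and method names and the iteration count are arbitrary placeholders, and a serious test would use a proper harness instead of this:

```csharp
using System;
using System.Diagnostics;

class IntVsLongSketch
{
    // Arbitrary iteration count, chosen only for illustration.
    const int Iterations = 100_000_000;

    static int SumInt()
    {
        int acc = 0;
        for (int i = 0; i < Iterations; i++)
            acc += i;              // 32-bit arithmetic (overflow wraps; irrelevant for timing)
        return acc;
    }

    static long SumLong()
    {
        long acc = 0;
        for (long i = 0; i < Iterations; i++)
            acc += i;              // 64-bit arithmetic
        return acc;
    }

    static void Main()
    {
        // Run once first so JIT compilation does not distort the timings.
        SumInt();
        SumLong();

        var sw = Stopwatch.StartNew();
        int intResult = SumInt();
        sw.Stop();
        Console.WriteLine($"int:  {sw.ElapsedMilliseconds} ms (result {intResult})");

        sw.Restart();
        long longResult = SumLong();
        sw.Stop();
        Console.WriteLine($"long: {sw.ElapsedMilliseconds} ms (result {longResult})");
    }
}
```

The results are printed so the loops cannot be treated as dead code, but I make no claim that this isolates the arithmetic cost perfectly.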
Also, what happens with floating-point types? Is a `double` better than a `float`?
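And the equivalent sketch for the floating-point comparison (again, the names and constants are just placeholders, not a rigorous benchmark):

```csharp
using System;
using System.Diagnostics;

class FloatVsDoubleSketch
{
    // Arbitrary iteration count, chosen only for illustration.
    const int Iterations = 100_000_000;

    static float SumFloat()
    {
        float acc = 0f;
        for (int i = 0; i < Iterations; i++)
            acc += 1.0001f;        // single-precision addition
        return acc;
    }

    static double SumDouble()
    {
        double acc = 0d;
        for (int i = 0; i < Iterations; i++)
            acc += 1.0001;         // double-precision addition
        return acc;
    }

    static void Main()
    {
        // JIT warm-up before timing.
        SumFloat();
        SumDouble();

        var sw = Stopwatch.StartNew();
        float floatResult = SumFloat();
        sw.Stop();
        Console.WriteLine($"float:  {sw.ElapsedMilliseconds} ms (result {floatResult})");

        sw.Restart();
        double doubleResult = SumDouble();
        sw.Stop();
        Console.WriteLine($"double: {sw.ElapsedMilliseconds} ms (result {doubleResult})");
    }
}
```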
The answer may or may not be language-specific. I assume there aren't many differences between languages on this point, but if there are, it would be nice to have an explanation of why.