I read somewhere that the int data type gives better performance than long and short regardless of the OS, because its size is adapted to the word size of the machine, whereas long and short occupy 4 and 2 bytes, which may or may not match the word size. Could anyone give a good explanation of this?
- Have a look at: [Performance of built-in types](http://stackoverflow.com/questions/5069489/performance-of-built-in-types-char-vs-short-vs-int-vs-float-vs-double) – Christian Ammer Jun 27 '11 at 11:18
- I've worked on some 8-bit microcontrollers where the `char` type is preferred, because `char` is allocated as 8 bits on those platforms and 8-bit calculations are cheaper than `int`, which is 16 bits. – Donotalo Jun 27 '11 at 11:26
- I would be interested if someone could explain why, on a 64-bit CPU (at least x86-64), gcc uses `long` to match the word size while `int` is only 32 bits. – rafak Jun 27 '11 at 11:54
5 Answers
From the standard:
3.9.1, §2:
There are five signed integer types: "signed char", "short int", "int", "long int", and "long long int". In this list, each type provides at least as much storage as those preceding it in the list. Plain ints have the natural size suggested by the architecture of the execution environment (44); the other signed integer types are provided to meet special needs.
So you can say `char` <= `short` <= `int` <= `long` <= `long long` in terms of storage.
But you cannot assume that a `short` is 2 bytes and a `long` 4.
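A minimal sketch (assuming a C++11 compiler for `static_assert`) that turns the ordering guarantee, plus the minimum ranges from `<climits>`, into compile-time checks:

```cpp
#include <climits>  // CHAR_BIT and the *_MAX minimum-range limits

// The ordering guarantee: each type provides at least as much
// storage as the one preceding it in the list.
static_assert(sizeof(char)  <= sizeof(short),     "ordering");
static_assert(sizeof(short) <= sizeof(int),       "ordering");
static_assert(sizeof(int)   <= sizeof(long),      "ordering");
static_assert(sizeof(long)  <= sizeof(long long), "ordering");

// The minimum ranges: short and int must cover at least 16 bits'
// worth of values, long at least 32, long long at least 64.
static_assert(SHRT_MAX  >= 32767,                 "minimum range");
static_assert(INT_MAX   >= 32767,                 "minimum range");
static_assert(LONG_MAX  >= 2147483647L,           "minimum range");
static_assert(LLONG_MAX >= 9223372036854775807LL, "minimum range");

int main() {}  // nothing to run; all checks happen at compile time
```

Note that nothing here pins `short` to 2 bytes or `long` to 4; those are just common choices that satisfy the minimums.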
Now to your question: most compilers make `int` match the register size of their target platform, which makes alignment easier and access faster on some platforms. But that does not mean you should prefer `int`.
Choose the data type according to your needs, and do not optimize without measuring performance.

- Though you can tell that a `short` is at least 2 bytes, etc., as conforms to ISO C's limits.h, which defines actual minimum limits (`Their implementation-defined values shall be equal or greater in magnitude (absolute value) to those shown, with the same sign`). – Sebastian Mach Jun 27 '11 at 11:25
`int` is traditionally the most "natural" integral type for the machine on which the program is to run. What is meant by "most natural" is not too clear, but I would expect that it would not be slower than other types. More to the point, perhaps, is that there is an almost universal tradition for using `int` in preference to other types when there is no strong reason for doing otherwise. Using other integral types will cause an experienced C++ programmer, on reading the code, to ask why.

- +1. I would also add that most of the time using `int` will cause an experienced C++ programmer, on reading the code, to ask why there *isn't* a reason to use any other particular type. Questioning why a type was chosen is a pretty common pastime of C++ programmers. – Steve Jessop Jun 27 '11 at 13:59
- `short` only optimizes storage size; calculations are always widened to `int`, if applicable (i.e. unless `short` is already the same size); see the sketch after this list.
- I'm not sure that `int` should be preferred to `long`; the obvious case being when `int`'s capacity doesn't suffice.
- You already mention the native word size, so I'll leave that.
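A minimal sketch of that promotion rule (assuming a C++11 compiler for `decltype` and `static_assert`):

```cpp
#include <type_traits>

int main() {
    short a = 1, b = 2;

    // The usual arithmetic conversions promote both short operands
    // to int before the addition, so the result type is int.
    static_assert(std::is_same<decltype(a + b), int>::value,
                  "short + short is computed as int");

    return a + b;
}
```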

Eskimos reportedly use forty or more different words for snow. When you only want to communicate that it's snow, then the word "snow" suffices. Source code is not about instructing the compiler: it's about communicating between humans, even if the communication may only be between your current and somewhat later self…
Cheers & hth.

- I can't agree here: source code is for writing programs, and that includes instructing the compiler. – sharptooth Jun 27 '11 at 11:22
- @sharptooth: Instructing the compiler is just a very small part of it. If that were the main thing, then you'd use assembly language. Instead we use languages such as C++ that let us abstract things, and the main activity is to choose good abstractions, abstractions that are easy to understand for others (or for oneself a bit later). Cheers & hth. :-) – Cheers and hth. - Alf Jun 27 '11 at 11:45
`int` does not give better performance than the other types. Really, on most modern platforms, all of the integer types will perform similarly, excepting `long long`. If you want the "fastest" integer available on your platform, C++ does not give you a way to do that.
On the other hand, if you're willing to use things defined by C99, you can use one of the "fast int" types (`int_fastN_t`) defined there.
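For instance, a minimal sketch using those types (via C++11's `<cstdint>`, which mirrors C99's `<stdint.h>`):

```cpp
#include <cstdint>  // C++11; the C99 equivalent is <stdint.h>
#include <cstdio>

int main() {
    // "Fastest type with at least N bits": the implementation picks
    // whatever width it considers cheapest on this platform.
    int_fast16_t counter = 0;
    int_fast32_t total   = 0;

    for (counter = 0; counter < 1000; ++counter)
        total += counter;

    // The actual widths are implementation-defined; print them to
    // see what this platform chose.
    std::printf("int_fast16_t: %zu bytes, int_fast32_t: %zu bytes\n",
                sizeof(int_fast16_t), sizeof(int_fast32_t));
    std::printf("total = %ld\n", (long)total);
}
```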
Also, on modern machines, memory hierarchy is more important than CPU calculations in most cases. Using smaller integer types lets you fit more integers into CPU cache, which will increase performance in pretty much all cases.
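As an illustration of the footprint difference (hypothetical array sizes, just for the arithmetic):

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // The same 4096 values occupy very different amounts of cache:
    static int16_t small[4096];  //  8192 bytes
    static int64_t big[4096];    // 32768 bytes

    std::printf("int16_t array: %zu bytes\n", sizeof(small));
    std::printf("int64_t array: %zu bytes\n", sizeof(big));
    // A typical 32 KiB L1 data cache holds the int16_t array four
    // times over, while the int64_t array fills it completely.
}
```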
Finally, I would recommend not using `int` as a default data type. Usually, I see people reach for `int` when they really want an unsigned integer instead. The conversion from signed to unsigned can lead to subtle integer overflow bugs, which can lead to security vulnerabilities.
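A minimal sketch of the kind of bug meant here (hypothetical values):

```cpp
#include <iostream>

int main() {
    int balance = -5;        // e.g. an account overdrawn by 5
    unsigned int limit = 10;

    // In the comparison below, the signed value is converted to
    // unsigned, so -5 becomes a huge positive number (UINT_MAX - 4)
    // and the check silently goes the wrong way.
    if (balance < limit)
        std::cout << "within limit\n";
    else
        std::cout << "over limit (surprise!)\n";  // this branch runs
}
```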
Don't choose the data type because of an intrinsic "speed" -- choose the right data type to solve the problem you're looking to solve.

- "If you want the "fastest" integer available on your platform, C++ does not give you a way to do that" – but as I recall [stdint.h] does in C99 and C++0x. :-) – Cheers and hth. - Alf Jun 27 '11 at 11:48