
I have read that int32_t is exactly 32 bits long and int_least32_t is only at least 32 bits, but they both have the same typedef in my stdint.h:

typedef int  int_least32_t;

and

typedef int  int32_t;

So where is the difference? They are exactly the same...

Sebi2020

2 Answers


int32_t is a signed integer type with a width of exactly 32 bits, no padding bits, and a two's complement representation for negative values. int_least32_t is the smallest signed integer type with a width of at least 32 bits. The exact-width types are provided only if the implementation directly supports them.

The typedefs that you are seeing simply mean that in your environment both of these requirements are satisfied by the standard int type itself. That need not be the case on a different implementation, where the two typedefs can refer to different types.
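A minimal sketch (assuming a C99 toolchain) that prints the widths and ranges the current implementation gives these two typedefs; the output depends entirely on the platform:

#include <limits.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* Width of each type in bits. */
    printf("int32_t:       %zu bits\n", sizeof(int32_t) * CHAR_BIT);
    printf("int_least32_t: %zu bits\n", sizeof(int_least32_t) * CHAR_BIT);

    /* The exact-width type must top out at 2^31 - 1; the least-width
       type is allowed to have a larger range on some platforms. */
    printf("INT32_MAX:       %jd\n", (intmax_t)INT32_MAX);
    printf("INT_LEAST32_MAX: %jd\n", (intmax_t)INT_LEAST32_MAX);
    return 0;
}

On the asker's system both lines would report 32 bits, because both typedefs resolve to int there.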

jester

Why do you think that on another computer, with a different processor, a different OS, and a different version of the C standard library, you would see exactly these typedefs?

These two types are exactly what you wrote: one is exactly 32 bits, the other is at least 32 bits. One possible situation is that both of them are 32 bits, and that is what you happen to see in your stdint.h. On another system you may find that they are defined differently.
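For code that must not rely on that coincidence, one option is the format macros from <inttypes.h>; a minimal sketch, assuming a C99 compiler:

#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    int32_t       exact = 42;
    int_least32_t least = 42;

    /* PRId32 matches whatever type int32_t is on this platform, and
       PRIdLEAST32 matches int_least32_t; the two macros may expand to
       different length modifiers where the typedefs differ. */
    printf("exact: %" PRId32 "\n", exact);
    printf("least: %" PRIdLEAST32 "\n", least);
    return 0;
}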

JustAnotherCurious
  • Because I think if I install gcc I get the same includes, and I thought the stdlib is the same for every 32-bit distribution of gcc on an x86 computer. And why are these types different on other machines? I thought a variable declared as int is exactly 32 bits long, and I always thought how long a "builtin" type is depends on the gcc build. – Sebi2020 Dec 04 '13 at 20:37
  • 2
  • @Sebi2020 C is a cross-platform high-level language that tries to get as close as possible to the low level. So the standard does not fix the size of the basic types and allows the maintainers of the compiler to select the best representation for each target. Long story short, it only requires minimum ranges and the relation |char| <= |int| <= |long int|. The compiler maintainers define the types for the best fit with the architecture. For example, on my x86_64 machine long int is 64 bits, while on x86 machines long int is usually 32 bits (see the sketch below). – JustAnotherCurious Dec 05 '13 at 04:42
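A minimal sketch of that point: printing the sizes the current compiler picked for the basic types. On 32-bit x86 long int is typically 4 bytes, on x86_64 Linux typically 8.

#include <stdio.h>

int main(void)
{
    /* The standard only guarantees minimum ranges and an ordering
       between the basic types; the concrete sizes are chosen by the
       implementation for its target architecture. */
    printf("char:     %zu byte(s)\n", sizeof(char));
    printf("int:      %zu byte(s)\n", sizeof(int));
    printf("long int: %zu byte(s)\n", sizeof(long int));
    return 0;
}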