I've heard that the size of data types such as int may vary across platforms.

My first question: can someone give an example of what goes wrong when a program assumes an int is 4 bytes, but on a different platform it is, say, 2 bytes?
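To make the question concrete, here is a small sketch of the kind of failure I'm imagining, assuming a platform where int is only 16 bits (the specific values are just made up for illustration):

```c
#include <stdio.h>

int main(void)
{
    /* Fine when int is 32 bits: 100000 fits easily.
       When int is 16 bits (max 32767), 100000 does not fit,
       so the stored value is implementation-defined
       (often a wrapped-around negative number). */
    int file_size = 100000;

    /* With a 16-bit int, i can never reach 50000, so i overflows
       (undefined behaviour) and the loop may never terminate. */
    for (int i = 0; i < 50000; i++) {
        /* ... process byte i of the file ... */
    }

    printf("file_size = %d\n", file_size);
    return 0;
}
```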
Another question I had is related. I know people solve this issue with typedefs, so that you have types like u8, u16, u32, which are guaranteed to be 8 bits, 16 bits, and 32 bits regardless of the platform. My question is: how is this usually achieved? (I am not referring to the types from the stdint library; I am curious how one can manually enforce that some type is always, say, 32 bits regardless of the platform.)
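To show what I mean, this is the kind of hand-rolled definition I imagine, though I don't know whether this is how it's actually done in practice (the compiler check and the choice of underlying types are just my guesses):

```c
/* Pick an underlying type per compiler/platform, then verify the
   size at compile time so a wrong assumption fails the build. */

#if defined(_MSC_VER)
/* MSVC provides fixed-size built-in types. */
typedef unsigned __int32 u32;
typedef unsigned __int16 u16;
typedef unsigned __int8  u8;
#else
typedef unsigned int   u32;  /* assumes int is 32 bits on this platform   */
typedef unsigned short u16;  /* assumes short is 16 bits on this platform */
typedef unsigned char  u8;   /* char is at least 8 bits                   */
#endif

/* Compile-time guard: the array size is -1 (an error) if the
   assumption above does not hold on the current platform. */
typedef char assert_u32_is_4_bytes[(sizeof(u32) == 4) ? 1 : -1];
typedef char assert_u16_is_2_bytes[(sizeof(u16) == 2) ? 1 : -1];
typedef char assert_u8_is_1_byte  [(sizeof(u8)  == 1) ? 1 : -1];
```

Is this roughly the technique, i.e. per-platform conditional typedefs plus some compile-time size check, or is there a better way to guarantee the widths?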