I've heard that the size of data types such as int may vary across platforms.
My first question is: can someone give an example of what goes wrong when a program assumes an int is 4 bytes, but on a different platform it is, say, 2 bytes?
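For instance, I imagine something like this sketch is the kind of code that breaks (assuming it was written for a 32-bit int and then rebuilt where int is 16 bits):

    #include <stdio.h>

    int main(void)
    {
        /* Written assuming int is 32 bits, so 86400 fits comfortably.
           Where int is 16 bits (maximum 32767), this arithmetic overflows
           and the stored value is not what the programmer expected. */
        int seconds_per_day = 60 * 60 * 24;

        printf("seconds per day: %d\n", seconds_per_day);
        return 0;
    }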
In earlier iterations of the C standard, you generally made your own typedef statements to ensure you got a type of a given width (say, 16 bits), based on #define strings passed into the compiler, for example:
gcc -DINT16_IS_LONG ...
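The typedefs themselves then looked something like this (a sketch; macro names such as INT16_IS_LONG were whatever the project invented, nothing standard):

    /* Project-local 16-bit types, selected by a macro the build passes in. */
    #if defined(INT16_IS_LONG)
    typedef long           int16;
    typedef unsigned long  uint16;
    #else
    typedef short          int16;
    typedef unsigned short uint16;
    #endif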
Nowadays (C99 and above), there are specific types such as uint16_t, an unsigned integer that is exactly 16 bits wide.
Provided you include stdint.h, you get exact-width types, at-least-that-width types, fastest types with a given minimum width, and so on, as documented in C99 7.18 Integer types. If an implementation has compatible types, it is required to provide them.
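A quick sketch of those three families (assuming a hosted C99 implementation that provides the exact-width types):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint16_t       exact = 65535;  /* exactly 16 bits wide                    */
        uint_least16_t least = 65535;  /* smallest type that is at least 16 bits  */
        uint_fast16_t  fast  = 65535;  /* "fastest" type that is at least 16 bits */

        /* The actual widths of the last two are implementation-defined. */
        printf("%zu %zu %zu\n", sizeof exact, sizeof least, sizeof fast);
        return 0;
    }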
Also very useful is inttypes.h, which adds some other neat features for format conversion of these new types (printf and scanf format strings).
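For example (a sketch; the PRI* macros shown are the ones inttypes.h defines for printf conversions):

    #include <inttypes.h>   /* also makes the stdint.h types available */
    #include <stdio.h>

    int main(void)
    {
        uint16_t port = 8080;
        int64_t  big  = -1234567890123;

        /* PRIu16 and PRId64 expand to the right conversion specifiers
           for these types on the current platform. */
        printf("port = %" PRIu16 ", big = %" PRId64 "\n", port, big);
        return 0;
    }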