I've heard that the size of data types such as int may vary across platforms.
My first question is: can someone give an example of what goes wrong when a program assumes an int is 4 bytes, but on a different platform it is, say, 2 bytes?
Compilers are responsible for obeying the standard. When you include <stdint.h> or <cstdint>, they must provide types of exactly the sizes the standard specifies.
Compilers know which platform they are compiling for, so they can predefine internal macros or use other magic to build the suitable types. For example, a compiler targeting a 32-bit machine might define a __32BIT__ macro and ship a stdint header containing lines like:
#ifdef __32BIT__
typedef __int32_internal__ int32_t;
typedef __int64_internal__ int64_t;
...
#endif
and your code can then use these fixed-width types portably.