I've heard that the size of data types such as int
may vary across platforms.
My first question is: can someone give an example of what goes wrong when a program assumes an int is 4 bytes, but on a different platform it is, say, 2 bytes?
Say you've designed your program to read 100,000 inputs, and you count them with an unsigned int,
assuming a size of 32 bits (a 32-bit unsigned int can count up to 4,294,967,295). If you compile the code on a platform (or with a compiler) where int is only 16 bits (a 16-bit unsigned int can count only up to 65,535), the counter will wrap around past 65,535 and report a wrong count.
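
Here's a minimal sketch of that wrap-around, using uint16_t to stand in for a platform whose unsigned int is 16 bits wide:

```c
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* uint16_t simulates a platform where unsigned int is 16 bits. */
    uint16_t count = 0;

    for (long i = 0; i < 100000; i++)
        count++;            /* silently wraps around past 65,535 */

    /* Prints 34464 (100000 modulo 65536), not the expected 100000. */
    printf("counted %u inputs\n", (unsigned)count);
    return 0;
}
```

Unsigned overflow is well-defined in C (it wraps modulo 2^N), so the program doesn't crash; it just quietly gives you the wrong answer, which is exactly why assuming a particular int width is dangerous.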