I've been told that C types are machine dependent. Today I wanted to verify it.
#include <stdio.h>

void legacyTypes(void)
{
    /* character types */
    char k_char = 'a';
    printf("sizeof(char) = %zu byte(s)\n", sizeof k_char);
}
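The same check extends to the rest of the basic types. Below is a minimal sketch as a standalone program, assuming a hosted environment where printf is available; the exact numbers it prints will vary between platforms.

#include <stdio.h>

int main(void)
{
    /* print the storage size of each basic type on the current platform */
    printf("char:      %zu\n", sizeof(char));
    printf("short:     %zu\n", sizeof(short));
    printf("int:       %zu\n", sizeof(int));
    printf("long:      %zu\n", sizeof(long));
    printf("long long: %zu\n", sizeof(long long));
    printf("float:     %zu\n", sizeof(float));
    printf("double:    %zu\n", sizeof(double));
    printf("void *:    %zu\n", sizeof(void *));
    return 0;
}

On a typical 64-bit Linux/x86-64 build this prints 1, 2, 4, 8, 8, 4, 8 and 8 respectively; a 32-bit build usually reports 4 bytes for long and for pointers instead.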
There are a lot more platforms out there, and some of them are 16- or even 8-bit! On those, you would see much bigger differences in the sizes of all the above types.
Signed and unsigned versions of the same basic type occupy the same number of bytes on any given platform. Their ranges differ, however, because a signed type has to split its possible bit patterns between negative and non-negative values, while an unsigned type devotes all of them to non-negative values.
E.g. a 16-bit signed int can hold values from -32767 (or -32768 on most two's-complement platforms) to 32767, while an unsigned int of the same size covers 0 to 65535.
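You never have to memorize these bounds: the standard header <limits.h> defines them for whatever platform you are compiling on. A quick sketch:

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* the actual range of int on this platform, straight from <limits.h> */
    printf("signed int:   %d to %d\n", INT_MIN, INT_MAX);
    printf("unsigned int: 0 to %u\n", UINT_MAX);
    return 0;
}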
After this, hopefully the point of the question referred to above is clearer. If you write a program assuming that your signed int variables can hold a value such as 2*10^9 (2 billion), your program is not portable: on 16-bit (and smaller) platforms that value overflows, resulting in silent and hard-to-find bugs. On such a platform you would need to, for example, #define your ints to be long in order to avoid the overflow. This is a simplistic fix that may not work across all platforms, but I hope it gives you the basic idea.
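As a concrete sketch of that hack (an illustration, not a recommendation), the type can be hidden behind a macro; the TARGET_IS_16_BIT flag below is a made-up name that you would define yourself in the 16-bit build:

/* crude portability workaround, as described above */
#ifdef TARGET_IS_16_BIT          /* hypothetical build flag */
#define bigcount long            /* long is guaranteed to be at least 32 bits */
#else
#define bigcount int             /* on 32/64-bit platforms plain int already suffices */
#endif

bigcount population = 2000000000;   /* fits in either case */

In post-C99 code you would normally reach for the exact-width types in <stdint.h> (e.g. int32_t) instead of a macro like this, but the idea is the same: pick a type you know is wide enough everywhere.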
The reason for all these differences between platforms is that by the time C got standardized, there were already many C compilers in use on a plethora of different platforms, so for backward compatibility all these varieties had to be accepted as valid.