Consider these definitions:
int x=5; int y=-5; unsigned int z=5;
How are they stored in memory? Can anybody explain the bit representation?
Assuming int is a 16-bit integer (the width depends on the C implementation; most are 32-bit nowadays), the bit representations differ as follows:
 5 = 0000000000000101
-5 = 1111111111111011

The pattern for -5 is the two's complement of 5: invert every bit of 0000000000000101 and add one. If that same bit pattern 1111111111111011 were read as an unsigned int, its decimal value would be 65531 (that is, 65536 - 5).