How does a C compiler interpret the "L" which denotes a long integer literal, in light of automatic conversion? The following code, when run on a 32-bit platform (32-bit long
The thing is that the rules for determining the type of an integer literal differ depending on whether the literal is decimal or hexadecimal/octal. A decimal literal is always given a signed type unless it is suffixed with U (or u). A hexadecimal or octal literal, on the other hand, may also be given an unsigned type when the corresponding signed type cannot represent the value.
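As a rough illustration, here is a small sketch (assuming a platform with 32-bit int and 32-bit long, and 64-bit long long, which matches the 32-bit setup in the question) showing how the same value ends up with a different width depending on the base and the L suffix:

```c
#include <stdio.h>

int main(void)
{
    /* 2147483648 does not fit in a signed 32-bit type, and a decimal
       literal must stay signed, so it becomes long long (8 bytes here). */
    printf("decimal:   %zu bytes\n", sizeof(2147483648));

    /* The equivalent hex literal is allowed to become unsigned:
       0x80000000 fits in unsigned int, so it stays 4 bytes wide. */
    printf("hex:       %zu bytes\n", sizeof(0x80000000));

    /* With an L suffix the search starts at long, but the rule is the same:
       the decimal form skips to the signed long long, while the hex form
       may settle on unsigned long. */
    printf("decimal L: %zu bytes\n", sizeof(2147483648L));
    printf("hex L:     %zu bytes\n", sizeof(0x80000000L));

    return 0;
}
```

The exact sizes printed depend on the platform's type widths; the point is only that the hex literal can land in an unsigned type of the same width, whereas the decimal literal is forced up to the next larger signed type.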