C interpretation of hexadecimal long integer literal “L”
How does a C compiler interpret the "L" suffix that denotes a long integer literal, in light of the automatic conversions? The following code, when run on a 32-bit platform (32-bit long, 64-bit long long), seems to convert the expression (0xffffffffL) to the 64-bit integer 4294967295, not the 32-bit value -1.

Sample code:

```c
#include <stdio.h>

int main(void)
{
    long long x = 10;
    long long y = (0xffffffffL);
    long long z = (long)(0xffffffffL);

    printf("long long x == %lld\n", x);
    printf("long long y == %lld\n", y);
    printf("long long z == %lld\n", z);
    printf("0xffffffffL == %ld\n", 0xffffffffL);

    /* the original snippet was cut off here; completed with the
       comparison the question sets up */
    if (x > (long)(0xffffffffL))
        printf("x > (long)(0xffffffffL)\n");
    else
        printf("x <= (long)(0xffffffffL)\n");

    return 0;
}
```
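To see what type the compiler actually assigns the literal, here is a quick probe (a minimal sketch, assuming a C11 compiler, since it relies on `_Generic`) that prints the selected type and its size:

```c
#include <stdio.h>

int main(void)
{
    /* _Generic selects the string whose type matches the literal's
       actual type, as determined by the compiler */
    printf("type of 0xffffffffL: %s\n",
           _Generic(0xffffffffL,
                    long:               "long",
                    unsigned long:      "unsigned long",
                    long long:          "long long",
                    unsigned long long: "unsigned long long",
                    default:            "other"));

    /* sizeof shows whether the literal ended up 32-bit or 64-bit */
    printf("sizeof(0xffffffffL) == %zu\n", sizeof 0xffffffffL);
    return 0;
}
```

On a platform with a 32-bit long I would expect this to report `unsigned long`, and on a 64-bit-long platform, `long`.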