Is it always true that long int (which, as far as I understand, is a synonym for long) is 4 bytes?
Can I rely on that? If not, why not?
When we first implemented C on ICL Series 39 hardware, we took the standard at its word and mapped the data types to the natural representation on that machine architecture, which was short = 32 bits, int = 64 bits, long = 128 bits.
But we found that no serious C applications worked; they all assumed the mapping short = 16, int = 32, long = 64, and we had to change the compiler to support that.
So whatever the official standard says, for many years everyone has converged on long = 64 bits, and it's not likely to change.