In both C (N1570 7.21.6.1/10) and C++ (by inclusion of the C standard library) it is undefined behavior to provide an argument to printf() whose type does not match its conversion specification.
First, be aware (if you aren't already) that long is 64 bits on 64-bit versions of OS X, Linux, the BSD variants, and various other Unix flavors. 64-bit Windows, however, kept long as 32-bit.
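If you want to see which model your own platform uses, a quick test program (nothing in it is specific to any particular compiler) makes the difference visible:

```c
#include <stdio.h>

int main(void)
{
    /* LP64 (64-bit Linux, OS X, the BSDs): int is 4 bytes, long is 8.
       LLP64 (64-bit Windows): int and long are both 4 bytes.         */
    printf("sizeof(int)  = %zu\n", sizeof(int));
    printf("sizeof(long) = %zu\n", sizeof(long));
    return 0;
}
```

On the 64-bit Unix-like systems above this typically reports 4 and 8; on 64-bit Windows, 4 and 4.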
What does this have to do with printf() and UB with respect to its conversion specifications?
Internally, printf() uses the va_arg() macro. The conversion specification tells va_arg() which type to fetch (int, long, whatever), and the size of that type determines how far va_arg() advances its argument pointer. So if you use %ld on 64-bit Linux but only pass an int, the other 32 bits are read from adjacent memory; if you use %d and pass a long, the other 32 bits of the long are left behind on the argument stack. Whereas this happens to work on Windows because sizeof(int) == sizeof(long), porting the code to another 64-bit platform can cause trouble, especially when you have an int *nptr; and try to print *nptr with %ld. If the adjacent memory isn't accessible, you'll likely get a segfault. So the possible concrete cases are (a code sketch of the mechanism follows the list):

- long and int are the same size, so it just works;
- the extra 32 bits come from adjacent memory, so garbage gets printed;
- the adjacent memory isn't accessible, so you get a segfault.
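To make that mechanism concrete, here is a stripped-down sketch of a printf()-like function (toy_printf is a made-up name; no real libc works exactly like this). The point is that va_arg() fetches whatever type the format string claims, because the function has no way of knowing what the caller actually passed:

```c
#include <stdarg.h>
#include <stdio.h>

/* A toy formatter that understands only %d and %ld. */
static void toy_printf(const char *fmt, ...)
{
    va_list ap;
    va_start(ap, fmt);
    for (const char *p = fmt; *p != '\0'; p++) {
        if (*p != '%') {
            putchar(*p);
            continue;
        }
        p++;                              /* skip the '%' */
        if (*p == '\0')
            break;
        if (*p == 'd') {
            int v = va_arg(ap, int);      /* the format, not the caller, says "int"  */
            printf("%d", v);
        } else if (*p == 'l' && *(p + 1) == 'd') {
            long v = va_arg(ap, long);    /* the format, not the caller, says "long" */
            printf("%ld", v);
            p++;
        }
    }
    va_end(ap);
}

int main(void)
{
    toy_printf("ok: %d %ld\n", 42, 1234567L);  /* argument types match: fine   */
    /* toy_printf("bad: %ld\n", 42); */        /* int passed, long fetched: UB */
    return 0;
}
```

Uncomment the last call and the format string claims a long while the caller passed an int; that is exactly the mismatch the standard leaves undefined.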
I'm not sure whether alignment is an issue on some platforms, but if it is, it would depend on how the implementation passes function parameters. Some "intelligent" compiler-specific printf() given a short argument list might even bypass va_arg() altogether and treat the passed data as a string of bytes rather than as a stack of typed arguments. If that happened, printf("%x %lx\n", LONG_MAX, INT_MIN); has several possible outcomes:

- long and int are the same size, so it just works;
- ffffffff ffffffff80000000 is printed.
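For contrast, the same two values printed with specifiers that actually match have exactly one outcome wherever you run it; the casts are only there to keep the x conversions strictly correct, since %x and %lx expect unsigned arguments:

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* long argument -> %lx, int argument -> %x, each cast to the
       corresponding unsigned type the x conversion expects.       */
    printf("%lx %x\n", (unsigned long)LONG_MAX, (unsigned int)INT_MIN);
    return 0;
}
```

On an LP64 system such as 64-bit Linux this prints 7fffffffffffffff 80000000; on 64-bit Windows it prints 7fffffff 80000000 because long is narrower there, but either way the behavior is well defined.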
As for why the C standard calls this undefined behavior: it doesn't specify exactly how va_arg() works, how function parameters are passed and represented in memory, or the exact sizes of int, long, and the other primitive types, precisely so that it doesn't unnecessarily constrain implementations. As a result, whatever happens on a mismatch is something the C standard cannot predict. The examples above should be an indication of that, and I can't even guess what other implementations exist that might behave differently altogether.
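The practical takeaway: if you need a value to print identically on every platform, either cast the argument to the exact type the specifier names or use the fixed-width types and the format macros from <inttypes.h>, and let the compiler's format checking (for example -Wformat, which gcc and clang enable as part of -Wall) catch mismatches at compile time. A minimal sketch of both approaches:

```c
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    int n = 42;
    int64_t fixed = 123456789;

    /* Option 1: cast the argument so its type matches the specifier. */
    printf("%ld\n", (long)n);

    /* Option 2: use a fixed-width type plus the matching format macro,
       which expands to the right specifier for this platform's int64_t. */
    printf("%" PRId64 "\n", fixed);
    return 0;
}
```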