What is the difference between conversion specifiers %i and %d in formatted IO functions (*printf / *scanf)
What is the difference between %d and %i when used as format specifiers in printf?

Dipstick: They are the same when used for output, e.g. with printf. However, they differ when used as input specifiers, e.g. with scanf: %d scans an integer as a signed decimal number, while %i defaults to decimal but also accepts hexadecimal (if preceded by 0x) and octal (if preceded by 0). So 033 would be 27 with %i but 33 with %d.

These are identical for printf but different for scanf. For printf, both %d and %i designate a signed decimal integer. For scanf, %d reads a signed decimal integer, while %i reads a signed integer whose base is inferred from its prefix: 0x or 0X for hexadecimal, a leading 0 for octal, and decimal otherwise.