I am just learning C and I have a little knowledge of Objective-C due to dabbling in iOS development; however, in Objective-C I was using NSLog(@"%i", x); to
I am just adding an example here because I think examples make it easier to understand.
In printf() they behave identically, so you can use either %d or %i. But they behave differently in scanf().
For example:
    #include <stdio.h>

    int main(void)
    {
        int num, num2;
        scanf("%d%i", &num, &num2); // reading num using %d and num2 using %i
        printf("%d\t%d", num, num2);
        return 0;
    }
Output (for the input 010 010):

    10	8
You can see the different results for identical inputs.
num:
We are reading num using %d, so when we enter 010 it ignores the leading 0 and treats the input as decimal 10.
num2:
We are reading num2 using %i, which means it treats decimal, octal, and hexadecimal input differently.
When we give num2 010, it sees the leading 0 and parses the input as octal.
When we print it using %d, it prints the decimal equivalent of octal 010, which is 8.