This is because your printf format specifier doesn't match what you passed it:
9/5 is of type int, but the %f format specifier expects a floating-point value.
So you need to either cast one of the operands to float or make one of the literals floating-point:
printf("%f\n", (float)9/5);
printf("%f\n", 9./5);
As for why you're getting 0.0: it's because printf() is reading the binary representation of 1 (an integer) and printing it as a float, and that bit pattern happens to be a small denormalized value that is very close to 0.0.
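You can observe that tiny value directly by reinterpreting the bits yourself - a sketch assuming a 32-bit int and IEEE 754 float, using memcpy to avoid the aliasing problems of a pointer cast:

#include <stdio.h>
#include <string.h>

int main(void)
{
    int i = 1;
    float f;
    memcpy(&f, &i, sizeof f); /* reuse the int's bit pattern as a float */
    printf("%g\n", f);        /* prints roughly 1.4013e-45, a denormal  */
    return 0;
}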
EDIT: There's also something going on with type promotion in varargs.
In vararg functions, float is promoted to double. So printf() in this case actually expects a 64-bit parameter holding a double, but you only passed it a 32-bit int, so it's actually reading an extra 32 bits from the stack (which happen to be zero in this case) - even more undefined behavior.
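Here's a sketch of what that promotion looks like from the callee's side (print_one is a hypothetical helper, not a standard function): any float passed through ... arrives as a double, so it must be read with va_arg(ap, double):

#include <stdarg.h>
#include <stdio.h>

/* hypothetical helper: reads one floating-point vararg */
static void print_one(const char *label, ...)
{
    va_list ap;
    va_start(ap, label);
    double d = va_arg(ap, double); /* float args are promoted to double */
    va_end(ap);
    printf("%s: %f\n", label, d);
}

int main(void)
{
    float f = 1.8f;
    print_one("promoted", f); /* the 32-bit float travels as a 64-bit double */
    return 0;
}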