I'm trying to produce a float by dividing two ints in my program. Here is what I'd expect:
1 / 120 = 0.00833
Here is the code I'm using:
In C (and therefore also in Objective-C), expressions are almost always evaluated without regard to the context in which they appear.
The expression 1 / 120 is a division of two int operands, so it yields an int result. Integer division truncates, so 1 / 120 yields 0. The fact that the result is used to initialize a float object doesn't change the way 1 / 120 is evaluated.
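For example, a minimal complete program (my own illustration, not the code from the question) makes the truncation visible:

    #include <stdio.h>

    int main(void) {
        float result = 1 / 120;   /* evaluated as integer division: 1 / 120 is 0 */
        printf("%f\n", result);   /* prints 0.000000, not 0.008333               */
        return 0;
    }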
This can be counterintuitive at times, especially if you're accustomed to the way calculators generally work (they usually store all results in floating-point).
As the other answers have said, to get a result close to 0.00833 (which can't be represented exactly, BTW), you need to do a floating-point division rather than an integer division, by making one or both of the operands floating-point. If one operand is floating-point and the other is an integer, the integer operand is converted to floating-point first; there is no direct floating-point-by-integer division operation.
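For instance, any of the following forms (a sketch I'm adding for illustration, not code from the question) performs the division in floating point:

    #include <stdio.h>

    int main(void) {
        float a = 1.0f / 120;      /* the float literal forces the int operand to convert */
        float b = 1 / 120.0f;      /* same idea, with the divisor as the float            */
        float c = (float)1 / 120;  /* or cast one operand explicitly                      */
        printf("%f %f %f\n", a, b, c);  /* all three print approximately 0.008333         */
        return 0;
    }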
Note that, as @0x8badf00d's comment says, the result should be 0. Something else must be going wrong for the printed result to be inf. If you can show us more code, preferably a small complete program, we can help figure that out.
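Just as a guess (the names and structure below are invented, since we haven't seen your program), one common way inf shows up is when the truncated 0 is later used as a divisor:

    #include <stdio.h>

    int main(void) {
        float step = 1 / 120;       /* integer division: step ends up as 0.0f  */
        float rate = 1.0f / step;   /* 1.0f / 0.0f yields inf under IEEE 754   */
        printf("%f\n", rate);       /* prints inf                              */
        return 0;
    }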
(There are languages in which integer division yields a floating-point result. Even in those languages, the evaluation isn't necessarily affected by its context. Python version 3 is one such language; C, Objective-C, and Python version 2 are not.)