For example, when I'm dividing two ints and want a float returned, I superstitiously write something like this:
int a = 2, b = 3;
float c = (float)a / (float)b;
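To see why the cast matters at all, here's a minimal sketch (plain C; the variable names are just for illustration) contrasting the two forms. Without a cast, a / b is integer division, and only the truncated quotient gets converted to float:

#include <stdio.h>

int main(void) {
    int a = 2, b = 3;
    float no_cast  = a / b;         /* integer division: 2 / 3 == 0, then 0 becomes 0.0f */
    float one_cast = (float)a / b;  /* a converted first, so real division: about 0.6667f */
    printf("%f %f\n", no_cast, one_cast);  /* prints 0.000000 0.666667 */
    return 0;
}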
I think as long as you cast just one of the two variables, the compiler will behave properly (at least on the compilers I know of).
So all of:
float c = (float)a / b;
float c = a / (float)b;
float c = (float)a / (float)b;
will have the same result.
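If you want to convince yourself, here's a small sketch (again plain C, same variables as above) checking that the three spellings agree; the usual arithmetic conversions promote the remaining int operand to float, so every division is done in floating point:

#include <stdio.h>

int main(void) {
    int a = 2, b = 3;
    float c1 = (float)a / b;
    float c2 = a / (float)b;
    float c3 = (float)a / (float)b;
    printf("%d\n", c1 == c2 && c2 == c3);  /* prints 1 */
    return 0;
}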
Having worked on safety-critical systems, I tend to be paranoid and always cast both operands: float(a)/float(b), just in case some subtle gotcha is planning to bite me later, no matter how good the compiler is said to be or how well-defined the details are in the official language specs. Paranoia: a programmer's best friend!
Then there are older brain-damaged types like me who, having to use old-fashioned languages, just unthinkingly write stuff like
int a;
int b;
float z;
z = a*1.0*b;
Of course this isn't a universal fix; it's really only good for cases very much like this one.
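A minimal sketch of why the trick works (just to show the mechanism): 1.0 is a double literal, so a*1.0 is computed in double, the rest of the expression stays in double, and the result is narrowed back to float on assignment; written as a*1.0/b, the same idea forces floating-point division:

#include <stdio.h>

int main(void) {
    int a = 2, b = 3;
    float z;
    z = a * 1.0 * b;   /* arithmetic done in double, then narrowed to float */
    printf("%f\n", z); /* prints 6.000000 */
    z = a * 1.0 / b;   /* same idea applied to division */
    printf("%f\n", z); /* prints 0.666667 */
    return 0;
}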