I was trying to normalize a set of numbers from -100 to 0 to a range of 10-100 and was having problems, only to notice that even with no variables at all, this does not evaluate as expected.
It has to do with the version of Python you are using. In Python 2, dividing two integers with / adopts the C behavior: the result is rounded down (floored) to an integer. Also keep in mind that Python evaluates the expression from left to right, which matters when you typecast.
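For a quick check (this assumes a Python 2.x interpreter; in Python 3, / always performs true division, and // is the floor-division operator in both versions):

>>> 1 / 2        # Python 2: both operands are ints, so the result is floored
0
>>> 1 / 2.0      # one float operand is enough to get true division
0.5
>>> from __future__ import division
>>> 1 / 2        # now / is true division, as in Python 3
0.5
>>> 1 // 2       # floor division is still available via //
0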
Example: Since this is a question that always pops into my head when I am doing arithmetic (should I convert to float, and which number?), here is an example from that angle:
>>> a = 1/2/3/4/5/4/3
>>> a
0
When we divide integers, unsurprisingly the result gets rounded down.
>>> a = 1/2/3/4/5/4/float(3)
>>> a
0.0
If we typecast the last integer to float, we still get zero: by the time our number is divided by the float, it has already become 0 because of the preceding integer divisions.
>>> a = 1/2/3/float(4)/5/4/3
>>> a
0.0
Same scenario as above, but with the float typecast shifted a little closer to the left side; the step-by-step trace below shows why it is still too late.
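To make the left-to-right evaluation explicit, here is the same expression evaluated one division at a time (an illustrative trace, not new behavior):

>>> step = 1/2            # integer division: 0
>>> step = step/3         # still 0
>>> step = step/float(4)  # 0/4.0 gives 0.0, but the numerator is already 0
>>> step = step/5/4/3
>>> step
0.0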
>>> a = float(1)/2/3/4/5/4/3
>>> a
0.0006944444444444445
Finally, when we typecast the first integer to float, the result is the desired one, since starting from the first (leftmost) division we are working with floats.
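Applying the same idea to the normalization problem from the question, here is a minimal sketch (assuming Python 2 and a linear mapping from [-100, 0] to [10, 100]; the normalize name and its defaults are just for illustration):

>>> def normalize(x, old_min=-100.0, old_max=0.0, new_min=10.0, new_max=100.0):
...     # Float defaults keep every division here a float division, even on Python 2
...     return new_min + (x - old_min) * (new_max - new_min) / (old_max - old_min)
...
>>> normalize(-100)
10.0
>>> normalize(-50)
55.0
>>> normalize(0)
100.0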
Extra 1: If you are trying to do this to improve arithmetic evaluation, you should check this.
Extra 2: Please be careful with the following scenario, where the whole expression is evaluated with integer division first and float() only converts the already-rounded result:
>>> a = float(1/2/3/4/5/4/3)
>>> a
0.0
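To avoid this, put the cast (or simply a float literal, which is the same thing as float(1) here) in the leftmost operand rather than around the finished result:

>>> a = 1.0/2/3/4/5/4/3
>>> a
0.0006944444444444445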