I've typed this into the Python shell:
>>> 0.1*0.1
0.010000000000000002
I expected that 0.1*0.1 would not be exactly 0.01, because I know that 0.1 cannot be represented exactly in binary floating point.
From the Python tutorial:
In versions prior to Python 2.7 and Python 3.1, Python rounded this value to 17 significant digits, giving
'0.10000000000000001'. In current versions, Python displays a value based on the shortest decimal fraction that rounds correctly back to the true binary value, resulting simply in '0.1'.
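To see what is actually going on, you can inspect the exact binary value that the float 0.1 stores (the `decimal` module converts it without rounding) and compare the product to 0.01 with a tolerance instead of exact equality. A small sketch:

```python
import math
from decimal import Decimal

# Decimal(float) shows the exact value the binary float actually stores;
# 0.1 has no finite binary representation, so it is slightly larger than 0.1.
print(Decimal(0.1))
# 0.1000000000000000055511151231257827021181583404541015625

# The tiny representation errors compound under multiplication, so the
# product lands on a different float than the one closest to 0.01.
print(0.1 * 0.1 == 0.01)   # False

# Compare floats with a tolerance instead of exact equality.
print(math.isclose(0.1 * 0.1, 0.01))   # True
```

This is also why the shortest-repr display mentioned in the tutorial prints `0.1` for the literal but `0.010000000000000002` for the product: the product is a different float than the one you get by typing `0.01`, so no short decimal rounds back to it.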