Why 0.1 + 0.1 == 0.2?

Submitted by 。_饼干妹妹 on 2019-12-10 22:09:25

Question


This is concerning Java. From what I've understood, 0.1 cannot be perfectly represented by Java because of binary representations. That makes

0.1 + 0.1 + 0.1 == 0.3

false. However, why does

0.1 + 0.1 == 0.2

give true?
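The two comparisons can be checked directly (the class name here is illustrative):

```java
public class FloatCompare {
    public static void main(String[] args) {
        // Each double literal is the binary value nearest the decimal constant
        System.out.println(0.1 + 0.1 + 0.1 == 0.3); // false
        System.out.println(0.1 + 0.1 == 0.2);       // true
    }
}
```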


Answer 1:


0.1 cannot be perfectly represented by Java because of binary representations. That makes

0.1 + 0.1 + 0.1 == 0.3

false.

That is not the entire reason why the equality is false, although it is part of it. 0.3 is not exactly 3/10 either. It so happens that the double 0.2 is exactly twice the double 0.1 (although they are not respectively 2/10 and 1/10), and that adding 0.1 to itself produces exactly the value you get when you type the constant 0.2. On the other hand, the overall approximation that you get after the operations 0.1 + 0.1 + 0.1 is slightly different from the approximation of 3/10 that you get when you write the constant 0.3.
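One way to see this is to inspect the exact values the doubles hold: constructing a `BigDecimal` from a `double` preserves the double's exact binary value (class and variable names are illustrative):

```java
import java.math.BigDecimal;

public class ExactDoubles {
    public static void main(String[] args) {
        // new BigDecimal(double) shows the exact value of the double, not the literal
        BigDecimal tenth = new BigDecimal(0.1);
        BigDecimal fifth = new BigDecimal(0.2);
        System.out.println(tenth);
        // 0.1000000000000000055511151231257827021181583404541015625
        // The double 0.2 is exactly twice the double 0.1
        // (compareTo, not equals, because BigDecimal.equals is scale-sensitive):
        System.out.println(tenth.add(tenth).compareTo(fifth) == 0); // true
    }
}
```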

If we were using decimal with 5 significant digits, you might be surprised that 1 / 3 * 3 == 1 does not hold (1 / 3 would compute as 0.33333 and that times 3 would compute as 0.99999, which is different from 1), whereas 1 / 4 * 4 == 1 does hold (1 / 4 would compute as 0.25, and that times 4 would compute as 1).
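This decimal analogy can be reproduced with `BigDecimal` and a 5-digit `MathContext` (a sketch; names are illustrative):

```java
import java.math.BigDecimal;
import java.math.MathContext;

public class FiveDigits {
    public static void main(String[] args) {
        MathContext mc = new MathContext(5); // 5 significant decimal digits
        BigDecimal one = BigDecimal.ONE;
        BigDecimal three = new BigDecimal(3);
        BigDecimal four = new BigDecimal(4);

        BigDecimal aThird = one.divide(three, mc);       // rounds to 0.33333
        System.out.println(aThird.multiply(three, mc));  // 0.99999 -- not 1

        BigDecimal aQuarter = one.divide(four, mc);      // exactly 0.25
        System.out.println(aQuarter.multiply(four, mc)); // 1.00 -- equal to 1 in value
    }
}
```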

Your question is somewhat similar to this decimal example, but for base-2 computations. Every constant and operation is an opportunity for an approximation. Sometimes the approximations do not happen, and sometimes they happen but cancel out, so that the end result is more accurate than you had a right to expect. In the case of 0.1 + 0.1, the result is not 2/10, but it is the same approximation of 2/10 that you get when you write 0.2, so that the equality holds. With 0.1 + 0.1 + 0.1 we happen not to be so lucky.
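Simply printing the values makes the lucky and unlucky cases visible, since Java prints the shortest decimal string that uniquely identifies each double:

```java
public class TripleSum {
    public static void main(String[] args) {
        System.out.println(0.1 + 0.1);       // 0.2  -- same double as the constant 0.2
        System.out.println(0.1 + 0.1 + 0.1); // 0.30000000000000004
        System.out.println(0.3);             // 0.3  -- a different double from the sum
    }
}
```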



Source: https://stackoverflow.com/questions/47485048/why-0-1-0-1-0-2
