C# loss of precision when dividing doubles

执笔经年 2021-01-12 07:05

I know this has been discussed time and time again, but I can't seem to get even the most simple example of a one-step division of doubles to result in the expected, unrounded result.
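(The asker's actual snippet is not shown above; a minimal sketch of that kind of one-step division, with illustrative values, might look like this:)

    using System;

    class DivisionDemo
    {
        static void Main()
        {
            // Dividing two doubles: the exact quotient 1/3 has no finite
            // binary representation, so the stored result is the nearest
            // representable double rather than an "unrounded" 1/3.
            double a = 1.0;
            double b = 3.0;
            double q = a / b;

            // Prints a rounded approximation such as 0.3333333333333333
            // (exact digit count varies by .NET version).
            Console.WriteLine(q);
        }
    }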

6 Answers
  •  感动是毒
    2021-01-12 07:37

    To explain this by analogy:

    Imagine that you are working in base 3. In base 3, 0.1 is (in decimal) 1/3, or 0.333333333… recurring.

    So you can EXACTLY represent 1/3 (decimal) in base 3, but you get rounding errors when trying to express it in decimal.

    Well, you can get exactly the same thing with some decimal numbers: They can be exactly expressed in decimal, but they CAN'T be exactly expressed in binary; hence, you get rounding errors with them.
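    To see that last point directly in C# (a small sketch; the values are chosen for illustration, and the exact digits printed can vary slightly between .NET versions):

        using System;

        class RepresentationDemo
        {
            static void Main()
            {
                // 0.1 (decimal) has no exact binary representation, just as
                // 1/3 has no exact decimal representation. The double holds
                // the nearest representable value instead.
                double tenth = 0.1;
                Console.WriteLine(tenth.ToString("G17"));  // 0.10000000000000001

                // The tiny error becomes visible once values are combined.
                double sum = 0.1 + 0.2;
                Console.WriteLine(sum == 0.3);             // False
                Console.WriteLine(sum.ToString("G17"));    // 0.30000000000000004

                // decimal stores base-10 digits, so 0.1 is exact there
                // (at the cost of range and speed).
                decimal exactTenth = 0.1m;
                Console.WriteLine(exactTenth);             // 0.1
            }
        }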
