C# loss of precision when dividing doubles

执笔经年 2021-01-12 07:05

I know this has been discussed time and time again, but I can't seem to get even the most simple example of a one-step division of doubles to result in the expected, unrounded outcome.
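The question's own snippet is not preserved here, but a minimal reconstruction of the division the accepted answer discusses (0.7 / 0.025) might look like the following; the variable names are illustrative:

    using System;

    class Program
    {
        static void Main()
        {
            double numerator = 0.7;     // nearest double: 0.6999999999999999555...
            double denominator = 0.025; // nearest double: 0.0250000000000000013...

            // The quotient is the double nearest the exact ratio of the two
            // stored values, which is close to, but not exactly, 28:
            Console.WriteLine(numerator / denominator); // 27.999999999999996
        }
    }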

6 Answers
  •  盖世英雄少女心
    2021-01-12 07:29

    Is it not strange that two low-precision doubles like this can't divide to the correct value of 28?

    No, not really. Neither 0.7 nor 0.025 can be exactly represented in the double type. The exact values involved are:

    0.6999999999999999555910790149937383830547332763671875
    0.025000000000000001387778780781445675529539585113525390625
    

    Now are you surprised that the division doesn't give exactly 28? Garbage in, garbage out...
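    One way to see the stored values for yourself is the "G17" round-trip format, which prints enough digits to uniquely identify each double (a short sketch, run as C# top-level statements or inside Main; it shows fewer digits than the full exact expansions above):

    Console.WriteLine(0.7.ToString("G17"));   // 0.69999999999999996
    Console.WriteLine(0.025.ToString("G17")); // 0.025000000000000001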

    As you say, the right way to represent decimal numbers exactly is to use decimal. If the rest of your program is using the wrong type, that just means you need to work out which is higher: the cost of getting the wrong answer, or the cost of changing the whole program.
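    For comparison, a sketch of the decimal version (same illustrative names as above): both literals are exactly representable in decimal, and the quotient 700/25 is too, so the division comes out exact:

    decimal numerator = 0.7m;     // exactly 0.7
    decimal denominator = 0.025m; // exactly 0.025
    Console.WriteLine(numerator / denominator); // prints 28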
