C# loss of precision when dividing doubles

执笔经年 · 2021-01-12 07:05

I know this has been discussed time and time again, but I can't seem to get even the simplest example of a one-step division of doubles to produce the expected, unrounded result.
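As a minimal illustration of the behaviour being asked about (the literals here are illustrative, not the asker's actual values), binary floating point rounds every operation, including division, to the nearest representable double, so "obvious" identities fail, while C#'s base-10 `decimal` type represents these literals exactly:

```csharp
using System;

class Program
{
    static void Main()
    {
        // 0.1 and 0.2 have no exact binary representation, so their
        // sum is not exactly 0.3. Division is affected the same way:
        // the quotient is rounded to the nearest representable double.
        double sum = 0.1 + 0.2;
        Console.WriteLine(sum == 0.3);    // False

        // decimal works in base 10, so these literals are exact.
        decimal dsum = 0.1m + 0.2m;
        Console.WriteLine(dsum == 0.3m);  // True
    }
}
```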

6 Answers
  •  醉酒成梦
    2021-01-12 07:33

    Precision is always a problem whenever you are dealing with float or double.

    It's a well-known issue in computer science, and every programming language is affected by it. To minimize this sort of error, which is mostly related to rounding, an entire field, numerical analysis, is dedicated to it.

    For instance, take the following code:

        float v = .001f;
        float sum = 0;
        for (int i = 0; i < 1000; i++)
        {
            sum += v;   // each addition rounds to the nearest float
        }

    What would you expect? You would expect sum to be 1, but that is not the case: you get roughly 0.9999907, because 0.001 has no exact binary representation and the rounding error accumulates over the 1000 additions.
