How deterministic is floating point inaccuracy?

长发绾君心 2020-11-27 06:51

I understand that floating point calculations have accuracy issues and there are plenty of questions explaining why. My question is, if I run the same calculation twice, can I always rely on it to produce the same result?

10 Answers
  •  日久生厌
    2020-11-27 07:21

    Hmm. Since the OP asked about C#:

    Is the C# bytecode JIT deterministic, or does it generate different code between different runs? I don't know, but I wouldn't trust the JIT.
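
    One way to make that question concrete: dump the exact bit pattern of a result and diff it across runs, instead of comparing rounded decimal output. A minimal sketch using BitConverter.DoubleToInt64Bits (the Compute body is just an illustrative stand-in for whatever calculation you care about):

        using System;

        class ReproCheck
        {
            // Stand-in for the calculation whose reproducibility you want to verify.
            static double Compute()
            {
                double sum = 0.0;
                for (int i = 1; i <= 1_000_000; i++)
                    sum += 1.0 / i;
                return sum;
            }

            static void Main()
            {
                double r = Compute();
                // "R" round-trips the decimal form; the hex dump is the raw
                // 64-bit pattern, so two runs can be compared bit-for-bit.
                Console.WriteLine($"{r:R}  bits=0x{BitConverter.DoubleToInt64Bits(r):X16}");
            }
        }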

    I can think of scenarios where the JIT has some quality-of-service features and decides to spend less time on optimization because the CPU is busy with heavy number crunching somewhere else (think background DVD encoding). That could lead to subtle code differences that turn into huge numerical differences later on.
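
    Tiered compilation in modern .NET behaves a lot like that scenario: methods are first jitted quickly and only recompiled with full optimization once they run hot. If you want to remove that source of run-to-run codegen variation for a numeric kernel, .NET Core 3.0+ lets you request full optimization up front. A sketch (the Dot method is hypothetical, and note this targets codegen variability, not floating point semantics as such):

        using System.Runtime.CompilerServices;

        static class Kernels
        {
            // Ask the runtime to skip the quick tier-0 compile and jit this
            // method fully optimized on first call (.NET Core 3.0 and later).
            [MethodImpl(MethodImplOptions.AggressiveOptimization)]
            public static double Dot(double[] a, double[] b)
            {
                double sum = 0.0;
                for (int i = 0; i < a.Length; i++)
                    sum += a[i] * b[i];
                return sum;
            }
        }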

    Also, if the JIT itself gets improved (maybe as part of a service pack), the generated code will certainly change. The 80-bit internal precision issue has already been mentioned.
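
    To make the 80-bit point concrete: ECMA-335 permits the runtime to hold floating point intermediates at higher precision than their declared type, and the C# spec guarantees that an explicit cast to double narrows a value back to exactly 64 bits. A sketch of where that could surface (on the legacy 32-bit x87 JIT the two values could differ; SSE2-based JITs compute both at 64 bits, so this prints True there):

        using System;

        class ExtendedPrecision
        {
            // Fields, so the compiler cannot constant-fold the product.
            static double a = 0.1, b = 3.0;

            static void Main()
            {
                // May be kept in an 80-bit x87 register on old 32-bit JITs.
                double direct = a * b;

                // The explicit cast forces narrowing to exactly 64 bits,
                // discarding any extra internal precision.
                double narrowed = (double)(a * b);

                // True on SSE2 JITs; historically not guaranteed on x87.
                Console.WriteLine(direct == narrowed);
            }
        }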
