Estimating error on calculations using decimals


Question


We're currently using System.Decimal to represent numbers in a .NET application we're developing. I know that decimals are designed to minimize errors due to rounding, but I also know that certain numbers, 1/3 for example, cannot be represented exactly as a decimal, so some calculations will have a small rounding error. I believe the magnitude of this error will be very small and insignificant; however, a colleague disagrees. I would therefore like to be able to estimate the order of magnitude of the error due to rounding in our app. Say, for example, we are calculating a running total of "deals": we do about 10,000 deals per day, and there are about 5-10 decimal operations (add, sub, div, mul, etc.) to compute the new running total for each deal received. What would be the order of magnitude of the rounding error? An answer with a procedure for calculating this would also be nice, so I can learn how to do this for myself in the future.
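
To make the size of a single such error concrete, here is a minimal C# sketch (illustrative, not from the original post; the class name is arbitrary) that round-trips the 1/3 example through decimal:

    using System;

    class DecimalRoundingDemo   // illustrative name
    {
        static void Main()
        {
            // System.Decimal keeps at most 28-29 significant digits, so 1/3
            // is rounded to 0.3333333333333333333333333333, and multiplying
            // back by 3 does not return exactly 1.
            decimal third = 1m / 3m;
            decimal roundTrip = third * 3m;

            Console.WriteLine(third);          // 0.3333333333333333333333333333
            Console.WriteLine(1m - roundTrip); // 0.0000000000000000000000000001, i.e. 1E-28
        }
    }

So the error introduced by one lossy decimal operation sits around the 28th significant digit; the question is how such errors compound over many operations.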


Answer 1:


What Every Computer Scientist Should Know About Floating-Point Arithmetic goes into detail on estimating the error in the result of a sequence of floating-point operations, given the precision of the floating-point type. I haven't tried this on any practical program, though, so I'd be interested to know if it's feasible.
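
Applied to the figures in the question, a rough back-of-the-envelope sketch (my own illustration, not taken from the paper): each correctly rounded decimal operation carries a relative error of at most about half a unit in the last of decimal's ~28 significant digits, u ≈ 5E-28, and a standard first-order estimate bounds the accumulated relative error of n operations by roughly n·u:

    using System;

    class ErrorBoundEstimate   // illustrative name
    {
        static void Main()
        {
            // Unit roundoff for System.Decimal, assuming ~28 significant
            // decimal digits: u = 0.5 * 10^(1 - 28) = 5E-28.
            double u = 5e-28;

            // Figures from the question: ~10,000 deals per day and ~5-10
            // decimal operations per deal; take the upper end.
            long opsPerDay = 10_000L * 10;

            // Standard first-order worst case: after n correctly rounded
            // operations, the accumulated relative error is at most about
            // n * u. Real errors rarely all point the same way, so this
            // bound is pessimistic.
            Console.WriteLine(opsPerDay * u);  // ~5E-23 relative error per day
        }
    }

On a running total around 10^6, that worst case works out to an absolute error near 5E-17 per day, many orders of magnitude below anything a monetary calculation would notice.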



Source: https://stackoverflow.com/questions/3166851/estimating-error-on-calculations-using-decimals
