Math inaccuracy in Javascript: safe to use JS for important stuff?

Submitted by 我只是一个虾纸丫 on 2019-12-22 14:40:32

Question


I was bored, so I started fiddling around in the console, and stumbled onto this (ignore the syntax error):

A variable "test" holds a value; when I multiply it by 10,000 it suddenly changes into a different number (you could call it a rounding error, but that depends on how much accuracy you need). When I then multiply that number by 10, it changes back/again.
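(The screenshot isn't reproduced here; a minimal console session showing the same kind of behaviour might look like the following. The exact digits depend on the values involved, so treat it as illustrative.)

0.07 * 10000       // 700.0000000000001 — the "rounding error" appears
0.07 * 10000 * 10  // 7000.000000000001 — the error moves to a different digit
1.4 * 100          // 140 — here rounding happens to hide the error again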

That raises a few questions for me:
  • How inaccurate is JavaScript? Has this been quantified, i.e. is there a known error bound that can be taken into account?
  • Is there a way to fix this? I.e. to do math in Javascript with complete accuracy (within the limitations of its datatype).
  • Should the changed number after the second operation be interpreted as 'changing back to the original number' or 'changing again, because of the inaccuracy'?

I'm not sure whether this should be a separate question, but I was actually trying to round numbers to a certain number of digits after the decimal point. I've researched it a bit, and have found two methods:

 > Method A

function roundNumber(number, digits) {
    var multiple = Math.pow(10, digits);
    // Math.floor truncates, so this rounds down rather than to the
    // nearest value; use Math.round for conventional rounding.
    return Math.floor(number * multiple) / multiple;
}

 > Method B

function roundNumber(number, digits) {
    // toFixed rounds to `digits` decimals but returns a string,
    // so Number() converts it back into a number.
    return Number(number.toFixed(digits));
}
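Note that the two methods don't compute quite the same thing: Method A truncates (Math.floor always rounds down), while Method B rounds to the nearest value. Renaming them roundNumberA and roundNumberB just for this comparison:

roundNumberA(1.237, 2);  // 1.23 — floor drops the third decimal
roundNumberB(1.237, 2);  // 1.24 — toFixed rounds to nearest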


Intuitively I like method B more (it looks more efficient), but I don't know what's going on behind the scenes, so I can't really judge. Does anyone have an idea on that? Or a way to benchmark this? And why is there no native round_to_this_many_decimals function? (one that returns an integer, not a string)
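For the benchmarking part, a minimal and admittedly unscientific sketch using console.time (the labels and iteration count are arbitrary, and results vary by engine):

function roundA(number, digits) {
    var multiple = Math.pow(10, digits);
    return Math.floor(number * multiple) / multiple;
}
function roundB(number, digits) {
    return Number(number.toFixed(digits));
}

console.time('Method A');
for (var i = 0; i < 1e6; i++) roundA(Math.random() * 100, 2);
console.timeEnd('Method A');

console.time('Method B');
for (var i = 0; i < 1e6; i++) roundB(Math.random() * 100, 2);
console.timeEnd('Method B');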


Answer 1:


How inaccurate is JavaScript?

JavaScript uses standard double-precision floating-point numbers, so the precision limitations are the same as in any other language that uses them, which is most languages. It's the native format the processor uses to handle floating-point numbers.
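To make those limitations concrete, a couple of well-known console examples (the ** operator needs a modern engine; Math.pow(2, 53) is equivalent):

2 ** 53       // 9007199254740992 — integers are exact only up to 2^53
2 ** 53 + 1   // 9007199254740992 — 2^53 + 1 is not representable, so it rounds back
0.1 + 0.2     // 0.30000000000000004 — 0.1 and 0.2 have no exact binary form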

Is there a way to fix this? I.e. to do math in Javascript with complete accuracy (within the limitations of its datatype).

No. The precision limitation lies in the way the number is stored. Floating-point numbers don't have complete accuracy, so no matter how you do the calculations you can't achieve absolute accuracy, as the result goes back into a floating-point number.

If you want complete accuracy then you need to use a different data type.
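For example, exact monetary arithmetic is normally done in a different representation, such as whole cents (integers are exactly representable up to 2^53) or, in modern engines, arbitrary-precision BigInt. A minimal sketch, with made-up variable names:

// Keep money in whole cents so every intermediate value is an
// exactly representable integer; convert for display only at the end.
var priceInCents = 1999;              // $19.99
var totalInCents = priceInCents * 3;  // 5997, exact
console.log(totalInCents / 100);      // 59.97 — one final conversion

// Modern engines also have BigInt for arbitrary-precision integers:
console.log(10n ** 30n + 1n);         // 1000000000000000000000000000001n — exact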

Should the changed number after the second operation be interpreted as 'changing back to the original number' or 'changing again, because of the inaccuracy'?

It's changing again.

When a number is converted to text to be displayed, it's rounded to a certain number of digits. The numbers that look exact aren't; it's just that the limitations in precision don't show up.

When the number "changes back", it's just that the rounding again hides the limitations in the precision. Each calculation adds or subtracts a small inaccuracy, and sometimes that just happens to take the number closer to the one you had originally. Even though it looks more accurate, it's actually less accurate, as each calculation adds a bit of uncertainty.
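This accumulation is easy to see in the console: adding 0.1 ten times does not produce exactly 1.

var sum = 0;
for (var i = 0; i < 10; i++) {
    sum += 0.1;  // each addition rounds, and the errors accumulate
}
console.log(sum);        // 0.9999999999999999
console.log(sum === 1);  // false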




Answer 2:


Internally, JavaScript uses 64-bit IEEE 754 floating-point numbers, which are a widely used standard and usually guarantee about 16 significant digits of accuracy. The error you witnessed was in the 17th significant digit of the number and was really tiny.
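You can see those hidden digits by asking for more precision than the default string conversion shows:

console.log(0.1);                    // "0.1" — default output rounds to the shortest form
console.log((0.1).toPrecision(21));  // "0.100000000000000005551" — the value actually stored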

Is there a way to [...] do math in Javascript with complete accuracy (within the limitations of its datatype).

I would say that JavaScript's math is completely accurate within the limitations of its datatype. The error you witnessed was outside of those limitations.

Are you working with calculations that require a higher degree of precision than that?

Should the changed number after the second operation be interpreted as 'changing back to the original number' or 'changing again, because of the inaccuracy'?

The number never really became more or less accurate than the original value. It was only when the value was converted into a decimal value that a rounding error became apparent. But this was not a case of the value "changing back" to an accurate number. The rounding error was just too small to display.

And why is there no native round_to_this_many_decimals function? (one that returns an integer, not a string)

"Why is the language this way" questions are not considered very productive here, but it is easy to get around this limitation (assuming you mean numbers and not integers). This answer has 337 upvotes: +numb.toFixed(digits);, but note that if you try to display a number produced with that expression, there's no guarantee that it will actually display with only six digits. That's probably one of the reasons why JavaScript's "round to N places" function produces a string and not a number.



Source: https://stackoverflow.com/questions/23500772/math-inaccuracy-in-javascript-safe-to-use-js-for-important-stuff
