JavaScript Integer math incorrect results

Submitted by 左心房为你撑大大i on 2019-11-30 05:31:06

Per the ECMAScript standard, all numbers in JavaScript are (64-bit IEEE 754) floating-point numbers.

However, all 32-bit integers can be represented exactly as floating-point numbers. You can force a result to 32 bits by using the appropriate bitwise operator, like this:

x = (a * b) >>> 0;  // force to unsigned int32
x = (a * b) | 0;    // force to signed int32

Weird, but that's the standard.
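To see what those two coercions actually do at the 32-bit boundary, here is a small sketch (not from the original answer). It also shows `Math.imul`, which ES2015 added for a true 32-bit multiply:

```javascript
// Illustrative only: wrap-around behavior of the two coercions above.
var big = 0x7fffffff + 1;   // 2147483648, one past the signed int32 range

console.log(big | 0);       // -2147483648 (wraps around as signed int32)
console.log(big >>> 0);     // 2147483648  (kept as unsigned int32)

// Note: the coercion happens *after* the floating-point multiply, so it
// truncates the result to 32 bits; it does not make the multiply exact.
// For an exact 32-bit multiply, ES2015 added Math.imul:
console.log(Math.imul(0x7fffffff, 2));  // -2 (2147483647 * 2 mod 2^32, as int32)
```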

(Incidentally this rounding behavior is one of the most frequently reported "bugs" against Firefox's JavaScript engine. Looks like it's been reported 3 times so far this year...)

As for reproducible random numbers in JavaScript, the V8 benchmark uses this:

// To make the benchmark results predictable, we replace Math.random
// with a 100% deterministic alternative.
Math.random = (function() {
  var seed = 49734321;
  return function() {
    // Robert Jenkins' 32 bit integer hash function.
    seed = ((seed + 0x7ed55d16) + (seed << 12))  & 0xffffffff;
    seed = ((seed ^ 0xc761c23c) ^ (seed >>> 19)) & 0xffffffff;
    seed = ((seed + 0x165667b1) + (seed << 5))   & 0xffffffff;
    seed = ((seed + 0xd3a2646c) ^ (seed << 9))   & 0xffffffff;
    seed = ((seed + 0xfd7046c5) + (seed << 3))   & 0xffffffff;
    seed = ((seed ^ 0xb55a4f09) ^ (seed >>> 16)) & 0xffffffff;
    return (seed & 0xfffffff) / 0x10000000;
  };
})();
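As a sketch of how you might use this hash for reproducible runs (the `makeRandom` factory is a name invented here; the benchmark itself just patches `Math.random` in place): two generators built from the same seed produce identical sequences.

```javascript
// Same hash as above, parameterized by seed so instances can be compared.
// makeRandom is a hypothetical helper, not part of the V8 benchmark.
function makeRandom(seed) {
  return function() {
    // Robert Jenkins' 32 bit integer hash function.
    seed = ((seed + 0x7ed55d16) + (seed << 12))  & 0xffffffff;
    seed = ((seed ^ 0xc761c23c) ^ (seed >>> 19)) & 0xffffffff;
    seed = ((seed + 0x165667b1) + (seed << 5))   & 0xffffffff;
    seed = ((seed + 0xd3a2646c) ^ (seed << 9))   & 0xffffffff;
    seed = ((seed + 0xfd7046c5) + (seed << 3))   & 0xffffffff;
    seed = ((seed ^ 0xb55a4f09) ^ (seed >>> 16)) & 0xffffffff;
    return (seed & 0xfffffff) / 0x10000000;
  };
}

var r1 = makeRandom(49734321);
var r2 = makeRandom(49734321);
for (var i = 0; i < 5; i++) {
  console.log(r1() === r2());  // true every time: same seed, same sequence
}
```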

When an integer in JavaScript grows too big to be represented exactly (above 2^53), it is stored as the nearest representable floating-point value. Since floating-point values carry only limited precision, rounding can occur on big values.
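You can demonstrate this rounding with the two numbers discussed here (this sketch assumes an ES2020 engine with BigInt, used only to get the exact product for comparison):

```javascript
// The product of these two integers needs 57 bits, more than the 53-bit
// significand of a double, so the floating-point result is rounded.
var a = 119106029;
var b = 1103515245;

var approx = a * b;                  // double multiply, rounded
var exact  = BigInt(a) * BigInt(b);  // arbitrary-precision multiply

console.log(exact.toString());           // "131435318772912105"
console.log(BigInt(approx) === exact);   // false: the double was rounded
console.log(approx > Number.MAX_SAFE_INTEGER);  // true: above 2^53 - 1
```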

If done in C/C++ with double, the last digits will be ...112 instead of ...105 (which is correct). If performed with long double, the result is as expected (...105), since an x87 long double's 64-bit significand can hold the 57-bit product exactly. So it looks like the JavaScript interpreter converts the numbers to 8-byte doubles internally, does the calculation, and then rounds the printed decimal representation, which happens to look marginally better than the C/C++ standard double output.

GCC 4.5:

#include <stdio.h>

int main(int argc, char** argv)
{
    long double a = 119106029;
    long double b = 1103515245;
    long double c = a * b;
    printf("%.0Lf\n", c);

    return 0;
}

Result:

131435318772912105

Expected:

131435318772912105

So I don't see a way to get the exact result in JavaScript without the aid of a BIGNUM library (if one exists).
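For what it's worth, JavaScript has since gained native arbitrary-precision integers (BigInt, standardized in ES2020), which compute this product exactly without any third-party BIGNUM library:

```javascript
// Assumes an ES2020 engine (Node 12+, modern browsers) with BigInt support.
const exact = 119106029n * 1103515245n;  // the 'n' suffix makes a BigInt literal
console.log(exact.toString());           // "131435318772912105", the exact result

// BigInt and Number don't mix implicitly; convert explicitly with BigInt():
// 119106029n * 1103515245 would throw a TypeError.
```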

Regards

rbo
