In the accepted answer to my earlier question (What is the fastest way to generate a random integer in javascript?), a number loses its decimals via the | symbol, and I was wondering how.
For example:
var x = 5.12042;
x = x|0;
How does that floor the number to 5?
Some more examples:
console.log( 104.249834 | 0 ); //104
console.log( 9.999999 | 0 ); // 9
Because, according to the ECMAScript specification, bitwise operators call ToInt32 on each expression to be evaluated.
See 11.10 Binary Bitwise Operators:
The production A : A @ B, where @ is one of the bitwise operators in the productions above, is evaluated as follows:
1. Evaluate A.
2. Call GetValue(Result(1)).
3. Evaluate B.
4. Call GetValue(Result(3)).
5. Call ToInt32(Result(2)).
6. Call ToInt32(Result(4)).
7. Apply the bitwise operator @ to Result(5) and Result(6). The result is a signed 32 bit integer.
8. Return Result(7).
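Note that ToInt32 truncates toward zero and wraps modulo 2^32, so |0 only behaves like Math.floor for non-negative numbers within the signed 32-bit range. A few quick illustrations:
console.log( -5.12042 | 0 );     // -5, not -6: truncation toward zero, not a true floor
console.log( 2147483647.5 | 0 ); // 2147483647, still inside the int32 range
console.log( 2147483648.5 | 0 ); // -2147483648: the value wrapped around after ToInt32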
Bitwise operators convert their arguments to integers (see http://es5.github.com/#x9.5). Most languages I know don't support this type of conversion:
$ python -c "1.0|0"
Traceback (most recent call last):
  File "<string>", line 1, in <module>
TypeError: unsupported operand type(s) for |: 'float' and 'int'
$ ruby -e '1.0|0'
-e:1:in `<main>': undefined method `|' for 1.0:Float (NoMethodError)
$ echo "int main(){1.0|0;}" | gcc -xc -
<stdin>: In function ‘main’:
<stdin>:1: error: invalid operands to binary | (have ‘double’ and ‘int’)
When doing a floor, although it would be possible to convert the argument to an integer, that is not what most languages do, because the result should keep the original floating-point type. A better way to do it while preserving the data type is to go exponent digits into the mantissa and zero out the remaining (fractional) bits.
If you're interested, take a look at the IEEE 754 specification for floating-point numbers.
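A minimal sketch of that idea in JavaScript, assuming a non-negative, finite input: it reads the double's IEEE 754 bits through a DataView, derives how many of the 52 mantissa bits are fractional from the unbiased exponent, and zeroes them. The function name bitFloor and its structure are purely illustrative, not part of the original answer.
function bitFloor(x) {
  // View the 64-bit IEEE 754 representation of x (big-endian by default).
  var view = new DataView(new ArrayBuffer(8));
  view.setFloat64(0, x);

  var hi = view.getUint32(0);                  // sign + 11 exponent bits + top 20 mantissa bits
  var exponent = ((hi >>> 20) & 0x7ff) - 1023; // unbiased exponent

  if (exponent < 0) return 0;   // |x| < 1: the whole mantissa is fractional
  if (exponent >= 52) return x; // already an integer (or Infinity/NaN)

  // The low (52 - exponent) mantissa bits encode the fraction; zero them out.
  var fractionalBits = 52 - exponent;
  if (fractionalBits >= 32) {
    view.setUint32(4, 0);                                        // clear the low word entirely
    view.setUint32(0, hi & ~((1 << (fractionalBits - 32)) - 1)); // plus the rest in the high word
  } else {
    view.setUint32(4, view.getUint32(4) & ~((1 << fractionalBits) - 1));
  }
  return view.getFloat64(0);
}
console.log(bitFloor(5.12042));    // 5
console.log(bitFloor(104.249834)); // 104
For negative inputs this bit-zeroing truncates toward zero (just like |0 does) rather than flooring, which is why the sketch is limited to non-negative values.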
Source: https://stackoverflow.com/questions/9049677/how-does-x0-floor-the-number-in-javascript