I recently found this piece of JavaScript code:
Math.random() * 0x1000000 << 0
I understood that the first part was just generating a random number, but I couldn't work out what the << 0 at the end was for.
Math.random() returns a number between 0 (inclusive) and 1 (exclusive). Multiplying it by a whole number usually leaves a decimal portion. The << operator is a shortcut for eliminating that decimal portion:
The operands of all bitwise operators are converted to signed 32-bit integers in big-endian order and in two's complement format.
The above statement means that the JavaScript engine implicitly converts both operands of the << operator to 32-bit integers; for numbers it does so by chopping off the fractional portion (numbers that do not fit into the 32-bit integer range lose more than just the decimal portion).
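Here is a quick sketch of that behavior in the console (the sample values are mine, chosen to illustrate the truncation and the 32-bit wrap-around):
console.log(1234.56789 << 0);    // 1234
console.log(-1234.56789 << 0);   // -1234 (truncates toward zero, unlike Math.floor)
console.log(4294967296.5 << 0);  // 0 (outside the signed 32-bit range, so it wraps around)
console.log(Math.random() * 0x1000000 << 0); // a random integer between 0 and 0xFFFFFF
Since 0x1000000 is 16777216, the original expression yields a whole number between 0 and 16777215 (0xFFFFFF).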
And is it just a nuance of JavaScript, or does it occur in other languages as well?
You'll notice similar behavior in other loosely typed languages. PHP, for example:
var_dump(1234.56789 << 0);
// int(1234)
In strongly typed languages, the compiler will usually refuse to compile such code. C# complains like this:
Console.Write(1234.56789 << 0);
// error CS0019: Operator '<<' cannot be applied to operands of type 'double' and 'int'
For these languages, you already have type-casting operators:
Console.Write((int)1234.56789);
// 1234
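If you want the same kind of explicitness in JavaScript, Math.trunc (or Math.floor for non-negative values) performs the same truncation without being limited to the 32-bit range; a small sketch:
console.log(Math.trunc(1234.56789));                 // 1234
console.log(Math.trunc(Math.random() * 0x1000000));  // same result as the << 0 trick
console.log(Math.trunc(4294967296.5));               // 4294967296 (no 32-bit wrap-around)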