I wrote a simple benchmark to find out whether the bounds check can be eliminated when the array index is computed via a bitwise AND. This is basically what nearly all hash tables do.
In order to safely eliminate that bounds check, it is necessary to prove that `h & (table.length - 1)` is guaranteed to produce a valid index into `table`. It won't if `table.length` is zero (as you'll end up with `& -1`, an effective no-op). It also won't usefully do it if `table.length` is not a power of 2 (you'll lose information; consider the case where `table.length` is 17).
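To make both failure modes concrete, here is a small sketch (the hash value 25 is just illustrative) showing what the mask produces for a power-of-two length, a length of 17, and a length of zero:

```java
public class MaskDemo {
    public static void main(String[] args) {
        int h = 25; // some hash value, binary 11001

        // Power-of-two length: the mask keeps exactly the low bits,
        // so the result is always in [0, length).
        int pow2Len = 16;
        System.out.println(h & (pow2Len - 1)); // 25 & 15 = 9

        // Length 17: 17 - 1 = 16 = 0b10000, so the mask keeps only
        // bit 4 and discards every other bit of the hash. The only
        // possible indices are 0 and 16 -- still in bounds, but most
        // of the table is never used.
        int oddLen = 17;
        System.out.println(h & (oddLen - 1)); // 25 & 16 = 16

        // Zero length: 0 - 1 = -1 has all bits set, so the "mask"
        // is a no-op and any h, including out-of-range values,
        // passes straight through as the index.
        int zeroLen = 0;
        System.out.println(h & (zeroLen - 1)); // 25 & -1 = 25
    }
}
```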
How can the HotSpot compiler know that these bad conditions never hold? It has to be more conservative than a programmer can be, as the programmer can know more about the high-level constraints on the system (e.g., that the array is never empty and always has a number of elements that is a power of two).
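A programmer typically enforces that power-of-two invariant at construction time by rounding the requested capacity up. The sketch below uses the same bit trick as `java.util.HashMap.tableSizeFor`; the method and class names here are illustrative:

```java
public class TableSize {
    // Round cap up to the next power of two, minimum 1.
    // -1 >>> nlz(cap - 1) yields a run of ones just covering cap - 1;
    // adding 1 turns it into the next power of two.
    static int tableSizeFor(int cap) {
        int n = -1 >>> Integer.numberOfLeadingZeros(cap - 1);
        return (n < 0) ? 1 : n + 1; // n < 0 only when cap <= 1
    }

    public static void main(String[] args) {
        System.out.println(tableSizeFor(1));  // 1
        System.out.println(tableSizeFor(17)); // 32
        System.out.println(tableSizeFor(64)); // 64
    }
}
```

With the table always sized this way, `table.length` is never zero and always a power of two, so `h & (table.length - 1)` is in bounds by construction; the point of the benchmark is whether the JIT can discover that on its own.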