I was going through Java's HashMap source code when I saw the following
//The default initial capacity - MUST be a power of two.
static final int DEFAULT_INITIAL_CAPACITY = 1 << 4; // aka 16
The ideal situation would actually be a prime-number size for the backing array of a HashMap: keys are then more naturally distributed across the array, even when their hash codes share common factors. However, a prime-sized table forces the index to be computed with a modulo operation, which is noticeably more expensive than the single bitwise AND that a power-of-two size allows.
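As a minimal sketch of that difference (not the actual HashMap code; `hash`, `primeCapacity` and `powerOfTwoCapacity` are hypothetical local names), the bucket index can be derived either way:

    public class IndexDemo {
        public static void main(String[] args) {
            int hash = 123456789;

            // Prime-sized table: index needs a modulo operation.
            int primeCapacity = 17;
            int primeIndex = (hash & 0x7fffffff) % primeCapacity; // mask the sign bit to avoid a negative index

            // Power-of-two table: the modulo reduces to a single bitwise AND,
            // because capacity - 1 is a mask of all low-order bits (16 - 1 = 0b1111).
            int powerOfTwoCapacity = 16;
            int maskedIndex = hash & (powerOfTwoCapacity - 1);

            System.out.println("prime index:        " + primeIndex);
            System.out.println("power-of-two index: " + maskedIndex);
        }
    }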
In a sense, a power-of-two table size is the worst choice you can imagine, because only the low-order bits of the hash select the bucket, so poor hashCode implementations are more likely to produce key collisions in the array.
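For example, with a hypothetical poor hashCode that only produces multiples of 16, a table of size 16 would map every key to bucket 0 if nothing else were done:

    public class CollisionDemo {
        public static void main(String[] args) {
            int capacity = 16;                     // power-of-two table size
            for (int i = 0; i < 5; i++) {
                int poorHash = i * 16;             // hypothetical hashCode whose low bits never vary
                int bucket = poorHash & (capacity - 1);
                System.out.println("hash " + poorHash + " -> bucket " + bucket); // always bucket 0
            }
        }
    }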
Therefore you'll find another very important method in Java's HashMap implementation, hash(int), which compensates for poor hashCodes by mixing the high-order bits of the hash into the low-order bits.
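As a rough illustration of the idea (a sketch, not the exact JDK code, whose details differ between versions), a supplemental hash can XOR the high half of the hash into the low half so that a power-of-two mask still sees some of the high-order information:

    public class SpreadDemo {
        // Sketch of a supplemental hash: fold the high 16 bits into the low 16 bits.
        static int spread(int h) {
            return h ^ (h >>> 16);
        }

        public static void main(String[] args) {
            int capacity = 16;
            for (int i = 0; i < 5; i++) {
                int poorHash = i * 65536;   // hypothetical hashCode that only differs in its high bits
                int bucket = spread(poorHash) & (capacity - 1);
                System.out.println("hash " + poorHash + " -> bucket " + bucket); // now spread across buckets 0..4
            }
        }
    }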