Since I'm working on time complexity, I've been searching through the Oracle Java class library documentation for the time complexity of some standard methods used on Lists, Maps, and other collections.
On average, HashMap insertion, deletion, and search take O(1), i.e. constant, time. That said, in the worst case Java takes O(n) time for searching, insertion, and deletion.
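For concreteness, here is what those average-case O(1) operations look like in practice (the class name and data are illustrative, not from any particular library):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the average-case O(1) HashMap operations.
public class HashMapOps {
    public static int demo() {
        Map<String, Integer> ages = new HashMap<>();
        ages.put("alice", 30);          // insertion: O(1) on average
        ages.put("bob", 25);
        int a = ages.get("alice");      // search: O(1) on average
        ages.remove("bob");             // deletion: O(1) on average
        return a + ages.size();         // 30 + 1
    }

    public static void main(String[] args) {
        System.out.println(demo());
    }
}
```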
Mind you, the time complexity of HashMap depends on the load factor n/b (the number of entries in the hash table divided by the total number of buckets) and on how efficiently the hash function distributes keys. By efficient I mean that a poor hash function may map two very different objects to the same bucket (this is called a collision). There are various ways of dealing with this, known as collision resolution techniques.
Java uses chaining and rehashing to handle collisions.
Chaining drawbacks: in the worst case, deletion and search take O(n), because it might happen that all objects are mapped to one particular bucket, whose chain eventually grows to length n. (As a side note, since Java 8 a sufficiently long chain is converted into a balanced tree, which improves lookups in such a bucket to O(log n).)
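A quick way to observe the chaining worst case is a key type whose hashCode is constant, so every entry lands in the same bucket. BadKey here is a made-up name for illustration:

```java
import java.util.HashMap;
import java.util.Map;

// A deliberately bad key: every instance hashes to the same bucket,
// so all entries end up in one chain (worst-case behavior).
public class BadKeyDemo {
    static final class BadKey {
        final int id;
        BadKey(int id) { this.id = id; }
        @Override public int hashCode() { return 42; }   // constant hash: every key collides
        @Override public boolean equals(Object o) {
            return o instanceof BadKey && ((BadKey) o).id == id;
        }
    }

    public static int lookup(int n, int target) {
        Map<BadKey, Integer> m = new HashMap<>();
        for (int i = 0; i < n; i++) m.put(new BadKey(i), i);  // all n keys collide
        return m.get(new BadKey(target));  // must walk the single bucket's entries
    }

    public static void main(String[] args) {
        System.out.println(lookup(1000, 999));
    }
}
```

The map still returns correct results; the cost is that every get/put has to scan the one overloaded bucket instead of jumping straight to the entry.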
Rehashing drawbacks: Java uses a default load factor (n/b) of 0.75 as the rehashing threshold. To my knowledge, chaining requires O(1 + n/b) lookup operations on average, so as long as rehashing keeps n/b below a constant (say, n/b < 0.99), lookups stay constant time on average. Rehashing itself, however, gets expensive when the table is massive, and in that case, if we use such a map in a real-time application, the response time during a rehash could be problematic.
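One way to sidestep rehashing pauses is to pre-size the map when the entry count is known in advance, using HashMap's capacity/load-factor constructor. A minimal sketch (the class name and numbers are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

// Pre-sizing a HashMap so that inserting n entries never triggers a rehash.
public class PreSizedMap {
    public static int fill(int n) {
        // Pick a capacity so that n entries never exceed capacity * loadFactor.
        int capacity = (int) Math.ceil(n / 0.75);
        Map<Integer, Integer> m = new HashMap<>(capacity, 0.75f);
        for (int i = 0; i < n; i++) m.put(i, i * i);  // no resize during these inserts
        return m.size();
    }

    public static void main(String[] args) {
        System.out.println(fill(10_000));
    }
}
```

This trades a bit of memory up front for predictable insert latency, which is exactly the concern for real-time workloads.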
In the worst case, then, a Java HashMap takes O(n) time to search, insert, and delete.