Since I'm working around time complexity, I've been searching through the Oracle Java class library for the time complexity of some standard methods used on Lists, Maps, and so on.
The source is often helpful: http://kickjava.com/src/java/util/HashMap.java.htm
remove: O(1)
size: O(1)
values: O(n) (on traversal through the iterator)

The code for remove (as in rt.jar for HashMap) is:
/**
 * Removes and returns the entry associated with the specified key
 * in the HashMap. Returns null if the HashMap contains no mapping
 * for this key.
 */
final Entry<K,V> removeEntryForKey(Object key) {
    int hash = (key == null) ? 0 : hash(key.hashCode());
    int i = indexFor(hash, table.length);
    Entry<K,V> prev = table[i];
    Entry<K,V> e = prev;

    while (e != null) {
        Entry<K,V> next = e.next;
        Object k;
        if (e.hash == hash &&
            ((k = e.key) == key || (key != null && key.equals(k)))) {
            modCount++;
            size--;
            if (prev == e)
                table[i] = next;
            else
                prev.next = next;
            e.recordRemoval(this);
            return e;
        }
        prev = e;
        e = next;
    }

    return e;
}
Clearly, the worst case is O(n).
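To see that worst case concretely, here is a small self-contained sketch (BadKey is my own contrivance, not anything from the JDK): every key reports the same hashCode, so every entry lands in a single bucket and remove() has to walk the whole chain.

import java.util.HashMap;
import java.util.Map;

// Contrived key: a constant hashCode forces every entry into the same
// bucket, so remove() must traverse the entire chain -- the O(n) worst case.
final class BadKey {
    private final int id;
    BadKey(int id) { this.id = id; }
    @Override public int hashCode() { return 42; }   // every key collides
    @Override public boolean equals(Object o) {
        return o instanceof BadKey && ((BadKey) o).id == id;
    }
}

public class WorstCaseDemo {
    public static void main(String[] args) {
        Map<BadKey, Integer> map = new HashMap<>();
        for (int i = 0; i < 10_000; i++)
            map.put(new BadKey(i), i);    // each put already scans the chain
        long start = System.nanoTime();
        map.remove(new BadKey(0));        // scans up to 10,000 colliding entries
        System.out.printf("remove took %d ns%n", System.nanoTime() - start);
    }
}

Note that on Java 8+ an oversized bucket may be converted to a tree, so the measured numbers vary across JDKs; the plain chain walk matches the pre-Java-8 source shown above.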
On average, the time complexity of HashMap insertion, deletion, and search is O(1), i.e. constant time. That said, in the worst case Java takes O(n) time for searching, insertion, and deletion.
Mind you, the time complexity of HashMap depends on the load factor n/b (the number of entries present in the hash table divided by the total number of buckets) and on how well the hash function distributes each insert. By "well" I mean that a hash function might map two very different objects to the same bucket (this is called a collision). There are various methods of resolving collisions, known as collision resolution techniques; Java uses chaining and rehashing to handle collisions.
Chaining drawbacks: In the worst case, deletion and searching take O(n), since it might happen that all objects are mapped to one particular bucket, whose chain eventually grows to length n.
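For illustration, here is a minimal sketch of separate chaining (my own simplified code, not the real HashMap source; it assumes non-null keys and never resizes):

// Minimal separate-chaining map: each bucket holds a singly linked list.
// If every key hashes to the same bucket, get() degenerates into an
// O(n) list walk -- exactly the drawback described above.
class ChainedMap<K, V> {
    private static final class Node<K, V> {
        final K key;
        V value;
        Node<K, V> next;
        Node(K key, V value, Node<K, V> next) {
            this.key = key; this.value = value; this.next = next;
        }
    }

    @SuppressWarnings("unchecked")
    private final Node<K, V>[] table = (Node<K, V>[]) new Node[16];

    private int indexFor(K key) {
        return (key.hashCode() & 0x7fffffff) % table.length;  // non-null keys only
    }

    public void put(K key, V value) {
        int i = indexFor(key);
        for (Node<K, V> e = table[i]; e != null; e = e.next) {
            if (e.key.equals(key)) { e.value = value; return; }  // overwrite existing key
        }
        table[i] = new Node<>(key, value, table[i]);  // O(1): insert at chain head
    }

    public V get(K key) {
        for (Node<K, V> e = table[indexFor(key)]; e != null; e = e.next) {
            if (e.key.equals(key)) return e.value;    // O(chain length) scan
        }
        return null;
    }
}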
Rehashing drawbacks: Java uses a load factor (n/b) of 0.75 as the rehashing limit (to my knowledge, chaining requires O(1 + n/b) lookup operations on average; if rehashing keeps n/b bounded below a constant such as 0.99, that stays constant time). Rehashing gets out of hand when the table is massive, and in that case, if we use the map for real-time applications, response time could be problematic.
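One common mitigation, sketched below under the assumption that you know the entry count up front, is to presize the map so n/b never crosses 0.75 and no rehash runs during the fill:

import java.util.HashMap;
import java.util.Map;

public class PresizeDemo {
    public static void main(String[] args) {
        int expected = 1_000_000;

        // Default capacity (16): the table doubles and rehashes repeatedly
        // as the load factor crosses 0.75 on the way to a million entries.
        Map<Integer, Integer> grown = new HashMap<>();

        // Presized so expected / capacity stays below 0.75: no rehash occurs.
        Map<Integer, Integer> presized = new HashMap<>((int) (expected / 0.75f) + 1);

        for (int i = 0; i < expected; i++) {
            grown.put(i, i);
            presized.put(i, i);
        }
        // The maps end up identical; only the intermediate rehashing work differs.
        System.out.println(grown.size() == presized.size());
    }
}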
In the worst case, then, a Java HashMap takes O(n) time to search, insert, and delete.
Search: O(1 + k)
Insert: O(1)
Delete: O(1 + k)

where k is the number of colliding elements added to the same LinkedList (k elements hashed to the same bucket).
Insertion is O(1) because you add the element right at the head of the LinkedList.
Amortized time complexities are close to O(1) given a good hash function. If you are too concerned about lookup time, then try resolving the collisions using a binary search tree instead of Java's default implementation, i.e. a LinkedList.
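Java 8's HashMap in fact does this: once a bucket's chain passes a threshold (8 in the OpenJDK sources), it is converted into a red-black tree, so lookups among colliding Comparable keys drop to O(log n). A quick sketch of the effect (CollidingKey and its constant hashCode are contrived for the demo):

import java.util.HashMap;
import java.util.Map;

// Contrived key: constant hashCode forces one bucket, but implementing
// Comparable lets a Java 8+ HashMap organize that bucket as a red-black
// tree, making lookups O(log n) instead of O(n).
final class CollidingKey implements Comparable<CollidingKey> {
    private final int id;
    CollidingKey(int id) { this.id = id; }
    @Override public int hashCode() { return 1; }
    @Override public boolean equals(Object o) {
        return o instanceof CollidingKey && ((CollidingKey) o).id == id;
    }
    @Override public int compareTo(CollidingKey other) {
        return Integer.compare(id, other.id);
    }
}

public class TreeBinDemo {
    public static void main(String[] args) {
        Map<CollidingKey, Integer> map = new HashMap<>();
        for (int i = 0; i < 100_000; i++)
            map.put(new CollidingKey(i), i);
        // On Java 8+, this walks a tree of depth ~17, not a 100,000-long list.
        System.out.println(map.get(new CollidingKey(99_999)));
    }
}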
You can always take a look at the source code and check it yourself.
Anyway... I once checked the source code, and what I remember is that there is a field named size that always holds the number of items in the HashMap, so size() is O(1).
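That matches the removeEntryForKey listing above, which decrements size on every removal. A minimal sketch of the idea (my own illustration, not the JDK source):

import java.util.HashMap;

// Sketch: keep a counter in sync on every structural change so that
// size() is a constant-time field read rather than an O(n) count.
class CountedMap<K, V> {
    private final HashMap<K, V> delegate = new HashMap<>();
    private int size;                            // maintained by put/remove

    public V put(K key, V value) {
        if (!delegate.containsKey(key)) size++;  // only new keys grow the map
        return delegate.put(key, value);
    }

    public V remove(K key) {
        if (delegate.containsKey(key)) size--;
        return delegate.remove(key);
    }

    public int size() { return size; }           // O(1): just return the field
}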