Rehashing process in hashmap or hashtable

南方客 2020-12-04 16:53

How is the rehashing process done in a HashMap or Hashtable when the size exceeds the max threshold value?

Are all pairs just copied to a new array of buckets?

3 Answers
  •  陌清茗 (OP)
     2020-12-04 17:37

    Hashing – Rehashing and Race condition

     When a HashMap is created, it is assigned a default capacity (2^4, i.e. 16 buckets). As elements are added and the map approaches its capacity, rehashing is required to maintain performance.

     The collection also defines a load factor (0.75 is considered a good value), which controls the trade-off between time and space:

    • A LARGER load factor => lower space consumption, but more collisions and therefore slower lookups
    • A SMALLER load factor => more space consumed relative to the number of elements, but faster lookups

     The Java documentation suggests 0.75 as a good default load factor.
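    As an illustration, both the initial capacity and the load factor can be passed to the HashMap constructor (the variable names below are arbitrary):

```java
import java.util.HashMap;
import java.util.Map;

public class LoadFactorDemo {
    public static void main(String[] args) {
        // Default: 16 buckets, load factor 0.75.
        Map<String, Integer> defaults = new HashMap<>();

        // Trade space for speed: a smaller load factor resizes earlier,
        // keeping buckets emptier and lookups cheaper.
        Map<String, Integer> sparse = new HashMap<>(16, 0.5f);

        // Trade speed for space: a larger load factor packs more entries
        // into each bucket before resizing.
        Map<String, Integer> dense = new HashMap<>(16, 0.9f);

        sparse.put("a", 1);
        dense.put("a", 1);
        System.out.println(sparse.get("a") + dense.get("a")); // 2
    }
}
```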

     For example, suppose you need to store at most 10 elements. With a load factor of 0.75, the threshold is 10 × 0.75 = 7.5, so rehashing would occur after about 7 elements have been added. If you never store more than 7 elements, rehashing would never occur.
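    The arithmetic above can be sketched as follows. Note this is a simplified model: java.util.HashMap additionally rounds any requested capacity up to the next power of two, so `new HashMap<>(10)` really allocates 16 buckets.

```java
// Sketch: when does a HashMap-style table resize?
// threshold = capacity * loadFactor; a resize happens once the
// number of stored entries exceeds this threshold.
public class ThresholdDemo {
    static int threshold(int capacity, float loadFactor) {
        return (int) (capacity * loadFactor);
    }

    public static void main(String[] args) {
        // With 16 buckets and the default 0.75 load factor,
        // the table resizes when the 13th entry is inserted.
        System.out.println(threshold(16, 0.75f)); // 12
        // With 10 buckets, the threshold from the example above.
        System.out.println(threshold(10, 0.75f)); // 7
    }
}
```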

     If a very large number of elements is going to be stored in a HashMap, it is better to create it with sufficient initial capacity; this is more efficient than letting it perform automatic rehashing repeatedly as it grows.
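    A minimal sketch of such pre-sizing, where capacityFor is a hypothetical helper that picks a capacity large enough that the expected number of entries never pushes the map past its threshold under the default 0.75 load factor:

```java
import java.util.HashMap;
import java.util.Map;

public class PresizeDemo {
    // Hypothetical helper: smallest capacity whose threshold
    // (capacity * 0.75) covers the expected number of entries.
    static int capacityFor(int expected) {
        return (int) Math.ceil(expected / 0.75);
    }

    public static void main(String[] args) {
        // Pre-sized map: inserting 10,000 entries never triggers a resize.
        Map<String, Integer> map = new HashMap<>(capacityFor(10_000));
        for (int i = 0; i < 10_000; i++) {
            map.put("key" + i, i);
        }
        System.out.println(map.size()); // 10000
    }
}
```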

     RACE condition: during rehashing, the entries stored in a bucket's linked list are relinked into the new table, which reverses their order (in pre-Java-8 implementations). If two threads happen to resize the map at the same time, this race can corrupt the links, and a thread may then fall into an infinite loop while traversing the bucket because the order has changed underneath it.
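    The reversal can be seen in a simplified sketch of the old (pre-Java-8) transfer step, assuming a bare singly linked Node type. Each entry is inserted at the head of the new bucket, so the list comes out reversed; it is exactly this relinking that, raced by two resizing threads, can leave a cycle in the list:

```java
public class TransferDemo {
    static class Node {
        final String key;
        Node next;
        Node(String key, Node next) { this.key = key; this.next = next; }
    }

    // Move every node from the old bucket into a new one by inserting
    // each node at the head of the new list. Single-threaded this is
    // correct but reverses the order; interleaved with another resize
    // it can produce a cycle.
    static Node transfer(Node oldHead) {
        Node newHead = null;
        for (Node e = oldHead; e != null; ) {
            Node next = e.next;   // save the successor before relinking
            e.next = newHead;     // head insertion into the new bucket
            newHead = e;
            e = next;
        }
        return newHead;
    }

    public static void main(String[] args) {
        Node bucket = new Node("a", new Node("b", new Node("c", null)));
        for (Node e = transfer(bucket); e != null; e = e.next) {
            System.out.print(e.key); // prints "cba": order reversed
        }
    }
}
```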
