Limiting the max size of a HashMap in Java

Submitted by 六眼飞鱼酱① on 2019-12-17 07:14:31

Question


I want to limit the maximum size of a HashMap so I can take metrics on a variety of hashing algorithms that I'm implementing. I looked at the load factor in one of HashMap's overloaded constructors.

HashMap(int initialCapacity, float loadFactor) 

I tried setting the loadFactor to 0.0f in the constructor (meaning that I never want the HashMap to grow), but the constructor rejects this at runtime:

Exception in thread "main" java.lang.IllegalArgumentException: Illegal load factor: 0.0
        at java.util.HashMap.<init>(HashMap.java:177)
        at hashtables.CustomHash.<init>(Main.java:20)
        at hashtables.Main.main(Main.java:70)
Java Result: 1

Is there another way to limit the size of HashMap so it doesn't grow ever?


Answer 1:


Sometimes simpler is better.

import java.util.HashMap;
import java.util.Map;

public class InstrumentedHashMap<K, V> implements Map<K, V> {

    // Maximum number of entries; the value here is just illustrative.
    private static final int MAX = 100;

    private final Map<K, V> map;

    public InstrumentedHashMap() {
        map = new HashMap<K, V>();
    }

    @Override
    public V put(K key, V value) {
        // Refuse new keys once the limit is reached; updating an existing key is still allowed.
        if (map.size() >= MAX && !map.containsKey(key)) {
            return null;
        }
        return map.put(key, value);
    }

    // ... delegate the remaining Map methods to the wrapped map
}
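
A brief usage sketch (assuming MAX is set to 2 and that the remaining Map methods simply delegate to the wrapped map; the demo class name is hypothetical):

import java.util.Map;

public class InstrumentedHashMapDemo {
    public static void main(String[] args) {
        Map<String, Integer> m = new InstrumentedHashMap<>();
        m.put("a", 1);
        m.put("b", 2);
        m.put("c", 3);    // silently ignored: limit reached and "c" is a new key
        m.put("a", 10);   // allowed: "a" already exists, only its value changes
        System.out.println(m.size()); // prints 2
    }
}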



Answer 2:


You could create a new class like this to limit the size of a HashMap:

import java.util.LinkedHashMap;
import java.util.Map;

public class MaxSizeHashMap<K, V> extends LinkedHashMap<K, V> {
    private final int maxSize;

    public MaxSizeHashMap(int maxSize) {
        this.maxSize = maxSize;
    }

    // Called by LinkedHashMap after every put/putAll; returning true removes the
    // eldest entry, so the map never holds more than maxSize entries.
    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxSize;
    }
}
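
For example (a rough usage sketch; the limit of 2 and the demo class name are arbitrary):

import java.util.Map;

public class MaxSizeHashMapDemo {
    public static void main(String[] args) {
        Map<String, Integer> m = new MaxSizeHashMap<>(2);
        m.put("a", 1);
        m.put("b", 2);
        m.put("c", 3);                  // the eldest entry ("a") is evicted
        System.out.println(m.keySet()); // prints [b, c]
    }
}

Note that this caps the size by evicting the oldest entry rather than refusing new ones, so it behaves like a small FIFO cache rather than a map that simply stops accepting puts.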



Answer 3:


The simplest solution is usually the best, so use an unmodifiable or immutable map.

If you cannot add or remove elements, the size is fixed by construction - problem solved.
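
For example, a minimal sketch using Collections.unmodifiableMap (Map.copyOf is a similar option on Java 10+; the demo class name is hypothetical):

import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class FixedMapDemo {
    public static void main(String[] args) {
        Map<String, Integer> backing = new HashMap<>();
        backing.put("a", 1);
        backing.put("b", 2);

        // The view rejects every structural change, so its size can never grow.
        Map<String, Integer> fixed = Collections.unmodifiableMap(backing);
        fixed.put("c", 3); // throws UnsupportedOperationException
    }
}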




Answer 4:


Another option is a small wrapper around a LinkedHashMap that evicts the eldest entry once the cache is full, guarded by a ReadWriteLock:

import java.util.LinkedHashMap;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class Cache {
    private LinkedHashMap<String, String> cache;
    private final int cacheSize;
    private final ReadWriteLock readWriteLock;

    public Cache(LinkedHashMap<String, String> psCacheMap, int size) {
        this.cache = psCacheMap;
        this.cacheSize = size;
        this.readWriteLock = new ReentrantReadWriteLock();
    }

    public void put(String sql, String pstmt) {
        Lock writeLock = readWriteLock.writeLock();
        writeLock.lock();
        try {
            // When the cache is full, evict the eldest entry (first key in insertion order).
            if (cache.size() >= cacheSize && cacheSize > 0) {
                String oldSql = cache.keySet().iterator().next();
                cache.remove(oldSql);
            }
            cache.put(sql, pstmt);
        } finally {
            writeLock.unlock();
        }
    }

    public String get(String sql) {
        Lock readLock = readWriteLock.readLock();
        readLock.lock();
        try {
            return cache.get(sql);
        } finally {
            readLock.unlock();
        }
    }

    public boolean containsKey(String sql) {
        Lock readLock = readWriteLock.readLock();
        readLock.lock();
        try {
            return cache.containsKey(sql);
        } finally {
            readLock.unlock();
        }
    }

    public String remove(String key) {
        Lock writeLock = readWriteLock.writeLock();
        writeLock.lock();
        try {
            return cache.remove(key);
        } finally {
            writeLock.unlock();
        }
    }

    public LinkedHashMap<String, String> getCache() {
        return cache;
    }

    public void setCache(LinkedHashMap<String, String> cache) {
        this.cache = cache;
    }
}
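
A quick usage sketch (the SQL strings, the limit of 2, and the demo class name are purely illustrative):

import java.util.LinkedHashMap;

public class CacheDemo {
    public static void main(String[] args) {
        Cache cache = new Cache(new LinkedHashMap<String, String>(), 2);
        cache.put("SELECT 1", "stmt1");
        cache.put("SELECT 2", "stmt2");
        cache.put("SELECT 3", "stmt3");                    // evicts the eldest entry ("SELECT 1")
        System.out.println(cache.containsKey("SELECT 1")); // false
        System.out.println(cache.get("SELECT 3"));         // stmt3
    }
}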



Answer 5:


The put method in the HashMap class is the one in charge of adding elements to the map, and it does so by calling a method named addEntry, whose code is as follows:

   void addEntry(int hash, K key, V value, int bucketIndex) {
        Entry<K,V> e = table[bucketIndex];
        table[bucketIndex] = new Entry<K,V>(hash, key, value, e);
        if (size++ >= threshold)
            resize(2 * table.length);
    } 

As you can see, this method is where the HashMap gets resized once the threshold is exceeded, so I would try extending HashMap and writing my own versions of put and addEntry that leave out the resizing. Something like:

package java.util;

public class MyHashMap<K, V> extends HashMap<K, V> {

    // Mirrors HashMap.putForNullKey, but delegates to myAddEntry so no resize can happen.
    private V myPutForNullKey(V value) {
        for (Entry<K, V> e = table[0]; e != null; e = e.next) {
            if (e.key == null) {
                V oldValue = e.value;
                e.value = value;
                e.recordAccess(this);
                return oldValue;
            }
        }
        modCount++;
        myAddEntry(0, null, value, 0);
        return null;
    }

    public V myPut(K key, V value) {
        if (key == null)
            return myPutForNullKey(value);
        // Only insert while there is still room: the number of entries is capped at table.length.
        if (size < table.length) {
            int hash = hash(key.hashCode());
            int i = indexFor(hash, table.length);
            for (Entry<K, V> e = table[i]; e != null; e = e.next) {
                Object k;
                if (e.hash == hash && ((k = e.key) == key || key.equals(k))) {
                    V oldValue = e.value;
                    e.value = value;
                    e.recordAccess(this);
                    return oldValue;
                }
            }

            modCount++;
            myAddEntry(hash, key, value, i);
        }
        return null;
    }

    // Same as HashMap.addEntry, minus the resize call.
    void myAddEntry(int hash, K key, V value, int bucketIndex) {
        Entry<K, V> e = table[bucketIndex];
        table[bucketIndex] = new Entry<K, V>(hash, key, value, e);
        size++;
    }
}

You would need to write your own methods, since addEntry and putForNullKey cannot simply be overridden from an ordinary subclass, and both are called from inside put. A check is also required in myPut to make sure we are not trying to add a new entry once the table is full. Bear in mind that this relies on the internals of older (pre-Java 8) HashMap implementations and on compiling the class into the java.util package, which modern JDKs no longer permit.



Source: https://stackoverflow.com/questions/5601333/limiting-the-max-size-of-a-hashmap-in-java
