Load a lot of properties files from a single text file and insert into a LinkedHashMap


Question


I have a text file that lists properties files line by line, maybe around 1000 of them, and each properties file has around 5000 key-value pairs. For example, a sample file (abc.txt) looks like this:

abc1.properties
abc2.properties
abc3.properties
abc4.properties
abc5.properties

So I open this file and, as each line is read, I load the named properties file in the loadProperties method and store its key-value pairs in a LinkedHashMap.

import java.io.BufferedReader;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.io.InputStream;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Properties;

public class Project {
    public static HashMap<String, String> hashMap;

    public static void main(String[] args) {
        BufferedReader br = null;
        hashMap = new LinkedHashMap<String, String>();
        try {
            br = new BufferedReader(new FileReader("C:\\apps\\apache\\tomcat7\\webapps\\examples\\WEB-INF\\classes\\abc.txt"));
            String line = null;
            while ((line = br.readLine()) != null) {
                loadProperties(line); // loads abc1.properties the first time, then the next file on each line
            }
        } catch (FileNotFoundException e1) {
            e1.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (br != null) {
                    br.close();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
    // I load each properties file in this method. For every key I check whether it
    // already exists in the hashMap; if it does, I concatenate the new value onto the
    // previously stored value, and keep doing so every time the key is found again.
    private static void loadProperties(String line) {
        Properties prop = new Properties();
        InputStream in = Project.class.getResourceAsStream(line);
        String value = null;
        try {
            prop.load(in);
            for (Object str : prop.keySet()) {
                if (hashMap.containsKey(str.toString())) {
                    // key already seen: concatenate the new value onto the stored one
                    StringBuilder sb = new StringBuilder().append(hashMap.get(str)).append("-").append(prop.getProperty((String) str));
                    hashMap.put(str.toString(), sb.toString());
                } else {
                    value = prop.getProperty((String) str);
                    hashMap.put(str.toString(), value);
                    System.out.println(str + " - " + value);
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (in != null) {
                    in.close();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

}

So my question is: I have more than 1000 properties files, each with more than 5000 key-value pairs, and most of the files share the same keys with different values, so I have to concatenate each value with the previously stored one when a key repeats. Is there any limitation on the size of the LinkedHashMap as the number of properties files, and the key-value pairs within them, keeps growing? And is this code optimized enough to handle this kind of problem?


Answer 1:


A Map has no limitation other than the size of the memory heap allocated to your JVM, which you can control with the -Xmx option.
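For example (the heap size and the bare class name here are just illustrative), the heap limit can be raised when launching the program:

    java -Xmx2g Project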

Your code is OK from a performance perspective.

But I can suggest the following improvements.

  1. Avoid calling hashMap.containsKey(str.toString()) and then hashMap.get(str). containsKey(key) is essentially return get(key) != null, so you end up doing the lookup twice. Do a single get() instead, something like this (a combined sketch applying this and the next suggestion follows the list):

    String newValue = prop.getProperty((String) str);
    String oldValue = hashMap.get(str);
    if (oldValue != null) {
        newValue = oldValue + "-" + newValue;
    }
    hashMap.put(str.toString(), newValue);

  2. Do not call str.toString(). The keys in a Properties object are already Strings, so the call adds nothing; since the Properties class is not parameterized, use a cast instead, i.e. (String) str.

  3. If you still have performance problems, you can merge all the properties files into one first and then load them with a single Properties.load() call. You will probably get some performance benefit.
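Putting suggestions 1 and 2 together, the key-handling loop in loadProperties might look roughly like this (a sketch based on the code in the question, using the same prop and hashMap variables, not a tested drop-in replacement):

    for (Object keyObj : prop.keySet()) {
        String key = (String) keyObj;              // cast instead of toString() (suggestion 2)
        String newValue = prop.getProperty(key);
        String oldValue = hashMap.get(key);        // single lookup instead of containsKey() + get() (suggestion 1)
        if (oldValue != null) {
            newValue = oldValue + "-" + newValue;  // concatenate with the previously stored value
        }
        hashMap.put(key, newValue);
    }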



Source: https://stackoverflow.com/questions/8208441/load-a-lot-of-properties-file-from-a-single-text-file-and-insert-into-linkedhash
