Is there a workaround for Java's poor performance on walking huge directories?

予麋鹿 · 2020-12-02 23:30

I am trying to process files one at a time that are stored over a network. Reading the files is fast thanks to buffering, so that is not the issue. The problem I have is just listing the directory contents.
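
For context, the kind of pattern that hits this wall is a plain listFiles() call, which materializes the entire directory listing before the first file can be touched. A hypothetical sketch (the class name SlowList is mine, not from the question):

    import java.io.File;

    public class SlowList {
        public static void main(String[] args) {
            // listFiles() reads and materializes the entire directory
            // listing before returning, so over a slow network nothing
            // can be processed until the whole listing has arrived.
            File[] all = new File(".").listFiles();
            if (all == null) {
                return; // not a directory, or an I/O error occurred
            }
            for (File f : all) {
                System.out.println("processing file " + f.getName());
            }
        }
    }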

10 Answers
  •  情书的邮戳
    2020-12-03 00:10

    How about using the File.list(FilenameFilter) method and implementing FilenameFilter.accept(File dir, String name) to process each file and return false? Since accept() always returns false, the returned array stays empty; every entry is handled inside the filter callback instead.

    I ran this on a Linux VM against a directory with 10K+ files and it took under 10 seconds.

    import java.io.File;
    import java.io.FilenameFilter;

    public class Temp {
        private static void processFile(File dir, String name) {
            File file = new File(dir, name);
            System.out.println("processing file " + file.getName());
        }

        private static void forEachFile(File dir) {
            // The filter handles each entry and always returns false,
            // so the returned array stays empty and is deliberately
            // ignored.
            String[] ignore = dir.list(new FilenameFilter() {
                public boolean accept(File dir, String name) {
                    processFile(dir, name);
                    return false;
                }
            });
        }

        public static void main(String[] args) {
            long before, after;
            File dot = new File(".");
            before = System.currentTimeMillis();
            forEachFile(dot);
            after = System.currentTimeMillis();
            System.out.println("after call, delta is " + (after - before));
        }
    }
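
    If Java 7 or later is an option, NIO.2's DirectoryStream gives similar per-entry processing without the FilenameFilter trick, and it genuinely fetches entries lazily. A minimal sketch under that assumption (the class name StreamList is mine):

    import java.io.IOException;
    import java.nio.file.DirectoryStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class StreamList {
        public static void main(String[] args) throws IOException {
            // newDirectoryStream() returns an Iterable that reads
            // entries lazily, so each file can be handled as soon as
            // it is read rather than after the full listing is built.
            try (DirectoryStream<Path> stream = Files.newDirectoryStream(Paths.get("."))) {
                for (Path entry : stream) {
                    System.out.println("processing file " + entry.getFileName());
                }
            }
        }
    }

    The try-with-resources block matters here: the stream holds an open directory handle until it is closed.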
    
