Fast Linux file count for a large number of files

Backend · open · 17 answers · 2648 views
名媛妹妹 2020-12-22 17:21

I'm trying to figure out the best way to find the number of files in a particular directory when there are a very large number of files (more than 100,000).

When the

17 answers
  •  感动是毒
    2020-12-22 17:47

    find, ls, and perl, tested against 40,000 files, all took the same time (though I didn't try to clear the cache first):

    [user@server logs]$ time find . | wc -l
    42917
    
    real    0m0.054s
    user    0m0.018s
    sys     0m0.040s
    
    [user@server logs]$ time /bin/ls -f | wc -l
    42918
    
    real    0m0.059s
    user    0m0.027s
    sys     0m0.037s
    
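    The counts above differ by one because the tools include different special entries: `find .` prints the starting directory itself (`.`), while `ls -f` also lists `..`. As a sketch, the directory itself can be excluded from the count (note that `-mindepth` is a GNU/BSD extension to find, not strict POSIX):

```shell
# Count only the entries under ".", excluding the starting directory itself.
# -mindepth 1 skips "." (a GNU/BSD find extension).
find . -mindepth 1 | wc -l
```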

    And Perl's opendir and readdir take the same time:

    [user@server logs]$ time perl -e 'opendir D, "."; @files = readdir D; closedir D; print scalar(@files)."\n"'
    42918
    
    real    0m0.057s
    user    0m0.024s
    sys     0m0.033s
    

    Note: I used /bin/ls to bypass any shell alias for ls (an alias can slow things down slightly) and -f to skip sorting the entries. Without -f, ls is twice as slow as find and perl; with -f, it takes about the same time:

    [user@server logs]$ time /bin/ls . | wc -l
    42916
    
    real    0m0.109s
    user    0m0.070s
    sys     0m0.044s
    
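    For reference, `/bin/ls` is only one way to dodge an alias; the shell offers a couple of others (assuming the thing being bypassed is an interactive alias such as `alias ls='ls --color=auto'`):

```shell
command ls -f | wc -l   # `command` skips aliases and shell functions
\ls -f | wc -l          # a leading backslash suppresses alias expansion
/bin/ls -f | wc -l      # an absolute path never consults the alias table
```

    All three should produce the same count; with GNU ls, -f also implies -a, so `.` and `..` are included.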

    I would also like a script that asks the file system directly, without all the unnecessary information.

    The tests were based on the answers of Peter van der Heijden, glenn jackman, and mark4o.
