Is there a way to efficiently yield every file in a directory containing millions of files?

Asked by 萌比男神i · 2020-12-01 18:53

I'm aware of os.listdir, but as far as I can gather, that gets all the filenames in the directory into memory and then returns the list. What I want is a way to yield one filename at a time, working on each before fetching the next, rather than loading them all up front.

6 Answers
  •  没有蜡笔的小新
    2020-12-01 19:23

    Since Python 2.5, the glob module has an iglob function which returns an iterator. An iterator is exactly for the purpose of not storing huge lists in memory.

    glob.iglob(pathname)
    Return an iterator which yields the same values as glob() without
    actually storing them all simultaneously.
    

    For example:

    import glob
    for eachfile in glob.iglob('*'):
        # act upon eachfile
        print(eachfile)
    
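    As a self-contained sketch of the laziness this answer describes: the snippet below builds a small throwaway directory (the temp-dir setup and the file count of 5 are illustrative assumptions, not part of the original answer) and shows that glob.iglob hands back matches one at a time rather than as a prebuilt list.

```python
import glob
import os
import tempfile

# Illustrative setup: create a small temporary directory with a few
# files, standing in for the "millions of files" case.
tmpdir = tempfile.mkdtemp()
for i in range(5):
    open(os.path.join(tmpdir, f'file{i}.txt'), 'w').close()

# glob.iglob returns an iterator, not a list: nothing is listed yet.
it = glob.iglob(os.path.join(tmpdir, '*.txt'))

first = next(it)        # only one filename has been produced so far
remaining = list(it)    # draining the iterator yields the rest
print(1 + len(remaining))   # prints 5
```

    With millions of files the same pattern applies: each iteration of the for loop pulls the next name on demand, so memory use stays flat regardless of directory size.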
