Faster way to find large files with Python?

走了就别回头了 · 2021-01-16 20:20

I am trying to use Python to find a faster way to sift through a large directory (approx. 1.1 TB) containing around 9 subdirectories, in order to find files larger than, say, 200 GB.

2 Answers
  •  无人及你
    2021-01-16 21:00

    It is hard to imagine that you will find a significantly faster way to traverse a directory tree than os.walk() or du. Parallelizing the search might help a bit on some setups (e.g. an SSD), but it won't make a dramatic difference.
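    As a minimal sketch of that approach (the root path /data and the 200 GiB threshold are just placeholders for your own values):

        import os

        def find_large_files(root, min_bytes=200 * 1024**3):
            """Yield (path, size) for files under root larger than min_bytes."""
            for dirpath, _dirnames, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        # Don't follow symlinks, so each real file is counted once.
                        size = os.stat(path, follow_symlinks=False).st_size
                    except OSError:
                        continue  # file vanished or permission denied; skip it
                    if size > min_bytes:
                        yield path, size

        for path, size in find_large_files("/data"):
            print(f"{size / 1024**3:.1f} GiB  {path}")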

    A simple way to make things feel faster is to run that scan automatically in the background every hour or so, and have your actual script just pick up the cached results. This won't help if the results need to be current, but it works well for many monitoring setups.
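    One possible wiring of that pattern, reusing find_large_files from the sketch above and assuming the background job is scheduled externally (e.g. via cron); the cache path /tmp/large_files.json is hypothetical:

        import json
        import time

        CACHE = "/tmp/large_files.json"  # hypothetical location for the scan output

        def scan_and_cache(root, min_bytes=200 * 1024**3):
            """Run the slow walk once (e.g. from an hourly cron job) and persist the hits."""
            hits = list(find_large_files(root, min_bytes))
            with open(CACHE, "w") as f:
                json.dump({"scanned_at": time.time(), "files": hits}, f)

        def load_cached():
            """The interactive script just reads the most recent scan's output."""
            with open(CACHE) as f:
                return json.load(f)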
