Python: copying a large file is too slow

后悔当初 2020-12-09 03:53

I am trying to copy a large file (> 1 GB) from the hard disk to a USB drive using shutil.copy. A simple script depicting what I am trying to do is:

    import shutil

    src = "C:/path/to/large_file"   # file on the hard disk (> 1 GB)
    dst = "E:/large_file"           # destination on the USB drive
    shutil.copy(src, dst)
2 Answers
  •  失恋的感觉 2020-12-09 04:23

    Just to add some interesting information: Windows does not like the tiny buffer used internally by the shutil implementation.

    I quickly tried the following:

    • Copied the original shutil.py file to the example script's folder and renamed it to myshutil.py
    • Changed the first line of the script to import myshutil
    • Edited the myshutil.py file and changed copyfileobj from

    def copyfileobj(fsrc, fdst, length=16*1024):

    to

    def copyfileobj(fsrc, fdst, length=16*1024*1024):

    Using a 16 MB buffer instead of 16 KB caused a huge performance improvement.

    Maybe Python needs some tuning targeting Windows internal filesystem characteristics?
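    The buffer-size effect can be measured without editing shutil at all, since copyfileobj accepts the chunk size as its third argument. A minimal timing sketch (timed_copy and the file names are hypothetical, not from the answer above):

```python
import shutil
import time

def timed_copy(src_path, dst_path, length):
    """Copy src to dst in chunks of `length` bytes; return elapsed seconds."""
    start = time.perf_counter()
    with open(src_path, "rb") as fsrc, open(dst_path, "wb") as fdst:
        shutil.copyfileobj(fsrc, fdst, length)
    return time.perf_counter() - start

# Compare a 16 KB buffer against a 16 MB buffer on the same large file:
# timed_copy("big.bin", "copy_small_buf.bin", 16 * 1024)
# timed_copy("big.bin", "copy_large_buf.bin", 16 * 1024 * 1024)
```

    Running both calls against the same file on the USB drive should reproduce the difference described above.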

    Edit:

    Came to a better solution here. At the start of your file, add the following:

    import shutil
    
    def _copyfileobj_patched(fsrc, fdst, length=16*1024*1024):
        """Patches shutil method to hugely improve copy speed"""
        while True:
            buf = fsrc.read(length)
            if not buf:
                break
            fdst.write(buf)
    shutil.copyfileobj = _copyfileobj_patched
    

    This is a simple patch to the current implementation and worked flawlessly here.

    Python 3.8+: Python 3.8 includes a major overhaul of shutil, including increasing the Windows buffer from 16 KB to 1 MB (still less than the 16 MB suggested here). See the ticket and diff.
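    On Python 3.8+ there is also a lighter-weight knob: the buffer size is exposed as the module-level constant shutil.COPY_BUFSIZE, which the buffered copy path uses when no explicit length is passed. Note that platform zero-copy fast paths (e.g. os.sendfile on Linux) bypass this buffer entirely, so the setting only affects the fallback chunked copy:

```python
import shutil

# Python 3.8+ exposes the default copy buffer size as a module constant
# (1 MB on Windows, 64 KB elsewhere); raising it widens the chunks used
# by the buffered copy path without monkey-patching any function.
shutil.COPY_BUFSIZE = 16 * 1024 * 1024
```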
