Paramiko fails to download large files >1GB

一向 · 2020-12-13 10:44
def download():
    # dst_dir_path and logger are defined elsewhere in the original script
    if not os.path.exists(dst_dir_path):
        logger.error("Cannot access destination folder %s. Please check path and permissions." % dst_dir_path)
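
The posted snippet stops at the destination check. For context, here is a minimal sketch of the kind of Paramiko SFTP transfer that triggers the failure; the host, credentials, and paths are hypothetical placeholders, and sftp.get() is the call that stalls on large files:

    import os
    import paramiko

    def download_file(host, username, password, remote_path, dst_dir_path):
        # all arguments are hypothetical placeholders, not values from the post
        transport = paramiko.Transport((host, 22))
        transport.connect(username=username, password=password)
        sftp = paramiko.SFTPClient.from_transport(transport)
        try:
            local_path = os.path.join(dst_dir_path, os.path.basename(remote_path))
            # get() streams the remote file to disk; on files of this size,
            # this is where the transfer stalls
            sftp.get(remote_path, local_path)
        finally:
            sftp.close()
            transport.close()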


        
7 Answers

南方客 · 2020-12-13 11:23

    I had a very similar problem; in my case the file was only ~400 MB, but the download would consistently fail after about 35 MB or so. It didn't always fail at exactly the same number of bytes, but somewhere around 35-40 MB the transfer would stop, and a minute or so later I would get the "There are insufficient resources to complete the request" error.

    Downloading the file via WinSCP or PSFTP worked fine.

    I tried Screwtape's method, and it did work, but it was painfully slow: my 400 MB file was on pace to take something like 4 hours to download, which was an unacceptable timeframe for this particular application.
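
    Screwtape's method isn't quoted in this thread, but the usual shape of that workaround is to skip get() and pull the remote file down in small manual chunks; this is a sketch under that assumption (sftp, remote_path, and local_path are assumed to be set up as in the question):

      remote_file = sftp.open(remote_path, "rb")
      try:
          with open(local_path, "wb") as local_file:
              while True:
                  chunk = remote_file.read(1024)  # tiny reads: reliable but slow
                  if not chunk:  # EOF
                      break
                  local_file.write(chunk)
      finally:
          remote_file.close()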

    Also, when we first set this up, everything worked fine, but then the server administrator made some changes to the SFTP server, and that's when things broke. I'm not sure what the changes were, but since downloads still worked via WinSCP and other SFTP clients, I didn't think attacking this from the server side would be fruitful.

    I'm not going to pretend to understand why, but here's what ended up working for me:

    1. I downloaded and installed the current version of Paramiko (1.11.1 at this time). Initially this didn't make any difference at all but I figured I'd mention it just in case it was part of the solution.

    2. The stack trace for the exception was:

      File "C:\Python26\lib\site-packages\paramiko\sftp_client.py", line 676, in get
          size = self.getfo(remotepath, fl, callback)
      File "C:\Python26\lib\site-packages\paramiko\sftp_client.py", line 645, in getfo
          data = fr.read(32768)
      File "C:\Python26\lib\site-packages\paramiko\file.py", line 153, in read
          new_data = self._read(read_size)
      File "C:\Python26\lib\site-packages\paramiko\sftp_file.py", line 157, in _read
          data = self._read_prefetch(size)
      File "C:\Python26\lib\site-packages\paramiko\sftp_file.py", line 138, in _read_prefetch
          self._check_exception()
      File "C:\Python26\lib\site-packages\paramiko\sftp_file.py", line 483, in _check_exception
          raise x
      
    3. Poking around a bit in sftp_file.py, I noticed this (lines 43-45 in the current version):

      # Some sftp servers will choke if you send read/write requests larger than
      # this size.
      MAX_REQUEST_SIZE = 32768
      
    4. On a whim, I tried changing MAX_REQUEST_SIZE to 1024 and, lo and behold, I was able to download the whole file!

    5. After I got it to work by changing MAX_REQUEST_SIZE to 1024, I tried a bunch of other values between 1024 and 32768 to see if it affected performance. But I always got the error sooner or later when the value was significantly bigger than 1024 (1025 was OK, but 1048 eventually failed).
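
    If you'd rather not edit the installed library's source, MAX_REQUEST_SIZE is a class attribute on SFTPFile (at least in the versions discussed here), so the same change can be made at runtime; a minimal sketch, not part of the original answer:

      import paramiko.sftp_file

      # Make every subsequent SFTPFile issue read/write requests of at most
      # 1024 bytes, without patching the installed paramiko sources. Set this
      # before starting the transfer.
      paramiko.sftp_file.SFTPFile.MAX_REQUEST_SIZE = 1024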
