Paramiko channel gets stuck when reading large output

星月不相逢 2020-12-09 05:41

I have code where I am executing a command on a remote Linux machine and reading the output using Paramiko. The code looks like this:

ssh = paramiko.SSHClient()

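The rest of the snippet is cut off; a hypothetical reconstruction of the pattern in question (host, credentials, and cmd are placeholder names, not from the original post) might look like:

import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('host', username='user', password='password')

cmd = 'command-with-large-output'  # placeholder
stdin, stdout, stderr = ssh.exec_command(cmd)

# Reading stderr before stdout is what hangs once stdout grows large,
# as the accepted answer below explains.
err = stderr.readlines()
out = stdout.readlines()
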
6 Answers
  •  南方客 (OP) · 2020-12-09 06:38

    TL;DR: Call stdout.readlines() before stderr.readlines() if using ssh.exec_command()

    If you use @Spencer Rathbun's answer:

    import paramiko

    ssh = paramiko.SSHClient()
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect(IPAddress, username=user['username'], password=user['password'])

    stdin, stdout, stderr = ssh.exec_command(cmd)
    

    You might want to be aware of the limitations that can arise from having large outputs.

    Experimentally, stdin, stdout, stderr = ssh.exec_command(cmd) cannot deliver the full output immediately through stdout and stderr. More specifically, each stream's buffer appears to hold 2^21 (2,097,152) characters before filling up. If either buffer fills, the writing side blocks, and it stays blocked until that buffer is drained enough to continue. This means that if your stdout is too large, a read on stderr will hang, because you won't receive EOF on either stream until the command can write out its full output.
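    One way to see the limit for yourself is a demo along these lines (hypothetical; it reuses the ssh client from above and assumes python3 exists on the remote host): run a command that writes more than 2^21 bytes to stdout, then read stderr first. The read never returns:

    # Emit ~3 MB on stdout, comfortably past the ~2**21-byte buffer.
    cmd = "python3 -c 'import sys; sys.stdout.write(\"x\" * 3000000)'"
    stdin, stdout, stderr = ssh.exec_command(cmd)

    err = stderr.readlines()   # blocks: stdout's buffer fills, so no EOF arrives
    out = stdout.readlines()   # never reached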

    The easy way around this is the one Spencer uses: get all of the normal output via stdout.readlines() before trying to read stderr, as sketched below. This only fails if stderr itself exceeds 2^21 characters, which is an acceptable limitation in my use case.
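    A minimal sketch of that ordering, using the same connection and cmd as above:

    stdin, stdout, stderr = ssh.exec_command(cmd)

    # Drain the large stream first so its buffer can never stall the
    # remote command, then read stderr.
    out = stdout.readlines()
    err = stderr.readlines()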

    I'm mainly posting this because I'm dumb and spent far, far too long trying to figure out how I broke my code, when the answer was that I was reading from stderr before stdout and my stdout was too big to fit in the buffer.
