Read a file from server with SSH using Python

Submitted by 断了今生、忘了曾经 on 2019-11-27 00:31:24
Matt Good

Paramiko's SFTPClient class allows you to get a file-like object to read data from a remote file in a Pythonic way.

Assuming you have an open SSHClient:

sftp_client = ssh_client.open_sftp()
remote_file = sftp_client.open('remote_filename')
try:
    for line in remote_file:
        process(line)  # replace with your own per-line handling
finally:
    remote_file.close()
jfs

Here's an extension to @Matt Good's answer:

from contextlib import closing
from fabric.network import connect

with closing(connect(user, host, port)) as ssh, \
     closing(ssh.open_sftp()) as sftp, \
     closing(sftp.open('remote_filename')) as file:
    for line in file:
        process(line)
#!/usr/bin/env python
import select

import paramiko

client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('yourhost.com')
transport = client.get_transport()
channel = transport.open_session()
channel.exec_command("cat /path/to/your/file")
while True:
    rl, wl, xl = select.select([channel], [], [], 0.0)
    if rl:
        # Must be stdout
        data = channel.recv(1024)
        if not data:  # remote end closed the stream
            break
        print(data.decode(), end='')

What do you mean by "line by line"? There are lots of data buffers between network hosts, and none of them are line-oriented.

So you can read a bunch of data, then split it into lines at the near end.

ssh otherhost cat somefile | python process_standard_input.py | do_process_locally
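At the near end, "splitting it into lines" means buffering partial chunks until a newline arrives, since network reads can cut lines anywhere. A small sketch of that reassembly step (the chunk boundaries below are arbitrary examples):

```python
def chunks_to_lines(chunks):
    """Reassemble complete lines from arbitrarily sized data chunks."""
    buffer = ''
    for chunk in chunks:
        buffer += chunk
        while '\n' in buffer:
            line, buffer = buffer.split('\n', 1)
            yield line
    if buffer:  # trailing data without a final newline
        yield buffer

lines = list(chunks_to_lines(['ab\ncd', 'e\nf', 'g\n']))
# lines == ['ab', 'cde', 'fg']
```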

Or you can have a process read a bunch of data at the far end, break it up, and format it line by line and send it to you.

scp process_standard_input.py otherhost
ssh otherhost python process_standard_input.py somefile | do_process_locally
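The process_standard_input.py above is not shown in the answer; one plausible sketch of it, which reads the file named on the command line (or stdin if none) and emits one cleaned-up line at a time for the local side to consume:

```python
import fileinput

def emit_lines(stream):
    """Strip trailing newlines and pass each line downstream via stdout."""
    for line in stream:
        print(line.rstrip('\n'))

if __name__ == '__main__':
    # fileinput reads the files named on the command line, or stdin
    emit_lines(fileinput.input())
```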

The only difference I would care about is which way reduces the volume of data sent over a limited network pipe. In your situation that may or may not matter.

There is nothing wrong in general with using cat over an SSH pipe to move gigabytes of data.
