I am trying to read a file from a server over SSH from Python, using Paramiko to connect. I can connect to the server and run a command like cat filename.
Here's an extension to @Matt Good's answer, using fabric:
from fabric.connection import Connection

with Connection(host, user) as c, c.sftp() as sftp, \
        sftp.open('remote_filename') as file:
    for line in file:
        process(line)
Old Fabric 1 answer:
from contextlib import closing
from fabric.network import connect

with closing(connect(user, host, port)) as ssh, \
        closing(ssh.open_sftp()) as sftp, \
        closing(sftp.open('remote_filename')) as file:
    for line in file:
        process(line)
It looks like back in Sept 2013 Paramiko added native context-manager support to these objects, so if you want to combine Matt's clean answer with jfs's context managers, all you need now is:
with ssh_client.open_sftp() as sftp_client:
    with sftp_client.open('remote_filename') as remote_file:
        for line in remote_file:
            process(line)  # process line
#!/usr/bin/env python
import paramiko
import select

client = paramiko.SSHClient()
client.load_system_host_keys()
client.connect('yourhost.com')
transport = client.get_transport()
channel = transport.open_session()
channel.exec_command("cat /path/to/your/file")
while True:
    rl, wl, xl = select.select([channel], [], [], 0.0)
    if channel in rl:
        # Must be stdout; recv_stderr() would read the stderr stream
        data = channel.recv(1024)
        if not data:
            break  # EOF: the remote command has finished
        print(data)
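The polling pattern above is not SSH-specific: Paramiko channels can be passed to select.select just like sockets. A minimal local sketch of the same read-until-EOF loop, using a socket.socketpair() as a stand-in for the Paramiko channel (no server needed):

```python
import select
import socket

# The socketpair stands in for the paramiko channel, which is also selectable.
left, right = socket.socketpair()
left.sendall(b"hello from the far end\n")
left.close()  # closing signals EOF, like the remote command exiting

received = b""
while True:
    rl, _, _ = select.select([right], [], [], 1.0)
    if right in rl:
        data = right.recv(1024)
        if not data:
            break  # empty read means the peer closed the stream
        received += data

right.close()
print(received)
```

The same empty-read check is what terminates the loop cleanly with a real channel.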
What do you mean by "line by line"? There are many data buffers between network hosts, and none of them is line-oriented.
So you can read a bunch of data and then split it into lines at the near end:
ssh otherhost cat somefile | python process_standard_input.py | do_process_locally
Or you can have a process at the far end read the data, break it into lines, and send them to you line by line:
scp process_standard_input.py otherhost
ssh otherhost python process_standard_input.py somefile | do_process_locally
The only difference I would care about is which approach reduces the volume of data sent over a limited network pipe. In your situation it may or may not matter.
There is nothing wrong in general with using cat
over an SSH pipe to move gigabytes of data.
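To make the near-end splitting concrete, here is a small sketch (the helper name lines_from_chunks is mine, not from any library) that reassembles complete lines from arbitrarily sized chunks such as those a recv() loop produces:

```python
def lines_from_chunks(chunks):
    """Reassemble complete lines from arbitrary-sized data chunks."""
    buf = ""
    for chunk in chunks:
        buf += chunk
        while "\n" in buf:
            line, buf = buf.split("\n", 1)
            yield line
    if buf:
        yield buf  # trailing data without a final newline

# Chunk boundaries, as they might arrive from a socket, ignore line breaks.
chunks = ["alpha\nbe", "ta\ngam", "ma\n"]
assert list(lines_from_chunks(chunks)) == ["alpha", "beta", "gamma"]
```

Feeding it the chunks read from the SSH channel yields the file line by line regardless of how the network fragmented the stream.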
Paramiko's SFTPClient class allows you to get a file-like object to read data from a remote file in a Pythonic way.
Assuming you have an open SSHClient:
sftp_client = ssh_client.open_sftp()
remote_file = sftp_client.open('remote_filename')
try:
    for line in remote_file:
        process(line)  # process line
finally:
    remote_file.close()
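Because the object returned by sftp_client.open() is file-like, the same loop works on any Python file object. A runnable local sketch of the pattern, using io.StringIO as a stand-in for the remote file (the real object comes from Paramiko):

```python
import io

# io.StringIO stands in for the file-like object sftp_client.open() returns.
remote_file = io.StringIO("first line\nsecond line\n")

lines = []
try:
    for line in remote_file:
        lines.append(line.rstrip("\n"))  # process line
finally:
    remote_file.close()

assert lines == ["first line", "second line"]
```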