cv::imshow over SSH x11 too slow. alternatives?

Posted by 纵然是瞬间 on 2021-02-15 07:07:53

Question


I have an application processing live frames from a camera that then spits out the result frame with imshow. When running on the local machine it all runs smoothly.

However, I need to call this application over SSH and still see the video somehow (it doesn't need to be over SSH). Currently the application works over X11 forwarding, but the frame rates are abysmal and there's a huge delay.

Is there a better way to do this that would minimize lag and latency?


Answer 1:


I tried several methods of improving the performance of OpenCV's imshow() over an ssh connection. I am using a Mac as my local desktop and a Raspberry Pi 4 as the remote machine that I ssh into and that generates the images for display on my Mac.

Disabling compression on the ssh connection

I tried using:

ssh -o 'Compression no' pi4

and that resulted in around a 6-8% throughput improvement, which is hardly enough.

Using a lighter-weight encryption cipher

I read that arcfour uses fewer resources and allows you to achieve higher throughput. Unfortunately, I tried that and it is not available on Raspbian, nor are there any other lightweight ciphers, it seems. I used this command to check the available ciphers on the Raspberry Pi:

ssh -Q cipher
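
If you do want to experiment, you can pick one of the ciphers that command does list explicitly. For example (aes128-ctr is just one of the standard OpenSSH ciphers; whether it actually helps on a Raspberry Pi is something you would need to measure):

ssh -c aes128-ctr pi4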

Run X11 display outside of the ssh tunnel

As I failed to find a way to improve the ssh throughput, I decided to send the X11 traffic unencrypted rather than through the ssh tunnel. That meant running the following on the Mac, to make the X server (XQuartz) listen for direct TCP connections and to allow remote hosts to use it:

defaults write org.macosforge.xquartz.X11 nolisten_tcp -bool false
xhost +

Then I ssh'd into the Raspberry Pi without using ssh -X or ssh -Y and ran:

export DISPLAY=<MAC_IP_ADDRESS>:0
./opencvScript

That resulted in somewhat better performance, but it was still not ground-breaking.


I then decided to use Redis, which is a very high-performance, in-memory data-structure server. It is simple to install on a Mac or any other machine, and it allows you to share atomic integers, strings, lists, hashes, queues, sets and ordered sets between any number of clients across a network. So I simply JPEG-encode each frame of video on the Raspberry Pi, send it to Redis, then pick the frames up on my Mac as fast as I can and display them. It works silky-smooth and does 256 frames of 640x480 colour video in around 3 seconds, so around 80 fps.

So here is the DisplayServer I run on my Mac. It just grabs the latest image as fast as it can and displays it:

#!/usr/bin/env python3

import cv2
import ImageTransferService

if __name__ == "__main__":

    # Connect to the Redis server running locally on the Mac
    host = '0.0.0.0'
    src = ImageTransferService.ImageTransferService(host)

    # Check Redis is running
    print(src.ping())

    while True:
        im = src.receiveImage()
        # No frame published yet - keep polling
        if im is None:
            continue
        cv2.imshow('Image', im)
        cv2.waitKey(1)

Here is the code I run on the Raspberry Pi that sends images:

#!/usr/local/bin/python3

import numpy as np
import ImageTransferService

if __name__ == "__main__":

    # IP address of the Mac that runs Redis and the DisplayServer
    host = '192.168.0.8'
    RemoteDisplay = ImageTransferService.ImageTransferService(host)

    # Check remote display is up
    print(RemoteDisplay.ping())

    # Create BGR image
    w, h = 640, 480
    im = np.zeros((h, w, 3), dtype=np.uint8)

    # Ramp the blue channel from 0 to 255, sending each frame
    for c in range(256):
        im[:, :, 0] = c
        RemoteDisplay.sendImage(im)

And here is the glue code that hides Redis and provides a way to send, receive and buffer images:

#!/usr/bin/env python3

import redis
import cv2
import numpy as np

class ImageTransferService:
    def __init__(self, host='localhost', port=6379):
        self.port = port
        self.host = host
        self.conn = redis.Redis(host, port)
        self.frameNum = 0

    def ping(self):
        # Check the Redis server is up and reachable
        return self.conn.ping()

    def sendImage(self, im, name='latest', Q=75):
        # JPEG-encode the frame and store it, with its frame number, in a Redis hash
        _, JPEG = cv2.imencode(".jpg", im, [int(cv2.IMWRITE_JPEG_QUALITY), Q])
        myDict = { 'frameNum': self.frameNum, 'Data': JPEG.tobytes() }
        self.conn.hset(name, mapping=myDict)   # hmset() is deprecated in redis-py
        self.frameNum += 1

    def receiveImage(self, name='latest'):
        # Fetch the latest JPEG from Redis and decode it back to a BGR image
        myDict = self.conn.hgetall(name)
        Data = myDict.get(b'Data')
        if Data is None:
            # No frame has been published yet
            return None
        im = cv2.imdecode(np.frombuffer(Data, dtype=np.uint8), cv2.IMREAD_COLOR)
        return im
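
One practical note, depending on how Redis is installed on your machine: out of the box, Redis only accepts connections from localhost (protected mode), so for the Raspberry Pi to reach the Redis server on the Mac you will likely need to start it listening on the network, e.g.:

redis-server --bind 0.0.0.0 --protected-mode no

Only do that on a trusted LAN, since it disables Redis's connection protection.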

As Redis has many bindings, you can read the frames out of Redis, or write data into it, from Python, C/C++, PHP or even the command line. So, for example, the line below, when run in a shell, will grab the latest frame of your video:

redis-cli hgetall latest
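
Likewise, since each frame is stored alongside its frame number (see sendImage() above), you can watch the counter advance from the shell:

redis-cli hget latest frameNum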

Here is a little video of the code in action. It simply ramps the Blue channel from 0 through 255 to produce 256 frames. As you can see, they go from the Raspberry Pi (in the lower Terminal window) to the Mac (in the upper Terminal window running the DisplayServer code) in around 3 seconds.


I also did a version that uses plain sockets to transfer the JPEG-encoded images. It was nearly as fast, but somewhat uglier, less flexible, unbuffered, and not as easy to inspect or debug from outside as Redis is; a rough sketch of the idea is below.
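
For completeness, here is a minimal sketch of what such a socket transfer might look like - not the original socket code, just an illustration, assuming a simple protocol of a 4-byte big-endian length followed by the JPEG bytes; the address and port are placeholders:

#!/usr/bin/env python3
# Receiver - run this on the Mac

import socket
import struct
import numpy as np
import cv2

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(('0.0.0.0', 5005))        # port is an arbitrary choice
sock.listen(1)
conn, _ = sock.accept()

def recvall(n):
    # Read exactly n bytes from the connection
    buf = b''
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError('socket closed')
        buf += chunk
    return buf

while True:
    # Each frame arrives as a 4-byte big-endian length, then the JPEG data
    (length,) = struct.unpack('>I', recvall(4))
    jpeg = np.frombuffer(recvall(length), dtype=np.uint8)
    cv2.imshow('Image', cv2.imdecode(jpeg, cv2.IMREAD_COLOR))
    cv2.waitKey(1)

And the matching sender for the Raspberry Pi:

#!/usr/bin/env python3
# Sender - run this on the Raspberry Pi

import socket
import struct
import numpy as np
import cv2

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(('192.168.0.8', 5005))   # Mac's IP address - placeholder

w, h = 640, 480
im = np.zeros((h, w, 3), dtype=np.uint8)

for c in range(256):
    im[:, :, 0] = c
    # JPEG-encode the frame, then send its length followed by its bytes
    _, jpeg = cv2.imencode('.jpg', im, [int(cv2.IMWRITE_JPEG_QUALITY), 75])
    data = jpeg.tobytes()
    sock.sendall(struct.pack('>I', len(data)) + data)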

Keywords: Raspberry Pi, OpenCV, display, remote display, X11, tunnel, ssh, DISPLAY, Redis, buffer, image, image processing, performance, XQuartz, listen, incoming, connections.



Source: https://stackoverflow.com/questions/57876639/cvimshow-over-ssh-x11-too-slow-alternatives
