Read Frames from RTSP Stream in Python

旧巷少年郎 2020-12-13 16:12

I have recently set up a Raspberry Pi camera and am streaming the frames over RTSP. While it may not be completely necessary, here is the command I am using to broadcast the …

6 Answers
  •  鱼传尺愫   2020-12-13 16:29

    Here is yet another option.

    It's much more complicated than the other answers. :-O

    But this way, with just one connection to the camera, you can "fork" the same stream simultaneously to several processes, show it on screen, recast it into multicast, write it to disk, etc. (a hedged sketch of such a fork is included after the two code listings below).

    ... of course, only in case you need something like that (otherwise you'd prefer the earlier answers).

    Let's create two independent Python programs:

    (1) Server program (RTSP connection, decoding): server.py

    (2) Client program (reads frames from shared memory): client.py

    The server must be started before the client, i.e.

    python3 server.py
    

    And then in another terminal:

    python3 client.py
    

    Here is the code:

    (1) server.py

    import time
    from valkka.core import *

    # YUV => RGB interpolation to the small size is done every 1000 milliseconds
    # and the result is passed on to the shmem ring buffer
    image_interval = 1000
    # define RGB image dimensions
    width  = 1920 // 4
    height = 1080 // 4
    # posix shared memory: identification tag and size of the ring buffer
    shmem_name    = "cam_example"
    shmem_buffers = 10

    shmem_filter    = RGBShmemFrameFilter(shmem_name, shmem_buffers, width, height)
    sws_filter      = SwScaleFrameFilter("sws_filter", width, height, shmem_filter)
    interval_filter = TimeIntervalFrameFilter("interval_filter", image_interval, sws_filter)

    avthread     = AVThread("avthread", interval_filter)
    av_in_filter = avthread.getFrameFilter()
    livethread   = LiveThread("livethread")

    ctx = LiveConnectionContext(LiveConnectionType_rtsp,
                                "rtsp://user:password@192.168.x.x", 1, av_in_filter)

    avthread.startCall()
    livethread.startCall()

    avthread.decodingOnCall()
    livethread.registerStreamCall(ctx)
    livethread.playStreamCall(ctx)

    # all these threads are written in C++ and run in the background.
    # Sleep for 20 seconds - or do something else while the C++ threads
    # are running and streaming video
    time.sleep(20)

    # stop the threads
    livethread.stopCall()
    avthread.stopCall()

    print("bye")
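
    A note on how the server program is wired: each filter is constructed with its downstream target as an argument, so the chain is built from the end backwards, while the decoded frames flow in the opposite direction:

    LiveThread (rtsp) --> av_in_filter --> AVThread (decoding)
      --> interval_filter (passes one frame per image_interval milliseconds)
      --> sws_filter (YUV => RGB interpolation + downscaling)
      --> shmem_filter (writes the RGB frame into the posix shmem ring buffer)
      ==> read by client.py from shared memory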
    

    (2) client.py

    import cv2
    from valkka.api2 import ShmemRGBClient

    width  = 1920 // 4
    height = 1080 // 4

    # this identifies the posix shared memory - must be the same as on the server side
    shmem_name = "cam_example"
    # size of the shmem ring buffer - must be the same as on the server side
    shmem_buffers = 10

    client = ShmemRGBClient(
        name=shmem_name,
        n_ringbuffer=shmem_buffers,
        width=width,
        height=height,
        mstimeout=1000,   # client times out if nothing has been received in 1000 milliseconds
        verbose=False
    )

    while True:
        index, isize = client.pull()
        if index is None:
            print("timeout")
        else:
            data = client.shmem_list[index][0:isize]
            img = data.reshape((height, width, 3))
            img = cv2.GaussianBlur(img, (21, 21), 0)
            cv2.imshow("valkka_opencv_demo", img)
            cv2.waitKey(1)
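
    One caveat about the display step: the frames come from an RGBShmemFrameFilter, so the channel order is presumably RGB, while cv2.imshow interprets a 3-channel array as BGR. If the colors look swapped on screen, convert before showing - a hedged variation of the last lines of the loop body above:

    # assuming the shmem frames are RGB: convert to BGR for OpenCV display
    img = cv2.cvtColor(img, cv2.COLOR_RGB2BGR)
    cv2.imshow("valkka_opencv_demo", img)
    cv2.waitKey(1)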
    

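    As for the "forking" mentioned at the top: below is a minimal sketch of how the single decoded stream could be split into two shared-memory branches, so that two separate client processes each read from their own ring buffer. It assumes valkka.core provides a ForkFrameFilter taking a name plus two downstream filters (please check the libValkka docs for the exact constructor); everything else reuses the classes from server.py, and the LiveThread / LiveConnectionContext / start-stop calls stay the same.

    from valkka.core import *

    width, height = 1920 // 4, 1080 // 4

    # two independent posix shmem ring buffers, one per client process
    shmem_filter_1 = RGBShmemFrameFilter("cam_example_1", 10, width, height)
    shmem_filter_2 = RGBShmemFrameFilter("cam_example_2", 10, width, height)

    sws_filter_1 = SwScaleFrameFilter("sws_filter_1", width, height, shmem_filter_1)
    sws_filter_2 = SwScaleFrameFilter("sws_filter_2", width, height, shmem_filter_2)

    interval_filter_1 = TimeIntervalFrameFilter("interval_filter_1", 1000, sws_filter_1)
    interval_filter_2 = TimeIntervalFrameFilter("interval_filter_2", 1000, sws_filter_2)

    # replicate the decoded frame flow into both branches (assumed ForkFrameFilter API)
    fork_filter = ForkFrameFilter("fork_filter", interval_filter_1, interval_filter_2)

    # AVThread now feeds the fork instead of a single branch
    avthread     = AVThread("avthread", fork_filter)
    av_in_filter = avthread.getFrameFilter()

    Each client process would then open its own ShmemRGBClient with the matching shmem name ("cam_example_1" or "cam_example_2").
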
    If you're interested, check out more examples at https://elsampsa.github.io/valkka-examples/
