python-gstreamer

Write image frames into gstreamer rtp pipeline

生来就可爱ヽ(ⅴ<●) Submitted on 2021-01-29 17:38:24
Question: I am trying to use a GStreamer pipeline to view an RTP stream in VLC on my computer. I mostly looked into this thread. My end result is something like this:

#!/usr/bin/env python
import gi
import numpy as np
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GObject
import time

class RtpPipeline(object):
    def __init__(self):
        self.number_frames = 0
        self.fps = 30
        self.duration = 1 / self.fps * Gst.SECOND  # duration of a frame in nanoseconds
        self.launch_string = 'appsrc name=source
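
For reference, a minimal sketch of the appsrc approach, assuming a plain RTP-over-UDP output; the pipeline string, caps, and synthetic frames below are illustrative assumptions, not the asker's actual code:

```python
#!/usr/bin/env python
# Hedged sketch: push numpy frames into an appsrc-fed RTP pipeline.
import time
import gi
import numpy as np
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

WIDTH, HEIGHT, FPS = 640, 480, 30
pipeline = Gst.parse_launch(
    'appsrc name=source is-live=true format=time '
    'caps=video/x-raw,format=RGB,width=640,height=480,framerate=30/1 '
    '! videoconvert ! x264enc tune=zerolatency ! rtph264pay '
    '! udpsink host=127.0.0.1 port=5000')
appsrc = pipeline.get_by_name('source')
pipeline.set_state(Gst.State.PLAYING)

duration = Gst.SECOND // FPS
for n in range(300):  # ~10 seconds of synthetic frames
    frame = np.random.randint(0, 255, (HEIGHT, WIDTH, 3), dtype=np.uint8)
    buf = Gst.Buffer.new_wrapped(frame.tobytes())
    buf.pts = n * duration        # timestamps keep playback smooth in VLC
    buf.duration = duration
    appsrc.emit('push-buffer', buf)
    time.sleep(1.0 / FPS)

appsrc.emit('end-of-stream')
pipeline.set_state(Gst.State.NULL)
```

Note that VLC generally needs an SDP description (or an rtsp:// URL) to play a bare RTP/UDP stream like this.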

Custom Gstreamer plugin - To support multiple (2) source pad

人走茶凉 Submitted on 2021-01-29 17:32:15
Question: Hope all are doing great! I wanted to know your valuable inputs and suggestions on the GStreamer pipeline requirement mentioned below. My requirement is to design a custom plugin that can take two different input frames and perform a custom operation on them. One frame will come from the scaler element and the second frame from the decoder element. Is it feasible to design a custom GStreamer plugin that supports multiple source pads? If yes, can anyone guide me on how it is
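
On the question itself: the two inputs would arrive on sink pads (data flowing in), while source pads carry data out, and an element can declare both. Below is a hedged skeleton of a Python element with two sink pads and one source pad; the pad names, the pass-through chain function, and the reliance on the gst-python overrides for __gstmetadata__/__gsttemplates__ are assumptions, and a real two-input element would more likely extend GstAggregator.

```python
# Hedged sketch: skeleton element declaring two sink pads and one src pad.
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GObject

Gst.init(None)

class TwoInputElement(Gst.Element):
    __gstmetadata__ = ('TwoInputElement', 'Filter',
                       'Accepts two input streams', 'example')
    __gsttemplates__ = (
        Gst.PadTemplate.new('sink_scaled', Gst.PadDirection.SINK,
                            Gst.PadPresence.ALWAYS, Gst.Caps.new_any()),
        Gst.PadTemplate.new('sink_decoded', Gst.PadDirection.SINK,
                            Gst.PadPresence.ALWAYS, Gst.Caps.new_any()),
        Gst.PadTemplate.new('src', Gst.PadDirection.SRC,
                            Gst.PadPresence.ALWAYS, Gst.Caps.new_any()),
    )

    def __init__(self):
        super().__init__()
        self.sink_scaled = Gst.Pad.new_from_template(
            self.get_pad_template('sink_scaled'), 'sink_scaled')
        self.sink_decoded = Gst.Pad.new_from_template(
            self.get_pad_template('sink_decoded'), 'sink_decoded')
        self.srcpad = Gst.Pad.new_from_template(
            self.get_pad_template('src'), 'src')
        self.sink_scaled.set_chain_function_full(self.chain, None)
        self.sink_decoded.set_chain_function_full(self.chain, None)
        for pad in (self.sink_scaled, self.sink_decoded, self.srcpad):
            self.add_pad(pad)

    def chain(self, pad, parent, buf):
        # A real element would queue buffers from both sinks and combine them;
        # here each incoming buffer is simply forwarded downstream.
        return self.srcpad.push(buf)

GObject.type_register(TwoInputElement)
Gst.Element.register(None, 'twoinput', Gst.Rank.NONE, TwoInputElement)
```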

Handling errors with gst-rtsp-server Python bindings

ⅰ亾dé卋堺 Submitted on 2021-01-29 11:30:41
Question: I have a simple Python program that creates an RTSP stream using gst-rtsp-server. It works, but as-is there's no error handling. If the pipeline has a typo or there's some issue connecting to the video source, I don't see a stack trace or any logging. Where would I hook in code to handle problems like this? I should mention that I'm a complete beginner in the GObject world. I suspect there is a standard way for these libraries to report errors, but I haven't been able to find anything in the
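
Two generic hooks usually surface these problems (a hedged sketch, not a claim about gst-rtsp-server-specific error API): turning on GStreamer's own debug output, and catching the GLib.Error that Gst.parse_launch raises when the launch string has a typo.

```python
# Hedged sketch: enable GStreamer debug output and catch pipeline parse errors.
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GLib, GstRtspServer

Gst.init(None)
Gst.debug_set_active(True)
Gst.debug_set_default_threshold(Gst.DebugLevel.WARNING)  # roughly GST_DEBUG=2

class Factory(GstRtspServer.RTSPMediaFactory):
    def do_create_element(self, url):
        launch = '( videotestsrc ! x264enc ! rtph264pay name=pay0 pt=96 )'
        try:
            return Gst.parse_launch(launch)
        except GLib.Error as err:
            print('Pipeline parse error:', err.message)
            return None  # the factory then fails to create the media
```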

hls generated with gst (ts m3u8) not playing on safari (but working on chrome)

混江龙づ霸主 Submitted on 2020-12-15 05:34:42
Question: I'm trying to use GStreamer to generate an HLS video from frames within an existing pipeline. Once I get a frame as a numpy array, I use the following to create the .ts and .m3u8 files:

"appsrc emit-signals=True do-timestamp=true is-live=True caps={DEFAULT_CAPS}".format(**locals()) ! "queue" ! "videoconvert" ! "x264enc" ! "mpegtsmux" ! f"hlssink location={playlist}.%04d.ts " ! f"playlist-location={playlist}.m3u8"])

where DEFAULT_CAPS = "video/x-raw,format={VIDEO_FORMAT},width={WIDTH},height={HEIGHT}
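
A hedged follow-up rather than a verified fix: when Chrome plays a GStreamer-generated HLS stream but Safari does not, the knobs usually pointed at are the H.264 profile, keyframe spacing aligned with the segment length, and an h264parse before mpegtsmux. A variant of the pipeline string with those assumptions:

```python
# Hedged sketch: Safari-oriented tweaks to the HLS pipeline string.
PIPELINE = (
    'appsrc name=src emit-signals=true is-live=true do-timestamp=true '
    'caps=video/x-raw,format=BGR,width=1280,height=720,framerate=30/1 '
    '! queue ! videoconvert '
    '! x264enc tune=zerolatency bitrate=2000 key-int-max=60 '   # keyframe every 2 s at 30 fps
    '! video/x-h264,profile=main '                              # constrain the H.264 profile
    '! h264parse ! mpegtsmux '
    '! hlssink location=segment.%05d.ts playlist-location=playlist.m3u8 '
    'target-duration=2 max-files=10'
)
```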

Where are Gstreamer bus log messages?

落花浮王杯 Submitted on 2020-05-16 05:53:29
Question: I am trying to stream a .mp4 to an RTSP server using GStreamer in Python:

import sys
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
gi.require_version('GstRtsp', '1.0')
from gi.repository import Gst, GstRtspServer, GObject, GLib, GstRtsp

loop = GLib.MainLoop()
Gst.init(None)

file_path = "test.mp4"

class TestRtspMediaFactory(GstRtspServer.RTSPMediaFactory):
    def __init__(self):
        GstRtspServer.RTSPMediaFactory.__init__(self)
    def do_create_element(self, url):
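
For the title question, this is the generic pattern for seeing bus messages (a hedged sketch for a standalone Gst.Pipeline; with gst-rtsp-server the media's internal pipeline owns its own bus, so this only illustrates where bus log messages normally show up):

```python
# Hedged sketch: watch a pipeline's bus for ERROR/EOS messages in a GLib loop.
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)
loop = GLib.MainLoop()
pipeline = Gst.parse_launch('videotestsrc num-buffers=100 ! autovideosink')

def on_message(bus, message):
    if message.type == Gst.MessageType.ERROR:
        err, debug = message.parse_error()
        print('ERROR:', err.message, debug)
        loop.quit()
    elif message.type == Gst.MessageType.EOS:
        print('End of stream')
        loop.quit()

bus = pipeline.get_bus()
bus.add_signal_watch()             # emits 'message' signals on the main loop
bus.connect('message', on_message)

pipeline.set_state(Gst.State.PLAYING)
loop.run()
pipeline.set_state(Gst.State.NULL)
```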

Video streaming over RTP using gstreamer

牧云@^-^@ Submitted on 2020-01-06 07:24:42
Question: I am trying to stream a video file from one device to another over RTP using GStreamer. At the sender side I am using the following command:

gst-launch filesrc location=/home/kuber/Desktop/MELT.MPG ! mpegparse ! rtpsend ip=localhost

But this gives the following error: no element "rtpsend". I downloaded all the RTP tools and still get the same error. Am I using rtpsend in some wrong way? Also, can someone give me the command-line code for streaming a video file (locally stored on my laptop and not
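
A hedged note: rtpsend is not an element that ships with current GStreamer releases; RTP sending is normally done with a payloader element plus udpsink. A sketch of that approach via Gst.parse_launch (the same pipeline string works with gst-launch-1.0; the file path and port are placeholders, and a file that also contains audio would need its own branch):

```python
# Hedged sketch: send a local video file over RTP with a payloader + udpsink.
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)
sender = Gst.parse_launch(
    'filesrc location=/home/kuber/Desktop/MELT.MPG '
    '! decodebin ! videoconvert ! x264enc tune=zerolatency '
    '! rtph264pay ! udpsink host=127.0.0.1 port=5000')
sender.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()   # the receiver needs matching caps, e.g. via an SDP file
```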

Video Transitions with GStreamer & GNonLin not working

可紊 Submitted on 2020-01-05 12:16:12
Question: I've been trying to combine two videos with GStreamer and GNonLin in Python, with a short transition (like SMPTE) between them. However, I can't get the gnloperation/SMPTE transition to work. Goal: below is a programme. I want it to play the first 4 seconds of one file, and at 2 seconds start an SMPTE transition (lasting 2 seconds) to another file, so the second file starts playing 2 seconds into the whole thing but is 'revealed' over the course of the 2 second
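
A hedged aside rather than a fix for the gnloperation code: GNonLin was eventually folded into GStreamer Editing Services (GES), where overlapping clips on a layer can be joined by an automatic transition. The sketch below expresses the same 2-second overlap with GES; the file URIs are placeholders and the default transition is a crossfade, not an SMPTE wipe.

```python
# Hedged sketch: a 2-second overlap with an automatic transition in GES.
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GES', '1.0')
from gi.repository import Gst, GES, GLib

Gst.init(None)
GES.init()

timeline = GES.Timeline.new_audio_video()
layer = timeline.append_layer()
layer.set_auto_transition(True)   # overlapping clips get a transition

asset1 = GES.UriClipAsset.request_sync('file:///path/to/first.mp4')
asset2 = GES.UriClipAsset.request_sync('file:///path/to/second.mp4')

# first clip: 0-4 s; second clip starts at 2 s, giving a 2 s overlap
layer.add_asset(asset1, 0, 0, 4 * Gst.SECOND, GES.TrackType.UNKNOWN)
layer.add_asset(asset2, 2 * Gst.SECOND, 0, 4 * Gst.SECOND, GES.TrackType.UNKNOWN)
timeline.commit()

pipeline = GES.Pipeline.new()
pipeline.set_timeline(timeline)
pipeline.set_state(Gst.State.PLAYING)
GLib.MainLoop().run()
```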

Sending eos in gstreamer after pre-defined time using new_single_shot_id

China☆狼群 Submitted on 2019-12-23 04:24:29
Question: I have a GStreamer application where I am creating a video from images. I need to create the video for a predefined time and would like to send EOS after that time. I know this can be achieved using new_single_shot_id in GstClock, but I could not find any example of how to use new_single_shot_id to create a trigger bound to a function that sends EOS to the pipeline. My simplified pipeline code is like this:

class Main(object):
    def __init__(self, location):
        self.pipeline =
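
Since the exact PyGI mapping of the GstClock single-shot API is easy to get wrong, here is a hedged sketch that uses GLib.timeout_add_seconds instead — plainly a different mechanism from new_single_shot_id, but with the same effect: a callback after a predefined time that sends EOS to the pipeline. The pipeline string is a placeholder.

```python
# Hedged sketch: send EOS after a fixed duration via a GLib timeout.
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GLib

Gst.init(None)
pipeline = Gst.parse_launch(
    'videotestsrc is-live=true ! x264enc ! mp4mux ! filesink location=out.mp4')
loop = GLib.MainLoop()

def stop_recording():
    pipeline.send_event(Gst.Event.new_eos())   # mp4mux needs EOS to finalize the file
    return False                               # run the timeout only once

GLib.timeout_add_seconds(10, stop_recording)   # predefined duration: 10 s

def on_message(bus, msg):
    if msg.type in (Gst.MessageType.EOS, Gst.MessageType.ERROR):
        pipeline.set_state(Gst.State.NULL)
        loop.quit()

bus = pipeline.get_bus()
bus.add_signal_watch()
bus.connect('message', on_message)

pipeline.set_state(Gst.State.PLAYING)
loop.run()
```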

Get the window handle in PyGI

北慕城南 Submitted on 2019-12-19 11:28:42
Question: In my program I use PyGObject/PyGI and GStreamer to show a video in my GUI. The video is shown in a Gtk.DrawingArea, and therefore I need to get its window handle in the realize signal handler. On Linux I get that handle using:

drawing_area.get_property('window').get_xid()

But how do I get the handle on Windows? I searched on the internet but found only examples for PyGtk using window.handle, which does not work with PyGI. The GStreamer documentation provides an example which uses the GDK
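
On Windows the workaround that is commonly used (hedged; not an official PyGI API) is to pull the native HWND out of the Gdk.Window with ctypes, because gdk_win32_window_get_handle is not exposed through introspection. The DLL name below depends on the GTK build.

```python
# Hedged sketch: get the Win32 HWND of a realized widget's Gdk.Window via ctypes.
import ctypes

def get_window_handle(widget):
    window = widget.get_property('window')   # Gdk.Window, valid after 'realize'
    ctypes.pythonapi.PyCapsule_GetPointer.restype = ctypes.c_void_p
    ctypes.pythonapi.PyCapsule_GetPointer.argtypes = [ctypes.py_object, ctypes.c_char_p]
    gpointer = ctypes.pythonapi.PyCapsule_GetPointer(window.__gpointer__, None)
    gdk = ctypes.CDLL('libgdk-3-0.dll')      # name varies with the GTK build
    gdk.gdk_win32_window_get_handle.restype = ctypes.c_void_p
    gdk.gdk_win32_window_get_handle.argtypes = [ctypes.c_void_p]
    return gdk.gdk_win32_window_get_handle(gpointer)
```

The returned handle is then typically passed to the video sink through the GstVideo.VideoOverlay interface's set_window_handle().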

Write opencv frames into gstreamer rtsp server pipeline

此生再无相见时 Submitted on 2019-12-17 15:46:23
Question: I'm trying to put OpenCV images into a gstreamer RTSP server in Python. I have some issues writing into the media factory; I'm new to gst-rtsp-server and there's little documentation, so I don't know exactly whether I'm using the right approach. I'm using a thread to start the MainLoop, and the main thread to create a buffer to push into the appsrc element of the media factory pipeline. Am I using the right approach to reach my objective? Can anyone help me? My code is below:

from threading
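
For reference, the pattern usually shown for this (a hedged sketch; the capture source, resolution, and frame rate are placeholders) pushes frames from the appsrc need-data callback inside the media factory rather than from the main thread:

```python
# Hedged sketch: feed OpenCV frames to an RTSP media factory through appsrc.
import cv2
import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GstRtspServer, GLib

Gst.init(None)

class OpenCVFactory(GstRtspServer.RTSPMediaFactory):
    def __init__(self):
        super().__init__()
        self.cap = cv2.VideoCapture(0)     # placeholder video source
        self.frame_count = 0
        self.duration = Gst.SECOND // 30   # 30 fps
        self.launch_string = (
            'appsrc name=source is-live=true format=time '
            'caps=video/x-raw,format=BGR,width=640,height=480,framerate=30/1 '
            '! videoconvert ! x264enc tune=zerolatency '
            '! rtph264pay name=pay0 pt=96')

    def do_create_element(self, url):
        return Gst.parse_launch(self.launch_string)

    def do_configure(self, rtsp_media):
        appsrc = rtsp_media.get_element().get_child_by_name('source')
        appsrc.connect('need-data', self.on_need_data)

    def on_need_data(self, src, length):
        ret, frame = self.cap.read()
        if not ret:
            return
        frame = cv2.resize(frame, (640, 480))
        buf = Gst.Buffer.new_wrapped(frame.tobytes())
        buf.pts = self.frame_count * self.duration
        buf.duration = self.duration
        self.frame_count += 1
        src.emit('push-buffer', buf)

server = GstRtspServer.RTSPServer()
factory = OpenCVFactory()
factory.set_shared(True)
server.get_mount_points().add_factory('/test', factory)
server.attach(None)
GLib.MainLoop().run()   # stream available at rtsp://127.0.0.1:8554/test
```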