gstreamer

GStreamer pipeline: video streaming with delay

℡╲_俬逩灬. submitted on 2019-12-30 10:07:58
Question: Is it possible to introduce some delay before sending the demuxed, H.264-decoded output to autovideosink in a GStreamer pipeline? If so, can anybody post a sample pipeline to do that? The pipeline I used is udpsrc port=5000 ! mpegtsdemux name=demux ! queue ! ffdec_h264 ! ffmpegcolorspace ! autovideosink demux. ! queue ! ffdec_mp3 ! audioconvert ! alsasink In this case, once the stream is received at UDP port 5000 it will immediately start playing after demuxing, queuing and decoding. Is
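One approach that is often suggested for this (an assumption here, not part of the original question) is to let the queue hold data for a fixed time before it starts pushing downstream, using its min-threshold-time property together with unlimited size limits, for example a 3-second hold on the video branch:

    gst-launch udpsrc port=5000 ! mpegtsdemux name=demux ! queue max-size-buffers=0 max-size-bytes=0 max-size-time=0 min-threshold-time=3000000000 ! ffdec_h264 ! ffmpegcolorspace ! autovideosink demux. ! queue ! ffdec_mp3 ! audioconvert ! alsasink

min-threshold-time is in nanoseconds, and the max-size-* limits have to be lifted so the queue can actually accumulate that much data; the audio queue would need the same settings to keep the two branches in sync.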

GStreamer: how to connect dynamic pads

橙三吉。 submitted on 2019-12-30 06:47:15
Question: I'm trying to use GStreamer to play MP4 video from a file. I have managed to play the file using playbin2, and from the command prompt using: gst-launch filesrc location=bbb.mp4 ! decodebin2 ! autovideosink I am expecting that in the future I will need to create more complicated pipelines, which is why I'm attempting to 'program' the pipeline. In my program I am attempting to replicate the pipeline above, however I have an issue which I suspect is related to connecting the dynamic or
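The usual pattern when building this kind of pipeline in code is to link everything that has static pads immediately and defer the decoder-to-sink link to a pad-added callback, because decodebin only creates its source pads once it has inspected the stream. A minimal sketch with the GStreamer 1.0 Python bindings (the question itself is about the 0.10-era decodebin2, so the element names and bindings here are assumptions):

    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst, GLib

    Gst.init(None)
    pipeline = Gst.Pipeline.new('player')
    src = Gst.ElementFactory.make('filesrc', 'src')
    src.set_property('location', 'bbb.mp4')
    dec = Gst.ElementFactory.make('decodebin', 'dec')
    sink = Gst.ElementFactory.make('autovideosink', 'sink')
    for e in (src, dec, sink):
        pipeline.add(e)
    src.link(dec)  # both pads are static, so this link can be made right away

    def on_pad_added(element, pad):
        # called once decodebin has analysed the streams; link video to the sink
        sinkpad = sink.get_static_pad('sink')
        if not sinkpad.is_linked():
            pad.link(sinkpad)

    dec.connect('pad-added', on_pad_added)
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()

An MP4 file will also expose an audio pad; in a real program the callback would check the pad caps and route audio to its own branch instead of ignoring it.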

GStreamer error “assertion 'GST_IS_ELEMENT (src)' failed” when linking elements

走远了吗. submitted on 2019-12-29 09:04:19
Question: I'm working on a GStreamer-based program using Python and the GObject introspection bindings. I'm trying to build this pipeline: videomixer name=mix ! autovideosink \ uridecodebin uri=v4l2:///dev/video0 ! mix. The pipeline works perfectly using gst-launch-1.0, but my Python program gives the errors: (minimal.py:12168): GStreamer-CRITICAL **: gst_element_link_pads_full: assertion 'GST_IS_ELEMENT (src)' failed on_error(): (GError('Internal data flow error.',), 'gstbasesrc.c(2865): gst_base_src
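That assertion usually means the object passed as the link source is not actually a GstElement, most often because ElementFactory.make() or get_by_name() quietly returned None. A hedged sketch of one way to check for that and to link the camera branch to the mixer only once its pad appears (the pad-template name and overall structure are assumptions, not the asker's code):

    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst, GLib

    Gst.init(None)
    pipeline = Gst.parse_launch(
        'videomixer name=mix ! autovideosink '
        'uridecodebin uri=v4l2:///dev/video0 name=dec')

    mix = pipeline.get_by_name('mix')
    dec = pipeline.get_by_name('dec')
    # if either of these is None, any later link call fails with GST_IS_ELEMENT
    assert mix is not None and dec is not None

    def on_pad_added(element, pad):
        # uridecodebin pads show up late; request a mixer input and link then
        sinkpad = mix.get_request_pad('sink_%u')
        pad.link(sinkpad)

    dec.connect('pad-added', on_pad_added)
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()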

GStreamer code for playing AVI file is hanging

こ雲淡風輕ζ submitted on 2019-12-29 06:30:10
Question: I am new to GStreamer. I have written code for playing an AVI file using GStreamer, but on executing the code it just hangs after a while and I am unable to debug what the problem is. Can someone help me please? The code and the output are as below: Code: #include <stdio.h> #include <gst/gst.h> #include <glib.h> // Function to process messages on the bus of the pipeline gboolean process_message(GstBus *bus, GstMessage *msg, gpointer data); // Function to add pad dynamically for ogg demux void dynamic_addpad
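One frequent cause of a hang like this (a guess, since the full code is cut off above) is linking both demuxer branches without putting a queue on each of them; avidemux then blocks as soon as one branch stops consuming data. For comparison, the launch-line equivalent with a queue per branch would look roughly like this, with the codecs assumed to be MPEG-4 video and MP3 audio:

    gst-launch filesrc location=clip.avi ! avidemux name=d d. ! queue ! ffdec_mpeg4 ! ffmpegcolorspace ! autovideosink d. ! queue ! mad ! audioconvert ! alsasink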

GStreamer RTP stream to VLC

只谈情不闲聊 submitted on 2019-12-28 03:35:26
Question: I'm having some trouble figuring out how to create a simple RTP stream with GStreamer and display it in VLC. I've installed GStreamer 0.10.30 and VLC 1.1.3. My only requirement is to use the MPEG-4 or H.264 codecs. Right now, I can stream the GStreamer videotestsrc through this simple pipeline: gst-launch videotestsrc ! ffenc_mpeg4 ! rtpmp4vpay ! udpsink host=127.0.0.1 port=5000 which outputs the "caps" needed by the client to receive the stream: /GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad
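VLC cannot guess the payload format of a bare RTP/UDP stream on its own; the usual way to hand it that information is a small SDP file describing the stream, opened in VLC like any other media. A sketch matching the pipeline above (the values are derived from the pipeline, not from an actual caps printout, so treat them as assumptions):

    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=GStreamer MPEG-4 RTP stream
    c=IN IP4 127.0.0.1
    t=0 0
    m=video 5000 RTP/AVP 96
    a=rtpmap:96 MP4V-ES/90000

For MPEG-4 part 2 the decoder also needs the config string from the caps, normally carried in an extra a=fmtp:96 config=... line taken from the gst-launch -v output.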

GStreamer pipeline to show an RTSP stream

送分小仙女□ submitted on 2019-12-25 09:28:06
Question: I am pretty new to GStreamer. I need to write a video client able to stream data from an RTSP source using GStreamer. I configured VLC to stream a video I have on my laptop using RTSP, and I want to create a pipeline to get that stream and show it. I tried using playbin and everything works fine. The point is that I need to fine-tune the latency used to stream the video, but it seems I cannot do that with playbin. I tried rtspsrc because it allows working on the latency, but I don't know how to
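For reference, a hand-built receive pipeline with an explicit latency setting would look roughly like the line below; the URL and the H.264 depay/decode branch are assumptions about what VLC is serving, not taken from the question:

    gst-launch-1.0 rtspsrc location=rtsp://192.168.1.10:8554/test latency=100 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink

rtspsrc's latency property is the size of its internal jitterbuffer in milliseconds, which is the knob playbin does not expose as a simple property.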

GStreamer RTSP server linking in Qt Creator

别等时光非礼了梦想. submitted on 2019-12-25 08:48:38
Question: I've installed the GStreamer SDK and am trying to compile this code: #include <gst/gst.h> #include <gst/rtsp-server/rtsp-server.h> int main (int argc, char *argv[]) { GMainLoop *loop; GstRTSPServer *server; GstRTSPMediaMapping *mapping; GstRTSPMediaFactory *factory; gst_init (&argc, &argv); loop = g_main_loop_new (NULL, FALSE); server = gst_rtsp_server_new (); mapping = gst_rtsp_server_get_media_mapping (server); factory = gst_rtsp_media_factory_new (); gst_rtsp_media_factory_set_launch (factory, "(
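Since the title is about linking in Qt Creator, the missing piece is usually telling qmake to pull in the GStreamer and RTSP-server libraries via pkg-config in the project's .pro file. A sketch using the GStreamer 1.0 package names (the code above uses the older 0.10 API, whose .pc files are named differently; pkg-config --list-all shows what is actually installed):

    CONFIG += link_pkgconfig
    PKGCONFIG += gstreamer-1.0 gstreamer-rtsp-server-1.0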

Capture stream from Raspberry Pi using GStreamer in OpenCV

血红的双手。 submitted on 2019-12-25 08:07:49
Question: I'm trying to stream video from a Raspberry Pi camera to my PC through a local network. On the Raspberry Pi side, I use GStreamer with the following command: raspivid -n -t 0 -rot 270 -w 960 -h 720 -fps 30 -b 6000000 -o - | gst-launch-1.0 -e -vvvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=5 ! udpsink host=192.168.1.85 port=5000 And I use the following command on the PC side: gst-launch-1.0 -e -v udpsrc port=5000 ! application/x-rtp, payload=96 ! rtpjitterbuffer ! rtph264depay !
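Since the title mentions OpenCV, the usual bridge is to hand a similar receive pipeline, terminated with appsink, to cv2.VideoCapture. A sketch assuming OpenCV was built with GStreamer support; the decoder choice (avdec_h264) is an assumption, any available H.264 decoder would do:

    import cv2

    pipeline = ("udpsrc port=5000 ! application/x-rtp, payload=96 ! "
                "rtpjitterbuffer ! rtph264depay ! avdec_h264 ! "
                "videoconvert ! video/x-raw, format=BGR ! appsink drop=true")

    # CAP_GSTREAMER only works if OpenCV was compiled with GStreamer enabled
    cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("rpi stream", frame)
        if cv2.waitKey(1) == 27:  # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()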

How can I stream an MJPEG file as RTSP

て烟熏妆下的殇ゞ submitted on 2019-12-25 07:24:10
Question: We have an MJPEG video, obtained from a webcam and stored in an *.avi file, still encoded as MJPEG. We need to restream this file as RTSP while still preserving the MJPEG there, i.e. no decoding. The goal is to emulate, for the software that processes the video, the webcam this video was obtained from. The file can be opened with vlc/ffplay with no problems. ffmpeg behaves as if it is streaming it; however, ffplay/vlc can't open this stream. We tried to stream it with GStreamer. 1) we found no
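One way to serve such a file over RTSP without re-encoding is the test-launch example program that ships with gst-rtsp-server, fed with a launch line that demuxes the AVI and payloads the JPEG frames directly (the file name and pipeline below are assumptions; depending on the file a jpegparse element may be needed before the payloader):

    ./test-launch "( filesrc location=webcam.avi ! avidemux ! rtpjpegpay name=pay0 pt=96 )"

By default the resulting stream is then reachable at rtsp://<host>:8554/test.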

Dynamically re-sizing images in a GStreamer pipeline in Python

我的梦境 submitted on 2019-12-25 06:03:32
Question: I am trying to create a program to run various animations on different images simultaneously, and one of the effects I am trying to achieve is zooming into a picture, which is achieved by keeping a base frame of a fixed size while the image size increases and decreases. But when I try to dynamically change the size of an image it causes an error. I tried searching the web but couldn't find the right solution. Below is my code. Could anyone suggest the right examples from which I can learn it
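One way to sidestep mid-stream caps renegotiation is to composite the image onto a fixed-size background and animate the compositor sink pad's width/height properties instead of resizing the image branch itself. A minimal sketch of that idea with the GStreamer 1.0 Python bindings (element choices, file paths and sizes are assumptions, not the asker's code):

    import gi
    gi.require_version('Gst', '1.0')
    from gi.repository import Gst, GLib

    Gst.init(None)
    pipeline = Gst.parse_launch(
        'compositor name=comp background=black ! videoconvert ! autovideosink '
        'uridecodebin uri=file:///tmp/picture.jpg ! imagefreeze ! videoconvert ! comp.')

    pad = pipeline.get_by_name('comp').get_static_pad('sink_0')  # image branch pad
    size = {'w': 160}

    def zoom_step():
        # grow the overlaid image a little on every tick to fake a zoom-in
        size['w'] = min(size['w'] + 16, 960)
        pad.set_property('width', size['w'])
        pad.set_property('height', size['w'] * 3 // 4)
        return size['w'] < 960  # returning False stops the timer

    pipeline.set_state(Gst.State.PLAYING)
    GLib.timeout_add(100, zoom_step)
    GLib.MainLoop().run()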