gstreamer

How can I fix a synchronization problem in GStreamer?

╄→гoц情女王★ submitted on 2019-12-02 08:15:37
In my code, when I open the second VLC client, the first one restarts from the beginning together with the second. Instead it must work like this: while the first client is still running, the second/third/etc. client should start at approximately the same position the first client has currently reached. What should I add to my code to fix this? I am working with Visual Studio on Windows.

```c
#include "pch.h"
#include <iostream>
#include <gst/gst.h>
#include <gst/rtsp-server/rtsp-media.h>
#include <gst/rtsp-server/rtsp-server.h>
#include <gst/rtsp-server/rtsp-media-factory-uri.h>

#define PORT "8554"

static char *port = (char *)PORT;
static
```

Processing RTSP video streams with GStreamer

痴心易碎 submitted on 2019-12-02 05:54:10
Contents: RTSP video stream handling · 1. GStreamer overall framework (1.1 Media Applications, 1.2 Core Framework, 1.3 Plugins) · 2. GStreamer components (2.1 Element, 2.2 Pad, 2.3 Bin and Pipeline) · 3. GStreamer tools (3.1 gst-inspect-1.0, 3.2 gst-launch-1.0) · 4. References

RTSP video stream handling: here GStreamer + OpenCV is used to process RTSP video streams, so this article surveys GStreamer.

1. GStreamer overall framework. GStreamer is an open-source framework for developing streaming multimedia applications. It uses a plugin- and pipeline-based architecture: every functional module in the framework is implemented as a pluggable component that can easily be installed into any pipeline. Because all plugins exchange data uniformly through the pipeline mechanism, a fully featured multimedia application can easily be "assembled" from existing plugins. Nvidia has developed many GStreamer plugins that use Nvidia hardware for acceleration; Nvidia's DeepStream is built on GStreamer. The figure below shows a simple layering of a GStreamer-based application: 1.1 Media Applications
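A minimal sketch of the kind of pipeline the article builds toward: pulling an RTSP stream with gst-launch-1.0 and displaying it. The camera URL is a placeholder, and an H.264-encoded stream is assumed; a real camera may use a different codec.

```shell
# Pull an RTSP stream, depayload and decode H.264, and display it.
# rtsp://192.168.1.10:554/stream is a hypothetical placeholder URL.
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.10:554/stream latency=100 \
    ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```

For OpenCV integration, the display sink at the end is typically swapped for an appsink so frames can be read from application code.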

How to use gst-rtsp-server with my own pipeline?

眉间皱痕 submitted on 2019-12-02 05:45:16
I am writing a GStreamer application and need to transfer the output audio/video stream over RTSP. But in the gst-rtsp-server examples I have found factory creation only via gst-launch syntax:

```c
factory = gst_rtsp_media_factory_new ();
gst_rtsp_media_factory_set_launch (factory,
    "( appsrc name=mysrc ! videoconvert ! x264enc ! rtph264pay name=pay0 pt=96 )");
```

Is it possible to connect gst-rtsp-server elements to my own pipeline?

Answer: You have to subclass rtsp-media-factory and override default_create_element so that it returns your pipeline as a GstElement.

Source: https://stackoverflow.com/questions/22993373/how-to-use-gst

GStreamer appsrc works with xvimagesink but not with theoraenc ! oggmux

三世轮回 submitted on 2019-12-02 04:34:24
I am trying to stream-cast a computer-generated video using GStreamer and Icecast, but I cannot get GStreamer appsrc to work. My app works as expected if I use xvimagesink as the sink (see commented code below). But once I pipe it to theoraenc it does not run. I exchanged shout2send for filesink to check whether the problem was Icecast; the result is that no data is written to the file. Substituting videotestsrc for appsrc works as expected. Any suggestion?

```python
#!/usr/bin/env python
import sys, os, pygtk, gtk, gobject
import pygst
pygst.require("0.10")
import gst
import numpy as np

class GTK_Main:
    def
```
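One way to narrow the failure down, independent of the Python code above, is to confirm the theoraenc ! oggmux path with a known-good source. This is a sketch using the 0.10 tooling the question's pygst version implies; test.ogg is a placeholder filename:

```shell
# If this writes a playable Ogg file, the encoder/muxer path is fine and
# the problem is on the appsrc side (incomplete caps or missing timestamps).
gst-launch-0.10 videotestsrc num-buffers=100 \
    ! video/x-raw-yuv,width=320,height=240,framerate=25/1 \
    ! theoraenc ! oggmux ! filesink location=test.ogg
```

The common reason an appsrc feeds a raw video sink fine but stalls an encoder is that encoders require fully specified caps and timestamped buffers, while xvimagesink is forgiving; setting explicit caps on appsrc and a timestamp/duration on each pushed buffer is the usual fix.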

How to stream H.264 over UDP with GStreamer

a 夏天 submitted on 2019-12-02 04:14:09
Question: I'm trying to stream a video with H.264; the source is an Axis camera. I managed to stream JPEG over multicast, but not H.264. For JPEG I used the following command:

```shell
gst-launch-1.0 udpsrc uri=udp://239.194.0.177:1026 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegdec ! autovideosink
```

I tried to stream H.264, but it fails; I used the following command:

```shell
gst-launch-1.0 -v udpsrc host=239.194.0.177 port=1026 ! rtph264depay ! ffdec_h264 ! xvimagesink
```

I get the following error: ERROR: pipeline
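A hedged guess at a working receiver, fixing three likely problems in the failing command: udpsrc has no host property (the multicast address goes in address, or into a udp:// uri as in the JPEG command), the RTP depayloader needs application/x-rtp caps up front, and ffdec_h264 is a 0.10-era element (the 1.0 equivalent is avdec_h264). The payload number 96 is an assumption; the camera's SDP gives the real value:

```shell
gst-launch-1.0 -v udpsrc address=239.194.0.177 port=1026 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96,clock-rate=90000" \
    ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink
```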

GStreamer: gst_element_factory_make() always fails and returns NULL (Qt5)

感情迁移 submitted on 2019-12-02 04:03:26
Question: My problem is that I cannot create a GStreamer element. I am building a GStreamer project using Qt 5.2.1. What I am doing:

```c
gst_init( NULL, NULL );
GstElement *m_pipeline = gst_pipeline_new ("pipeline1");
GstElement *m_rtspSrc = gst_element_factory_make("rtspsrc", "MyRtspSrc");
```

But gst_element_factory_make always returns NULL. What I have verified: the shared object is in $(libdir)/gstreamer-0.10/ (it is there); gst-inspect-0.10 rtspsrc (it gives details of the plugin); gst-launch-0.10
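When gst-inspect sees a plugin but the application does not, the usual suspect is the environment the application runs in (an IDE such as Qt Creator may not inherit the shell's variables, or the app may link against a different GStreamer version than the plugin). A diagnostic sketch; the plugin directory and binary name are assumptions for this install:

```shell
# Point the process at the 0.10 plugin directory explicitly
# (/usr/lib/gstreamer-0.10 is a typical but assumed location):
export GST_PLUGIN_PATH=/usr/lib/gstreamer-0.10

# Run the app with verbose factory-lookup logging;
# ./myapp is a placeholder for the Qt binary:
GST_DEBUG=GST_ELEMENT_FACTORY:5 ./myapp
```

If the log shows the registry being scanned from an unexpected prefix, the app is picking up a different GStreamer installation than the one gst-inspect-0.10 uses.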

Convert a GStreamer pipeline to OpenCV in Python

馋奶兔 submitted on 2019-12-02 03:50:13
I have created a network stream with the following GStreamer commands.

Sender:

```shell
gst-launch-1.0 -v videotestsrc ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay ! udpsink host=X.X.X.X port=5000
```

Receiver:

```shell
gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! autovideosink
```

This works fine. I now want to include the stream on the receiver side in a Python script. In the script
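The usual route is to hand cv2.VideoCapture a GStreamer pipeline description that ends in appsink, opened with the cv2.CAP_GSTREAMER backend (this assumes OpenCV was built with GStreamer support, which is not true of every binary distribution). The pipeline fragment below mirrors the working gst-launch receiver; it is the string passed to VideoCapture, not a standalone command:

```shell
# Pipeline string for cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER):
# autovideosink is replaced by a conversion to BGR plus an appsink,
# since OpenCV expects BGR frames.
udpsrc port=5000 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96" \
    ! rtph264depay ! decodebin ! videoconvert \
    ! video/x-raw,format=BGR ! appsink drop=true sync=false
```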

Gst RTSP server programming

落爺英雄遲暮 submitted on 2019-12-02 00:29:42
Question: I've installed gst-rtsp-server and I wanted to try some simple code, but on compilation I'm getting the following errors:

```
In function `main':
test-launch01.c:(.text+0x64): undefined reference to `gst_rtsp_server_new'
test-launch01.c:(.text+0x74): undefined reference to `gst_rtsp_server_get_media_mapping'
test-launch01.c:(.text+0x7d): undefined reference to `gst_rtsp_media_factory_new'
test-launch01.c:(.text+0x95): undefined reference to `gst_rtsp_media_factory_set_shared'
test-launch01.c:(.text
```
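Those are linker errors, not compiler errors: the code compiled, but the gst-rtsp-server library is missing from the link line. A sketch using pkg-config; the package name depends on the installed version (gst_rtsp_server_get_media_mapping is 0.10-era API, suggesting gst-rtsp-server-0.10, while modern installs use gstreamer-rtsp-server-1.0):

```shell
gcc test-launch01.c -o test-launch01 \
    $(pkg-config --cflags --libs gstreamer-0.10 gst-rtsp-server-0.10)
```

The key point is that pkg-config must appear with --libs so the -lgstrtspserver-… flag reaches the linker; passing only --cflags fixes compilation but reproduces exactly these undefined references.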

Video Output In Tkinter From GStreamer?

风格不统一 submitted on 2019-12-01 20:51:13
Does anyone know how I would go about using a Tkinter window as the output of a video sink/pipeline from within Python? I have found methods for lots of other GUI toolkits, but I don't want to have to use Tkinter and something else together. Thanks in advance.

This works for me on Windows 32-bit; I get a segfault on Linux and on Windows 64-bit (sorry, I don't know about Mac). You have to use bus.connect("sync-message::element", on_sync_message) and pass a Tk widget ID (winfo_id), as you can see in the following code. The container can be any Tk widget, but a solid black frame seems to work

How to rip the audio from a video?

假装没事ソ submitted on 2019-12-01 18:55:24
Question: I am on Ubuntu and want to convert an MP4 video to an MP3 audio file, but can't figure out how. I tried installing ffmpeg, but it failed to encode the MP3. I've read that GStreamer can do it, but I can't figure out how. I have GStreamer and Python installed. I can program in Python, but am not comfortable compiling software from source or with higher-level command-line work; I only know the basics of the command line.

Answer 1:

```shell
mplayer <videofile> -dumpaudio -dumpfile out.bin
```

it will copy the raw
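Since the asker already has GStreamer, the MP4-to-MP3 conversion can also be sketched as a single gst-launch pipeline. This assumes the LAME encoder and ID3 muxer plugins are installed (they ship in the "good"/"ugly" plugin sets, not the core), and input.mp4/output.mp3 are placeholder filenames:

```shell
# Demux/decode the file, keep only the audio branch,
# re-encode it as MP3, and wrap it with an ID3 tag header.
gst-launch-1.0 filesrc location=input.mp4 ! decodebin \
    ! audioconvert ! lamemp3enc target=bitrate bitrate=192 \
    ! id3v2mux ! filesink location=output.mp3
```

Unlike the mplayer -dumpaudio trick above, which only copies the raw compressed audio track into a .bin file, this actually produces a playable, tagged MP3.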