gstreamer

How to embed video in GTK+ application window using GStreamer & XOverlay?

杀马特。学长 韩版系。学妹 submitted on 2019-12-03 16:05:12
I am trying to write a small media player using GTK+ and GStreamer, and am currently using the XOverlay interface to embed the video in a GtkDrawingArea INSIDE the main window. The program was compiled with:

g++ /home/phongcao/cacao.cc -o /home/phongcao/cacao `pkg-config --cflags --libs gtk+-2.0 gstreamer-0.10 gstreamer-plugins-base-0.10 gstreamer-interfaces-0.10`

The problem is that the video is displayed in a SEPARATE window (instead of under the toolbar of the main window). Here is the source code of the program: #include <gst/interfaces/xoverlay.h> #include <gtk/gtk.h> #include
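A minimal sketch of the usual fix, assuming a GtkDrawingArea named video_area (the names are placeholders, not from the original post): capture the widget's X window ID once it is realized, then intercept the video sink's prepare-xwindow-id message with a synchronous bus handler and hand the sink that XID instead of letting it open its own window.

```c
#include <gst/gst.h>
#include <gst/interfaces/xoverlay.h>
#include <gtk/gtk.h>
#include <gdk/gdkx.h>

static gulong video_window_xid = 0;

/* Grab the X window ID after the drawing area is realized; before
 * realization the widget has no native window yet. */
static void on_realize (GtkWidget *video_area, gpointer data)
{
    video_window_xid = GDK_WINDOW_XID (gtk_widget_get_window (video_area));
}

/* Runs synchronously on the streaming thread: when the sink asks where
 * to render, give it our widget's XID instead of a new window. */
static GstBusSyncReply bus_sync_handler (GstBus *bus, GstMessage *msg,
                                         gpointer data)
{
    if (GST_MESSAGE_TYPE (msg) != GST_MESSAGE_ELEMENT)
        return GST_BUS_PASS;
    if (!gst_structure_has_name (msg->structure, "prepare-xwindow-id"))
        return GST_BUS_PASS;
    if (video_window_xid != 0)
        gst_x_overlay_set_xwindow_id (GST_X_OVERLAY (GST_MESSAGE_SRC (msg)),
                                      video_window_xid);
    return GST_BUS_DROP;
}

/* Wiring (pipeline is whatever pipeline the player builds):
 *   g_signal_connect (video_area, "realize", G_CALLBACK (on_realize), NULL);
 *   GstBus *bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
 *   gst_bus_set_sync_handler (bus, bus_sync_handler, NULL);
 *   gst_object_unref (bus);
 */
```

This is the GStreamer 0.10 pattern matching the pkg-config flags above; setting the XID outside the sync handler risks a race where the sink creates its own window first.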

GStreamer on Android

梦想与她 submitted on 2019-12-03 15:53:26
Can anyone give me any tips on getting GStreamer to work on Android? I have never used it before and I would like to use it with FFmpeg (I already have FFmpeg compiled and working fine on Android). I would just like to use GStreamer to help with some of the processing, as learning the FFmpeg API is somewhat of a nightmare, haha. Thanks in advance for any help!

Kapil Agrawal: Check http://cgit.freedesktop.org/gstreamer/attic/gst-android/ . It could be very useful. Cheers, Kapil. Try this link as well: http://gstreamer.freedesktop.org/wiki/GstreamerAndroid_InstallInstructions and also subscribe

CMake linking problem

懵懂的女人 submitted on 2019-12-03 14:31:23
I am trying to use CMake to compile a C++ application that uses the C library GStreamer. My main.cpp file looks like this:

extern "C" {
#include <gst/gst.h>
#include <glib.h>
}

int main(int argc, char* argv[]) {
    GMainLoop *loop;
    GstElement *pipeline, *source, *demuxer, *decoder, *conv, *sink;
    GstBus *bus;
    /* Initialisation */
    gst_init (&argc, &argv);
    return 0;
}

This works:

g++ -Wall $(pkg-config --cflags --libs gstreamer-0.10) main.cpp -o MPEG4GStreamer

How do I make it with CMake? My CMakeLists.txt file looks like this: cmake_minimum_required (VERSION 2.6) project (MPEG4GStreamer) add
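The standard CMake route is to run pkg-config through the FindPkgConfig module. A sketch of a complete CMakeLists.txt, assuming the project and file names from the question (the GST prefix is an arbitrary choice):

```cmake
# Config sketch: resolve GStreamer 0.10 via pkg-config, mirroring what
# the g++ command above does by hand.
cmake_minimum_required(VERSION 2.6)
project(MPEG4GStreamer)

find_package(PkgConfig REQUIRED)
pkg_check_modules(GST REQUIRED gstreamer-0.10)

include_directories(${GST_INCLUDE_DIRS})
link_directories(${GST_LIBRARY_DIRS})

add_executable(MPEG4GStreamer main.cpp)
target_link_libraries(MPEG4GStreamer ${GST_LIBRARIES})
```

pkg_check_modules fills GST_INCLUDE_DIRS, GST_LIBRARY_DIRS, and GST_LIBRARIES from the same .pc file that pkg-config reads on the command line, so the two builds stay consistent.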

Streaming RTP/RTSP: sync/timestamp problems

人走茶凉 submitted on 2019-12-03 11:47:22
Question: I'm having some trouble streaming H.264 video over RTSP. The goal is to live-stream a camera image to an RTSP client (ideally a browser plugin in the end). This has been working pretty well so far, except for one problem: the video lags on startup, stutters every few seconds, and has a ~4-second delay. This is bad. Our setup is to encode with x264 (with zerolatency & ultrafast) and packetize into RTSP/RTP with libavformat from ffmpeg 0.6.5. For testing, I'm receiving the stream with a GStreamer
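Not from the original post, but a useful first diagnostic on the receiving side: turn the client's jitter buffering down so any remaining delay is attributable to the sender. A hypothetical GStreamer 0.10 test pipeline (the URL is a placeholder):

```shell
# Pull the RTSP stream with minimal client-side buffering: latency=0
# shrinks the jitter buffer and sync=false stops the sink from waiting
# on timestamps, so startup lag seen here points at the sender.
gst-launch-0.10 rtspsrc location=rtsp://example.com/stream latency=0 ! \
    rtph264depay ! ffdec_h264 ! xvimagesink sync=false
```

If this plays with low delay, the ~4-second gap is likely sender-side timestamping or muxer buffering rather than network jitter.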

Green lines in GStreamer video

Anonymous (unverified) submitted on 2019-12-03 10:24:21
Question: I am using GStreamer 1.0 for video playback, and have a custom area set for showing the video via gst_video_overlay_set_window_handle . The problem is that in some videos, green lines are shown at the sides of the video area; they are somehow transparent yet quite distracting because of their color. Do you know what causes them or how I can get rid of them? Would appreciate any help :) regards, tagelicht

Answer 1: I think this bug was reported at https://bugzilla.gnome.org/show_bug.cgi?id=732351 , still unsolved though. I'd recommend you interact there

How to flush gstreamer pipeline

Anonymous (unverified) submitted on 2019-12-03 10:10:24
Question: Case: reading from a file continuously and feeding an appsrc element. Source - appsrc. I have a GStreamer pipeline in the PLAYING state. Now I would like the pipeline to flush / clean when I press a button, meaning the appsrc queue should be cleared. Playback should start from whatever buffers are added after the flush. Issue: the APIs I used returned false; I am not able to flush.

fprintf(stderr, "The flush event start was <%d>", gst_element_send_event(GST_ELEMENT (pipe), gst_event_new_flush_start()));
fprintf(stderr, "The flush
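A flush is a pair of events, not one: flush-start puts the pipeline into flushing mode (discarding data), and flush-stop ends that mode so newly pushed appsrc buffers play again. A sketch assuming the GStreamer 1.0 API (the pipe variable follows the question; in 0.10 gst_event_new_flush_stop takes no argument):

```c
#include <gst/gst.h>
#include <stdio.h>

/* Send flush-start followed by flush-stop. Sending only flush-start
 * leaves the pipeline stuck in flushing mode, which can look like the
 * flush "did not work". */
static void flush_pipeline (GstElement *pipe)
{
    gboolean ok;

    ok = gst_element_send_event (pipe, gst_event_new_flush_start ());
    fprintf (stderr, "flush-start sent: <%d>\n", ok);

    /* TRUE resets the running time so playback resumes cleanly. */
    ok = gst_element_send_event (pipe, gst_event_new_flush_stop (TRUE));
    fprintf (stderr, "flush-stop sent: <%d>\n", ok);
}
```

gst_element_send_event returning FALSE usually means no element in the pipeline handled the event, so it is also worth sending the events directly to appsrc's src pad rather than the whole pipeline.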

Can I use the Gstreamer API to merge 2 videos?

旧城冷巷雨未停 submitted on 2019-12-03 10:09:21
Question: I'd like to write a simple Linux CLI application that can take 2 video sources (one of a presenter talking and one with their slides and no audio) and merge them. I'd like the entire output video to be the two original videos, side by side. Failing that, my second-best option would be a "picture in picture" style video, with the presenter in a small frame in the corner. From a few hours' research, GStreamer looks like it might be able to do this. Can anyone confirm it before I spend more time
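GStreamer can do this: a mixing element places multiple scaled inputs on one canvas. A hypothetical sketch using the 1.x compositor element (file paths and sizes are placeholders; audio handling is omitted):

```shell
# Decode two files, scale each to 640x360, and lay them side by side
# on one canvas via the compositor's per-pad xpos property, then
# re-encode to MP4.
gst-launch-1.0 -e \
  compositor name=mix sink_0::xpos=0 sink_1::xpos=640 \
    ! videoconvert ! x264enc ! mp4mux ! filesink location=side_by_side.mp4 \
  uridecodebin uri=file:///path/presenter.mp4 ! videoscale ! videoconvert \
    ! video/x-raw,width=640,height=360 ! mix.sink_0 \
  uridecodebin uri=file:///path/slides.mp4 ! videoscale ! videoconvert \
    ! video/x-raw,width=640,height=360 ! mix.sink_1
```

Picture-in-picture is the same idea: scale one input smaller and give its pad nonzero xpos/ypos (and a higher zorder) so it sits in a corner over the other.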

Gstreamer 1.0 saving rtsp stream to file

Anonymous (unverified) submitted on 2019-12-03 09:06:55
Question: Hi, I tried to create a pipeline in which I get an RTSP stream, encode it to x264, and save it to MP4 file format, but it doesn't seem to work:

gst-launch-1.0 rtspsrc location=rtsp://ip/url ! videoconvert ! queue ! x264enc ! mp4mux ! filesink location=test.mp4

Answer 1: Okay, I got it:

gst-launch-1.0 rtspsrc location=rtsp://ip/url ! rtph264depay ! h264parse ! mp4mux ! filesink location=file.mp4

Explanation: with rtph264depay we extract the H.264 stream from RTSP (it is already encoded, so no re-encoding is needed), then we parse it with h264parse, use MP4 as the container, and save it with filesink.
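One detail worth adding (an assumption on top of the answer, not part of it): mp4mux needs a clean EOS to write the file's index, so when stopping gst-launch with Ctrl-C it helps to pass -e, which sends EOS through the pipeline before shutting down:

```shell
# Same depay/parse/mux chain; -e makes Ctrl-C produce a playable MP4
# instead of a truncated file with no moov atom.
gst-launch-1.0 -e rtspsrc location=rtsp://ip/url ! rtph264depay ! \
    h264parse ! mp4mux ! filesink location=file.mp4
```

Without -e, an interrupted recording often cannot be played back because the MP4 index is only written at EOS.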

How to include a gstreamer sink in a QML VideoItem?

Anonymous (unverified) submitted on 2019-12-03 09:02:45
Question: I'm trying to integrate a GStreamer video in a Qt app using QML. I've begun with the example qmlplayer2, which uses a remote video:

player->setUri(QLatin1Literal("http://download.blender.org/peach/bigbuckbunny_movies/big_buck_bunny_480p_surround-fix.avi"));

I've modified this example to use a pipeline to get a udpsrc:

m_pipeline = QGst::Pipeline::create();
QGst::ElementPtr udp = QGst::ElementFactory::make(QLatin1Literal("udpsrc"));
udp->setProperty("address", "192.168.1.1");
udp->setProperty("port", 3333);
QGst::ElementPtr decodage =
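A sketch of the general QtGStreamer approach, with the element names and linking being assumptions rather than the poster's code: the QML VideoItem renders whatever a QGst::Quick::VideoSurface produces, so a custom pipeline just has to terminate in surface->videoSink().

```cpp
#include <QGst/Pipeline>
#include <QGst/ElementFactory>
#include <QGst/Quick/VideoSurface>

// Build a udpsrc -> decodebin -> (QML video sink) pipeline. The surface
// is the one exposed to QML as the VideoItem's surface property.
QGst::PipelinePtr buildPipeline(QGst::Quick::VideoSurface *surface)
{
    QGst::PipelinePtr pipeline = QGst::Pipeline::create();

    QGst::ElementPtr udp = QGst::ElementFactory::make("udpsrc");
    udp->setProperty("address", "192.168.1.1");
    udp->setProperty("port", 3333);

    QGst::ElementPtr decode = QGst::ElementFactory::make("decodebin");
    QGst::ElementPtr sink = surface->videoSink();  // feeds the QML VideoItem

    pipeline->add(udp, decode, sink);
    udp->link(decode);
    // decodebin's src pads appear dynamically; in real code, link
    // decode -> sink from a pad-added signal handler.
    return pipeline;
}
```

The key point is replacing the example's autovideosink with the surface's sink element; everything else is ordinary pipeline construction.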

gStreamer Video Recording Memory Leak

Anonymous (unverified) submitted on 2019-12-03 09:02:45
Question: Hi, I am trying to record an RTSP stream coming from a camera (H.264 format). I am using the following gst command to record in MPEG-4 format:

gst-launch -e rtspsrc location=rtsp://10.17.8.136/mediainput/h264 latency=100 ! decodebin ! ffenc_mpeg4 ! avimux ! filesink location=test.mp4

and in H.264 format:

gst-launch-0.10 -e rtspsrc location="rtsp://10.17.8.136/mediainput/h264" latency=100 ! rtph264depay byte-stream=false ! capsfilter caps="video/x-h264,width=1920,height=1080,framerate=(fraction)25/1" ! mp4mux ! filesink location=testh264.mp4

Both are