gstreamer

Installing Gstreamer-1.0 on Mac OS X Mavericks

我们两清 submitted on 2019-12-18 13:22:32
Question: I want to install GStreamer-1.0 on Mac OS X Mavericks, so I have already installed gstreamer-1.0-1.6.0-x86_64.pkg and gstreamer-1.0-devel-1.6.0-x86_64.pkg from here. After that I tried to run something like this: gst-launch-1.0 fakesrc ! fakesink But I got an error: -bash: gst-launch-1.0: command not found So how can I install and use GStreamer-1.0 on Mac OS X Mavericks? Answer 1: From https://stackoverflow.com/a/30873313/1162305 Try installing them with the following commands from your terminal: brew install
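The "command not found" error usually just means the binary is not on PATH. A minimal, runnable sketch of the diagnosis (my assumption, not from the answer: the framework .pkg reportedly installs under /Library/Frameworks/GStreamer.framework rather than a PATH directory):

```python
import os
import shutil

# Hypothetical helper: look on PATH first, then in the framework's bin dir.
# The framework location is an assumption; verify it on your machine.
FRAMEWORK_BIN = "/Library/Frameworks/GStreamer.framework/Versions/1.0/bin"

def find_gst_launch():
    found = shutil.which("gst-launch-1.0")
    if found:
        return found
    candidate = os.path.join(FRAMEWORK_BIN, "gst-launch-1.0")
    return candidate if os.path.isfile(candidate) else None

print(find_gst_launch())  # None unless GStreamer is actually installed
```

If the helper finds the binary only in the framework directory, appending that directory to PATH in your shell profile should make `gst-launch-1.0` resolvable.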

Gstreamer tcpserversink v0.10 vs 1.0 and HTML5 video tag

南楼画角 submitted on 2019-12-18 12:37:12
Question: I am embedding an HTML5 video tag in my site, the source being a GStreamer stream. I have a pipeline working on gst 0.10: gst-launch-0.10 -v videotestsrc ! theoraenc ! oggmux ! queue ! tcpserversink port=8080 sync-method=2 I can connect to this stream via VLC like so: vlc tcp://localhost:8080 And I can also use the URL in an HTML5 video tag, and the video is displayed as expected. Now I am trying to adapt this for gst 1.0: gst-launch-1.0 -v videotestsrc ! theoraenc ! oggmux ! queue ! tcpserversink
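For reference, the 1.0 command can be assembled from the same elements as the 0.10 one; a sketch (element names are from the question, the string-building is mine; whether 1.0's tcpserversink still exposes sync-method should be checked with gst-inspect-1.0):

```python
# Build the gst-launch description from the 0.10 pipeline's elements.
# Whether a browser accepts the resulting stream also depends on the Ogg
# mux producing a streamable container (assumption, not verified here).
ELEMENTS = [
    "videotestsrc",
    "theoraenc",
    "oggmux",
    "queue",
    "tcpserversink port=8080 sync-method=2",
]

def launch_command(version="1.0"):
    return f"gst-launch-{version} -v " + " ! ".join(ELEMENTS)

print(launch_command())
```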

How to write opencv mat to gstreamer pipeline?

一个人想着一个人 submitted on 2019-12-18 10:48:11
Question: I want to add some OpenCV processing to a GStreamer pipeline and then send it over udpsink. I'm able to read frames from GStreamer like this: // may add some plugins to the pipeline later cv::VideoCapture cap("v4l2src ! video/x-raw, framerate=30/1, width=640, height=480, format=RGB ! videoconvert ! appsink"); cv::Mat frame; while(true){ cap >> frame; // do some processing to the frame } But what I can't figure out is how to pass the processed frame to the following pipeline: appsrc ! x264enc !
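One common approach (my assumption, not stated in the excerpt) is to open a second GStreamer pipeline through OpenCV's VideoWriter, with the description starting at appsrc and ending in udpsink. A sketch of the writer-side pipeline string, with placeholder host/port:

```python
# Sketch: build the writer-side description. When passed to VideoWriter
# with the GStreamer backend, the string must begin with appsrc, which
# is where OpenCV pushes the processed frames.
def writer_pipeline(host="127.0.0.1", port=5000):
    return (
        "appsrc ! videoconvert ! x264enc tune=zerolatency ! "
        f"rtph264pay ! udpsink host={host} port={port}"
    )

print(writer_pipeline())

# Usage with OpenCV built with GStreamer support (not runnable here):
# writer = cv2.VideoWriter(writer_pipeline(), cv2.CAP_GSTREAMER,
#                          0, 30.0, (640, 480))
# writer.write(frame)
```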

How to resume playing after paused using gstreamer?

回眸只為那壹抹淺笑 submitted on 2019-12-18 09:36:30
Question: I've written C++ wrappers for each GStreamer type. They're simple and intuitive, so I don't think their implementation needs to be posted here (though I could post them, maybe on GitHub, if the need arises). The problem I'm facing is that I start playing a video (and simultaneously saving it to a file using the gst tee element)...and while it is playing, I pause (from a different thread), which works great. However, when I want to resume it, it doesn't work: void pause() { _pipeline.state(GST
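The usual resume pattern is simply another state change, this time to PLAYING. A minimal runnable sketch of the intended transitions (hypothetical wrapper, not the asker's actual classes; a stub stands in for Gst.Pipeline):

```python
PAUSED, PLAYING = "PAUSED", "PLAYING"

class StubPipeline:
    """Stands in for Gst.Pipeline; the real set_state() returns a
    StateChangeReturn (SUCCESS / ASYNC / FAILURE)."""
    def __init__(self):
        self.state = "NULL"
    def set_state(self, state):
        self.state = state
        return "SUCCESS"

class Player:
    def __init__(self, pipeline):
        self._pipeline = pipeline
    def pause(self):
        return self._pipeline.set_state(PAUSED)
    def resume(self):
        # To resume, request PLAYING again; with a real pipeline an ASYNC
        # return is normal and should not be treated as failure.
        return self._pipeline.set_state(PLAYING)

player = Player(StubPipeline())
player.pause()
player.resume()
print(player._pipeline.state)  # PLAYING
```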

Using Gstreamer to capture screen and show it in a window?

≡放荡痞女 submitted on 2019-12-18 06:49:47
Question: I need to capture the screen of the second display and "monitor" it in the main display, inside a window (scaled by 0.5 and with nearest-neighbor interpolation, because I prefer performance over quality). From this link, I've got this screencast command: gst-launch ximagesrc ! ffmpegcolorspace ! queue \ ! vp8enc quality=10 speed=2 ! mux. alsasrc ! audio/x-raw-int ! queue \ ! audioconvert ! vorbisenc ! mux. webmmux name=mux \ ! filesink location=screencast.webm ... but it captures to a file (not a
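For live monitoring rather than recording, the encode-and-mux tail can be replaced by a scale step and a window sink. A sketch of the 1.0-style pipeline string (my adaptation, not from the question; the videoscale method nick should be verified with gst-inspect-1.0):

```python
# Build a preview pipeline: capture, convert, scale to a fraction of the
# source size via a caps filter, then render into a window.
def preview_pipeline(src_w, src_h, scale=0.5):
    w, h = int(src_w * scale), int(src_h * scale)
    return (
        "ximagesrc use-damage=false ! videoconvert ! "
        "videoscale method=nearest-neighbour ! "
        f"video/x-raw,width={w},height={h} ! autovideosink"
    )

print(preview_pipeline(1920, 1080))
```

Selecting the second display (e.g. via ximagesrc's display or xid properties) is left out; check gst-inspect-1.0 ximagesrc for the exact property names.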

How to implement a video widget in Qt that builds upon GStreamer?

百般思念 submitted on 2019-12-17 16:26:22
Question: I want to use Qt to create a simple GUI application that can play a local video file. I could use Phonon, which does all the work behind the scenes, but I need a little more control. I have already succeeded in implementing a GStreamer pipeline using the decodebin and autovideosink elements. Now I want to channel the output to a Qt widget. Has anyone ever succeeded in doing this? (I suppose so, since there are Qt-based video players that build upon GStreamer.) Can someone point
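One common embedding pattern (my assumption; the real API is GStreamer's video-overlay interface) is to watch the bus for the sink's "prepare-window-handle" message and hand it the Qt widget's winId(). A runnable sketch of that control flow, with stubs standing in for Gst and Qt:

```python
class StubMessage:
    def __init__(self, name, src):
        self.name, self.src = name, src

class StubSink:
    """Stands in for a video sink implementing GstVideoOverlay."""
    def __init__(self):
        self.handle = None
    def set_window_handle(self, handle):
        self.handle = handle

def on_sync_message(message, win_id):
    # In real code this runs as the bus's sync handler, on the streaming
    # thread, and win_id comes from the QWidget's winId().
    if message.name == "prepare-window-handle":
        message.src.set_window_handle(win_id)

sink = StubSink()
on_sync_message(StubMessage("prepare-window-handle", sink), win_id=42)
print(sink.handle)  # 42
```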

Write opencv frames into gstreamer rtsp server pipeline

此生再无相见时 submitted on 2019-12-17 15:46:23
Question: I'm trying to put OpenCV images into a GStreamer RTSP server in Python. I have some issues writing in the mediafactory; I'm new to gst-rtsp-server and there's little documentation, so I don't know exactly if I'm using the right approach. I'm using a thread to start the MainLoop, and I'm using the main thread to create a buffer to push into the appsrc element of the mediafactory pipeline. Am I using the right approach to achieve my objective? Can anyone help me? My code is below: from threading
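Independent of the threading question, a detail that often breaks appsrc feeding is buffer timing: every pushed buffer needs a pts and duration in nanoseconds. A sketch of that arithmetic (Gst itself is not imported, so the numbers stay runnable; GST_SECOND is 10**9 nanoseconds):

```python
GST_SECOND = 10**9  # GStreamer clock times are in nanoseconds

def buffer_timing(frame_index, fps=30):
    """Return (pts, duration) in nanoseconds for the given frame."""
    duration = GST_SECOND // fps
    pts = frame_index * duration
    return pts, duration

print(buffer_timing(0))   # (0, 33333333)
print(buffer_timing(30))  # roughly one second in
```

In a real factory, these values would be assigned to the Gst.Buffer's pts and duration fields inside the appsrc "need-data" callback before pushing the buffer.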

Using custom camera in OpenCV (via GStreamer)

笑着哭i submitted on 2019-12-17 10:46:40
Question: I'm using a Nitrogen6x board with an ov5640 camera (MIPI). The camera does not use standard v4l/v4l2, but we can stream video using GStreamer through its driver (mfw_v4l): gst-launch mfw_v4lsrc ! autovideosink I want to use the camera in OpenCV by calling it via GStreamer (GStreamer inside OpenCV). I asked a question about calling GStreamer inside OpenCV here, and this is the follow-up. If I enable GStreamer support, it's checked in the source code, but OpenCV tries to use standard V4L/V4L2 for
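To bypass OpenCV's V4L backend entirely, the whole GStreamer description can be passed to VideoCapture; for OpenCV to pull frames, the pipeline must terminate in appsink (my assumption, consistent with the capture examples above). A sketch:

```python
# Build a capture description for OpenCV's GStreamer backend.
# mfw_v4lsrc is the board's Freescale capture element from the question.
def capture_pipeline(src="mfw_v4lsrc"):
    return f"{src} ! videoconvert ! appsink"

print(capture_pipeline())

# Usage (needs OpenCV built with GStreamer support; on newer OpenCV the
# backend can be forced explicitly):
# cap = cv2.VideoCapture(capture_pipeline(), cv2.CAP_GSTREAMER)
```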

Loading shared libs that depend on other shared libs

北城余情 submitted on 2019-12-17 10:43:10
Question: Problem: I am building an Android app in Eclipse which uses the shared lib libgstreamer-0.10.so (GStreamer-android NDK Bundle libs compiled for the android-8 platform). I made a new folder, libs/armeabi, in the project root folder and put the lib there. I have also put all the other libs that came with it (158 of them) in the same folder. If I put this in my main activity code: static{ System.loadLibrary("gstreamer-0.10"); } and build/install/run my app on an Android-8 emulator, it throws this error: 06-15 21:54:00.835:
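The usual explanation (an assumption drawn from common answers, not from the truncated excerpt) is that older Android linkers do not resolve an app library's own dependencies, so every dependency must be loaded with System.loadLibrary() first, in dependency order. That order is a topological sort of the dependency graph; a sketch with a hypothetical, simplified dependency map:

```python
# Compute a load order in which every library appears after its dependencies.
def load_order(deps):
    order, seen = [], set()
    def visit(lib):
        if lib in seen:
            return
        seen.add(lib)
        for d in deps.get(lib, []):
            visit(d)
        order.append(lib)
    for lib in deps:
        visit(lib)
    return order

# Hypothetical subset of the 158 bundled libs and their dependencies.
DEPS = {
    "gstreamer-0.10": ["glib-2.0", "gobject-2.0"],
    "gobject-2.0": ["glib-2.0"],
    "glib-2.0": [],
}
print(load_order(DEPS))  # dependencies first, gstreamer-0.10 last
```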

Gstreamer examples in Android Studio

独自空忆成欢 submitted on 2019-12-17 07:49:24
Question: I have been trying to get GStreamer working in Android Studio, following their tutorials; see for example here: https://gstreamer.freedesktop.org/documentation/tutorials/android/link-against-gstreamer.html But in the latest Android Studio there is no jni/Android.mk. Where do I put the code at the end of that web page? Should it go in CMakeLists.txt? Or should something different go in there? Or do I just make an Android.mk file, and if so, where (as there is no jni folder, only a cpp
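One possibility (a sketch, not verified against the tutorial's specific Makefile) is to keep the tutorial's Android.mk approach: create the jni folder and Android.mk yourself, then point Gradle's ndkBuild at it instead of using CMake. A hypothetical module-level build.gradle fragment:

```groovy
// Hypothetical fragment: tell the Android Gradle plugin to drive
// ndk-build with a hand-made app/jni/Android.mk instead of CMake.
android {
    externalNativeBuild {
        ndkBuild {
            path 'jni/Android.mk'
        }
    }
}
```

A module can use either ndkBuild or cmake for externalNativeBuild, not both; translating the tutorial's Android.mk rules into CMakeLists.txt is the alternative route.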