gstreamer

How to create video thumbnails with Python and Gstreamer

核能气质少年 submitted on 2019-11-29 09:56:01
I'd like to create thumbnails for MPEG-4 AVC videos using GStreamer and Python. Essentially:

1. Open the video file
2. Seek to a certain point in time (e.g. 5 seconds)
3. Grab the frame at that time
4. Save the frame to disk as a .jpg file

I've been looking at this other similar question, but I cannot quite figure out how to do the seek and frame capture automatically without user input. So in summary, how can I capture a video thumbnail with GStreamer and Python as per the steps above?

daf: To elaborate on ensonic's answer, here's an example:

    import os
    import sys
    import gst

    def get_frame(path, offset=5,
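The four steps above boil down to a pipeline description plus a seek. The helper below only constructs the description string one would hand to the parse-launch API; it is a sketch, not the answerer's get_frame. The element names (filesrc, decodebin, videoconvert, jpegenc, filesink) are standard GStreamer 1.0 elements, and jpegenc's snapshot property (emit one frame, then EOS) is assumed to be available; actually running the pipeline and seeking to the 5-second mark still requires the GStreamer bindings.

```python
# Sketch: build a gst-launch-style description for a one-frame thumbnail.
# To honour the "seek to 5 seconds" step, the real code would pause the
# pipeline and call seek_simple() before letting the frame through.
def thumbnail_pipeline(video_path, jpeg_path):
    """Return a pipeline description that decodes one frame and saves it
    as a JPEG file."""
    return (
        "filesrc location=%s ! decodebin ! videoconvert "
        "! jpegenc snapshot=true ! filesink location=%s"
        % (video_path, jpeg_path)
    )

print(thumbnail_pipeline("movie.mp4", "thumb.jpg"))
```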

gstreamer appsrc test application

主宰稳场 submitted on 2019-11-29 08:42:12
I am trying to learn the GStreamer appsrc plugin by playing A/V from a transport-stream demultiplexer that I wrote (I know plugins are already available; I wanted to do it myself to learn). I have extracted the audio and video elementary streams from the MPEG transport stream; now I have to push them to the appsrc plugin and play them using a gst pipeline (this part is not yet clear to me, namely which plugins to use; any tips will be highly appreciated). I found some sample code on using appsrc, but when I run it, there is no output. I verified that the start_feed and read_data functions are indeed invoked. In
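The demultiplexer side the question describes starts with parsing 188-byte transport-stream packets. The sketch below is not the asker's code; it parses just the 4-byte TS header per the ISO/IEC 13818-1 field layout, and the function and variable names are my own. The parsed payload would then be handed to appsrc (in C, via gst_app_src_push_buffer).

```python
# Parse the 4-byte header of one 188-byte MPEG transport-stream packet.
TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def parse_ts_header(packet):
    """Return (pid, payload_unit_start, has_payload) for one TS packet."""
    if len(packet) != TS_PACKET_SIZE or packet[0] != SYNC_BYTE:
        raise ValueError("not a valid TS packet")
    pid = ((packet[1] & 0x1F) << 8) | packet[2]          # 13-bit PID
    pusi = bool(packet[1] & 0x40)                        # start of a PES packet
    has_payload = bool(packet[3] & 0x10)                 # adaptation field flags
    return pid, pusi, has_payload

pkt = bytes([0x47, 0x41, 0x00, 0x10]) + bytes(184)
print(parse_ts_header(pkt))  # (256, True, True)
```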

Unable to build GStreamer tutorials using Android Studio

纵然是瞬间 submitted on 2019-11-29 07:35:42
I am trying to build the tutorials that are bundled with gstreamer-sdk-android-arm-debug-2013.6. The Android.mk file in the src/jni directory (tutorial 1 project) references environment variables such as GSTREAMER_SDK_ROOT. From what I have read, Android Studio does not use or pass environment variables to the build scripts. Is there a best practice for modifying makefiles and for defining/retrieving the key/value pairs required by the build scripts?

OK, I have a working solution. You CAN pass environment variables to ndk-build (or any other process spawned by a gradle Exec task). In my case, I wanted
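The working approach the answer alludes to can be sketched as a gradle Exec task that sets the variable explicitly instead of relying on the shell environment. The task name and SDK path below are placeholders, not the answerer's actual build file.

```groovy
// build.gradle sketch: hand GSTREAMER_SDK_ROOT to ndk-build explicitly,
// since Android Studio does not forward shell environment variables.
task buildNative(type: Exec) {
    workingDir 'src/jni'
    // Placeholder path: point this at your own SDK installation.
    environment 'GSTREAMER_SDK_ROOT', '/path/to/gstreamer-sdk-android-arm-debug-2013.6'
    commandLine 'ndk-build'   // assumes ndk-build is on the PATH
}
```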

GStreamer Basic Tutorial 01: Hello World

混江龙づ霸主 submitted on 2019-11-29 05:50:33
Goal

For a software library, nothing makes a more direct first impression than printing Hello World on the screen. But since we are dealing with a multimedia framework, we will play a video instead of a Hello World. Don't be put off by the code below: only four lines actually do the work. The rest is resource management; that is just the cost of writing C. Without further ado, get ready for your first GStreamer application...

Hello World

Copy the following code into a text file and name it basic-tutorial-1.c:

    #include <gst/gst.h>

    int main (int argc, char *argv[]) {
      GstElement *pipeline;
      GstBus *bus;
      GstMessage *msg;

      /* Initialize GStreamer */
      gst_init (&argc, &argv);

      /* Build the pipeline */
      pipeline = gst_parse_launch ("playbin2 uri=http://docs.gstreamer.com/media/sintel_trailer-480p.webm", NULL);

      /* Start playing */
      gst_element_set_state (pipeline, GST_STATE_PLAYING);

      /* Wait until error or EOS */
      bus = gst_element_get_bus (pipeline);
      msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE, GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

      /* Free resources */
      if (msg != NULL)
        gst_message_unref (msg);
      gst_object_unref (bus);
      gst_element_set_state (pipeline, GST_STATE_NULL);
      gst_object_unref (pipeline);
      return 0;
    }
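The official tutorial this excerpt translates builds the program with gcc and pkg-config; for the 0.10-era SDK that playbin2 belongs to, the compile command looks like:

```shell
gcc basic-tutorial-1.c -o basic-tutorial-1 `pkg-config --cflags --libs gstreamer-0.10`
```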

gstreamer code for playing avi file is hanging

末鹿安然 submitted on 2019-11-29 05:12:32
I am new to gstreamer. I have written code to play an AVI file using GStreamer, but on executing it, it just hangs after a while and I am unable to debug what the problem is. Can someone help me please? The code and the output are below.

Code:

    #include <stdio.h>
    #include <gst/gst.h>
    #include <glib.h>

    /* Function to process messages on the pipeline's bus */
    gboolean process_message (GstBus *bus, GstMessage *msg, gpointer data);

    /* Function to add a pad dynamically for the ogg demux */
    void dynamic_addpad (GstElement *element, GstPad *pad, gpointer data);
    void dynamic_decodepad (GstElement *object, GstPad *arg0,
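Before debugging the C code, it can help to confirm that an equivalent pipeline plays the file at all, letting decodebin do the dynamic pad-linking that the hand-written callbacks attempt. This is a sketch with GStreamer 1.0 element names and a placeholder file name, not the asker's pipeline:

```shell
# decodebin creates and links audio/video pads dynamically; if this also
# hangs, the problem is the media or the sinks rather than the C code.
gst-launch-1.0 filesrc location=sample.avi ! decodebin name=d \
    d. ! queue ! videoconvert ! autovideosink \
    d. ! queue ! audioconvert ! autoaudiosink
```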

Adding and removing audio sources to/from GStreamer pipeline on-the-go

风格不统一 submitted on 2019-11-29 00:19:49
I wrote a little Python script which uses an Adder plugin to mix two source streams together. After starting the program, you hear a 1 kHz tone generated by the audiotestsrc plugin. When you press Enter, another 500 Hz test tone is connected to the Adder, so you hear them together. (By the way, I don't really get why I should set the pipeline to the playing state again here to hear the mix. Is there any way I can plug in new sources without having to restart the pipeline?) When you press Enter once again, the 1 kHz tone should be removed from the mix and the 500 Hz tone should keep playing, but
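The mixing topology itself can be reproduced with gst-launch before scripting it. This is a sketch with GStreamer 1.0 element names (frequencies as in the question), not the asker's Python:

```shell
# Two test tones mixed by one adder via its request sink pads.  A branch
# added at runtime normally only needs its own elements synced to the
# pipeline's state (e.g. gst_element_sync_state_with_parent), not a full
# pipeline restart.
gst-launch-1.0 adder name=mix ! audioconvert ! autoaudiosink \
    audiotestsrc freq=1000 ! mix. \
    audiotestsrc freq=500  ! mix.
```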

Capturing h.264 stream from camera with Gstreamer

半腔热情 submitted on 2019-11-28 21:34:41
I'm trying to capture an H.264 stream from a locally installed Logitech C920 camera at /dev/video0 with the GStreamer 1.0 v4l2src element. v4l2-ctl --list-formats shows that the camera is capable of delivering the H264 video format:

    # v4l2-ctl --list-formats
    ioctl: VIDIOC_ENUM_FMT
    ...
    Index       : 1
    Type        : Video Capture
    Pixel Format: 'H264' (compressed)
    Name        : H.264
    ...

But the pipeline

    # gst-launch-1.0 -vvv v4l2src device=/dev/video0 ! video/x-h264, width=800, height=448, framerate=30/1 ! fakesink

keeps giving me a not-negotiated (-4) error:

    /GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-h264, width=
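One common cause of not-negotiated here is a caps field the driver does not offer in exactly that combination of width, height and framerate. A first diagnostic step is to drop the fields and only ask for H.264, with h264parse making the stream's caps explicit downstream. This is a sketch, not a guaranteed fix for this camera:

```shell
# Request only the format and let v4l2src negotiate the rest; -v prints
# the caps it actually settled on, which shows which field was rejected.
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-h264 ! h264parse ! fakesink
```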

gstreamer python bindings for windows

牧云@^-^@ submitted on 2019-11-28 20:54:26
I am looking into GStreamer as a means to choose a video device from a list and feed it to an OpenCV script. I absolutely do not understand how to use GStreamer with Python on Windows. I installed the Windows gstreamer 1.07 binaries from the official GStreamer website. However, I could not import the pygst and gst modules in Python:

    >>> import pygst
    Traceback (most recent call last):
      File "<pyshell#0>", line 1, in <module>
        import pygst
    ImportError: No module named pygst
    >>>

I checked the GStreamer installation, and there seems to be no pygst.py provided. There is however a file named gst-env
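The missing pygst module belongs to the old 0.10 bindings, which the 1.x Windows installers do not ship; GStreamer 1.x is used from Python through PyGObject (the gi module) instead. A small probe, sketched below, reports which binding, if any, is importable; it degrades gracefully to None when neither is installed.

```python
# Probe for GStreamer Python bindings: the modern gi/PyGObject route
# first, then the legacy pygst route.  Returns a short tag or None.
def detect_gst_bindings():
    try:
        import gi
        gi.require_version("Gst", "1.0")   # raises ValueError if Gst 1.0 is absent
        from gi.repository import Gst      # noqa: F401
        return "gi (GStreamer 1.x)"
    except (ImportError, ValueError):
        pass
    try:
        import pygst                       # noqa: F401
        return "pygst (GStreamer 0.10)"
    except ImportError:
        return None

print(detect_gst_bindings())
```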

Adding opencv processing to gstreamer application

孤街醉人 submitted on 2019-11-28 20:36:24
Question: I'm trying to do the following: receive a video stream using GStreamer and process it with OpenCV. I've found a few solutions, and one of them is to write video from GStreamer into a FIFO and then read it using OpenCV (OPTION3 in "MJPEG streaming and decoding"). The problem is I can't open the pipe: cvCreateFileCapture just never returns. Here is part of the code I wrote:

    if (mkfifo("fifo.avi", S_IRUSR | S_IWUSR) == -1) {
        cout << "Cant create fifo" << endl;
        cout << errno << endl;
    }
    loop = g_main_loop_new(NULL,
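The hang is consistent with POSIX FIFO semantics: open() on a FIFO blocks until the other end is opened too, so cvCreateFileCapture waits forever if the GStreamer side has not yet opened the write end. A stdlib-only sketch (POSIX systems; the names are mine) showing a reader unblocking only once a writer attaches:

```python
import os
import stat
import tempfile
import threading

# Reproduce the FIFO behaviour behind the hang: the reader's open()
# blocks until some process opens the same FIFO for writing.
path = os.path.join(tempfile.mkdtemp(), "fifo.avi")
os.mkfifo(path, stat.S_IRUSR | stat.S_IWUSR)

def writer():
    with open(path, "wb") as f:   # opening the write end releases the reader
        f.write(b"data")

t = threading.Thread(target=writer)
t.start()
with open(path, "rb") as f:       # would block forever without the writer
    print(f.read())               # b'data'
t.join()
```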

Stream H.264 video over rtp using gstreamer

主宰稳场 submitted on 2019-11-28 18:09:44
Question: I am a newbie with gstreamer and I am trying to get used to it. My first target is to create a simple RTP stream of H.264 video between two devices. I am using these two pipelines:

Sender:

    gst-launch-1.0 -v filesrc location=c:\\tmp\\sample_h264.mov ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000

Receiver:

    gst-launch-1.0 -v udpsrc port=5000 ! rtpmp2tdepay ! decodebin ! autovideosink

But with the first one (the sender) I got the following error:

    Setting pipeline to PAUSED ...
    Pipeline
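Two things stand out in these pipelines: the sender feeds raw file bytes from a .mov container straight into x264enc, which expects raw video, and the receiver depayloads with rtpmp2tdepay (MPEG-TS over RTP) although the sender payloads plain H.264 with rtph264pay; the receiver's udpsrc also lacks the caps RTP depayloaders need. A corrected sketch, assuming the .mov already contains H.264 so it can be demuxed instead of re-encoded (file name is a placeholder; not verified against the asker's file):

```shell
# Sender: demux the existing H.264 track and parse it for the payloader.
gst-launch-1.0 -v filesrc location=sample_h264.mov ! qtdemux ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5000

# Receiver: caps tell udpsrc what RTP stream to expect; the depayloader
# now matches the sender's payloader.
gst-launch-1.0 -v udpsrc port=5000 caps="application/x-rtp,media=video,encoding-name=H264,payload=96" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink
```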