gstreamer

Gstreamer message to signal new frame from video source (webcam)

你离开我真会死。 Submitted on 2019-12-10 15:28:49

Question: I am trying to save a stream from a webcam as a series of images using GStreamer. This is the code I have written so far:

    #!/usr/bin/python
    import sys, os
    import pygtk, gtk, gobject
    import pygst
    pygst.require("0.10")
    import gst

    def __init__(self):
        #....
        # Code to create a gtk Window
        #....
        self.player = gst.Pipeline("player")
        source = gst.element_factory_make("v4l2src", "video-source")
        sink = gst.element_factory_make("xvimagesink", "video-output")
        caps = gst.Caps("video/x-raw-yuv, width=640, height=480"
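
The snippet above targets the legacy pygst 0.10 bindings. A minimal sketch of one way to get a per-frame notification, using the current GStreamer 1.0 GObject-Introspection bindings and an identity element whose "handoff" signal fires once per buffer; the element names and the v4l2src/ximagesink choices are illustrative:

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)

    # identity emits "handoff" for every buffer that passes through it.
    pipeline = Gst.parse_launch(
        "v4l2src ! identity name=probe signal-handoffs=true "
        "! videoconvert ! ximagesink"
    )

    def on_handoff(element, buf):
        # Called once per frame; buf.pts is the presentation timestamp.
        print("new frame, pts =", buf.pts)

    pipeline.get_by_name("probe").connect("handoff", on_handoff)
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()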

GstBuffer to color Mat

匆匆过客 Submitted on 2019-12-10 11:57:49

Question:

    GstBuffer *gstImageBuffer = gst_app_sink_pull_buffer((GstAppSink*)app_data.gst_data.sink);
    Mat matLeft = Mat(Size(width, height), CV_8U, (char*)GST_BUFFER_DATA(gstImageBuffer));

I can pull the buffer and convert it to an OpenCV Mat object, but only in grayscale. I want to pull the buffer in color. I know my stream is in color, because the same pipeline works in the terminal. I've searched for a solution without success. I am using gstreamer-0.10 and opencv-2.4.10.

Answer 1: Finally with the help of this post
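
For reference, a minimal sketch of the same idea against GStreamer 1.0 and NumPy: force BGR caps in front of an appsink so the pulled buffer maps directly onto a 3-channel OpenCV-style image. (The snippet above comes out grayscale because it wraps the Y plane of YUV data as single-channel CV_8U.) The v4l2src source and element names are assumptions:

    import numpy as np
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)
    pipeline = Gst.parse_launch(
        "v4l2src ! videoconvert ! video/x-raw,format=BGR "
        "! appsink name=sink max-buffers=1 drop=true"
    )
    sink = pipeline.get_by_name("sink")
    pipeline.set_state(Gst.State.PLAYING)

    sample = sink.emit("pull-sample")          # blocks until a frame arrives
    caps = sample.get_caps().get_structure(0)
    w, h = caps.get_value("width"), caps.get_value("height")
    buf = sample.get_buffer()
    # Three interleaved bytes per pixel, directly usable with cv2 functions.
    frame = np.ndarray((h, w, 3), dtype=np.uint8,
                       buffer=buf.extract_dup(0, buf.get_size()))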

Gstreamer: rtsp server: authentication in another thread

六月ゝ 毕业季﹏ Submitted on 2019-12-10 11:57:11

Question: My application returns authentication data in another thread. I use:

    auth = gst_rtsp_auth_new();
    GstRTSPAuthClass* klass = GST_RTSP_AUTH_GET_CLASS(auth);
    klass->authenticate = authentificateAndAuthorizeAsync;
    ...
    gboolean authentificateAndAuthorizeAsync(GstRTSPAuth *auth, GstRTSPContext *ctx)
    {
        /* can send the required answer in another thread */
        return true;
    }

How can I use asynchronous authentication without blocking the first thread? C++ tools like condition_variables and future/promise
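
The authenticate vfunc is synchronous, so one common shape is to keep the hook blocking but bounded: hand the credential lookup to a worker and wait on a future with a timeout, so only the thread serving that client waits. A sketch of that future/promise pattern, in Python for brevity (the same shape maps onto C++ std::promise/std::future); lookup_credentials is a hypothetical stand-in for the real back end:

    from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

    executor = ThreadPoolExecutor(max_workers=4)

    def lookup_credentials(user, password):
        # Stand-in for a database or remote-service query.
        return user == "admin" and password == "secret"

    def authenticate(user, password, timeout=5.0):
        future = executor.submit(lookup_credentials, user, password)
        try:
            # Blocks only the thread handling this client, for at most
            # `timeout` seconds.
            return future.result(timeout=timeout)
        except FutureTimeout:
            return False  # treat a slow back end as an authentication failure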

Gstreamer in Android. UDP stream

≯℡__Kan透↙ Submitted on 2019-12-10 11:05:40

Question: I have set up my Raspberry Pi with the camera board. The code on the Pi is:

    raspivid -t 999999 -h 720 -w 1080 -fps 25 -b 2000000 -o - | gst-launch-0.10 -v fdsrc fd=0 ! h264parse ! rtph264pay ! udpsink host=192.168.2.1 port=5000

and then I run on my Mac:

    gst-launch-1.0 -v udpsrc port=5000 ! application/x-rtp,payload=96,media=video,clock-rate=90000,encoding-name=H264,sprop-parameter-sets=\"J2QAH6wrQCIC3y8A8SJq\\,KO4CXLA\\=\" ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false

The
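
For reference, the same receiving pipeline embedded in a small Python program via Gst.parse_launch. This is a sketch only: the sprop-parameter-sets value is omitted here and may need to be copied from the sender's -v output into the caps if the H264 stream does not carry in-band SPS/PPS:

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    pipeline = Gst.parse_launch(
        'udpsrc port=5000 caps="application/x-rtp,media=video,'
        'clock-rate=90000,encoding-name=H264,payload=96" '
        "! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false"
    )
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()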

Gstreamer multifilesink wav files splitting

徘徊边缘 Submitted on 2019-12-10 10:48:50

Question: I have a problem recording streams with GStreamer. I have to write audio and video separately and cut when a signal arrives. The video works correctly, but I still have problems with the wav files. Even a simple pipeline in gst-launch doesn't work correctly. I have a wave file and I am trying to split it using multifilesink:

    gst-launch filesrc location=test.wav ! multifilesink location=test2%d.wav next-file=4 max-file-size=512000

But the resulting wav files are corrupted, while the same pipeline with ts
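
This failure mode is expected: a WAV file carries a single RIFF header at the start, so multifilesink's raw byte-splitting leaves every file after the first one headerless, whereas MPEG-TS is self-contained at the packet level and survives cutting. A sketch of one workaround using Python's stdlib wave module, which writes a valid header for every chunk; the output name pattern mirrors the pipeline above and the chunk length is a placeholder:

    import wave

    def split_wav(path, seconds_per_file=30):
        """Split a PCM WAV file into chunks that each get a valid RIFF header."""
        with wave.open(path, "rb") as src:
            params = src.getparams()
            frames_per_chunk = src.getframerate() * seconds_per_file
            index = 0
            while True:
                frames = src.readframes(frames_per_chunk)
                if not frames:
                    break
                with wave.open("test2%d.wav" % index, "wb") as dst:
                    dst.setparams(params)   # header is patched on close
                    dst.writeframes(frames)
                index += 1

    split_wav("test.wav")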

gstreamer playbin - setting uri on windows

回眸只為那壹抹淺笑 Submitted on 2019-12-10 10:41:01

Question: I am trying to play some audio files with the CLI example on this site: http://pygstdocs.berlios.de/pygst-tutorial/playbin.html I am on Windows and it gives an error while reading the file. I specified the following path:

    $ python cliplayer.py C:\\voice.mp3
    0:00:00.125000000  3788  009DA010 ERROR  basesrc gstbasesrc.c:2834:gst_base_src_activate_pull:<source> Failed to start in pull mode
    Error: Could not open resource for reading. ..\..\..
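
A likely cause is that playbin is handed a bare Windows path rather than a file:// URI. A minimal sketch using the GStreamer 1.0 Python bindings (the tutorial above targets pygst 0.10); the path is the one from the question:

    import pathlib
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    # On Windows this yields file:///C:/voice.mp3
    uri = pathlib.Path(r"C:\voice.mp3").as_uri()

    player = Gst.ElementFactory.make("playbin", "player")
    player.set_property("uri", uri)
    player.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()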

gstreamer: how to shift the time of rendering of one stream taken from file

浪尽此生 Submitted on 2019-12-10 10:37:47

Question: I have two media files (say, "file0" and "file1") and I want to merge them into a single one with a picture-in-picture effect: the content from "file0" is displayed in the whole window, and the content from "file1" is shown in a smaller box in the top-left corner. One more point is that the content from "file1" should be rendered some time after the base time, at the point marked "X1" on the diagram below. In other words, if I take "videotestsrc" as a video source input, I
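
A sketch of one way to approach both requirements with GStreamer 1.0's compositor: the inset geometry comes from the mixer's sink-pad properties, and the delayed start of the second stream from a pad offset. The file URIs, inset size, and the 5-second offset are placeholders, and whether a pad offset matches the exact "X1" semantics intended here would need testing:

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    pipeline = Gst.parse_launch(
        "compositor name=mix ! videoconvert ! autovideosink "
        "uridecodebin uri=file:///path/to/file0 ! videoconvert ! mix.sink_0 "
        "uridecodebin uri=file:///path/to/file1 ! videoconvert ! mix.sink_1"
    )
    inset = pipeline.get_by_name("mix").get_static_pad("sink_1")
    inset.set_property("xpos", 0)      # top-left corner
    inset.set_property("ypos", 0)
    inset.set_property("width", 320)   # smaller box
    inset.set_property("height", 240)
    inset.set_offset(5 * Gst.SECOND)   # shift rendering of file1 by 5 s
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()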

.so files with version numerals: how to match them with find_library in CMake? Error linking shared objects found as sub-dependencies

喜欢而已 Submitted on 2019-12-10 10:36:11

Question: Given that `ls -lrt /usr/lib/libvpx*` results in:

    lrwxrwxrwx 1 root root     15 Feb 9 2012 /usr/lib/libvpx.so.1.0 -> libvpx.so.1.0.0
    lrwxrwxrwx 1 root root     15 Feb 9 2012 /usr/lib/libvpx.so.1 -> libvpx.so.1.0.0
    -rw-r--r-- 1 root root 646120 Feb 9 2012 /usr/lib/libvpx.so.1.0.0

and `ls -lrt /usr/lib/libschroedinger*` results in:

    lrwxrwxrwx 1 root root     29 Feb 8 2012 /usr/lib/libschroedinger-1.0.so.0 -> libschroedinger-1.0.so.0.11.0
    -rw-r--r-- 1 root root 774044 Feb 8 2012 /usr/lib/libschroedinger-1.0.so.0.11.0

ls -lrt /usr

Combining an audio and video stream using gstreamer [closed]

放肆的年华 Submitted on 2019-12-10 04:28:49

Question: I am streaming an mp4 (MPEG-4) file from one device to another using GStreamer over an RTP stream. Basically, I split the mp4 file into its audio and video streams and send both to the other device, where they are streamed. Now I want to save the mp4 file to disk on the other device, but my problem is
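
A sketch of the receiving side's video branch: depayload the H264 RTP stream and remux it into an MP4 on disk instead of playing it. The port, payload number, and caps are assumptions that must match the sender; an audio branch would attach to "mux." the same way, with its RTP caps copied from the sender's -v output. Note that mp4mux only produces a playable file after a clean EOS:

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    pipeline = Gst.parse_launch(
        'udpsrc port=5000 caps="application/x-rtp,media=video,'
        'clock-rate=90000,encoding-name=H264,payload=96" '
        "! rtph264depay ! h264parse ! mp4mux name=mux "
        "! filesink location=received.mp4"
    )
    pipeline.set_state(Gst.State.PLAYING)
    loop = GLib.MainLoop()
    try:
        loop.run()
    except KeyboardInterrupt:
        # Finalize the MP4: mp4mux writes its index on EOS
        # (the equivalent of gst-launch's -e flag).
        pipeline.send_event(Gst.Event.new_eos())
        pipeline.get_bus().timed_pop_filtered(
            Gst.CLOCK_TIME_NONE, Gst.MessageType.EOS)
    pipeline.set_state(Gst.State.NULL)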

Is it possible to do appsrc--ximagesink setup in Gstreamer

你离开我真会死。 Submitted on 2019-12-10 00:44:39

Question: I want to stream some random bytes to GStreamer and display them as follows:

    [Rand Bytes]--[Video source=appsrc]--[Video sink=ximagesink]

The following Python code, which I found in this SO post, works:

    source = gst.element_factory_make("appsrc", "source")
    caps = gst.Caps("video/x-raw-gray,bpp=16,endianness=1234,width=320,height=240,framerate=(fraction)10/1")
    source.set_property('caps', caps)
    source.set_property('blocksize', 320*240*2)
    source.connect('need-data', self.genRandBytes)
    colorspace = gst
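
An appsrc-to-ximagesink pipeline is indeed a normal setup. A self-contained sketch against GStreamer 1.0 (the snippet above is pygst 0.10) that pushes random RGB frames; the 320x240 geometry mirrors the question, and videoconvert bridges to whatever format ximagesink accepts:

    import os
    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst, GLib

    Gst.init(None)
    WIDTH, HEIGHT = 320, 240
    pipeline = Gst.parse_launch(
        "appsrc name=src is-live=true do-timestamp=true format=time "
        'caps="video/x-raw,format=RGB,width=320,height=240,framerate=10/1" '
        "! videoconvert ! ximagesink"
    )
    src = pipeline.get_by_name("src")

    def on_need_data(source, amount):
        # Push one frame of random RGB bytes whenever appsrc asks for data.
        data = os.urandom(WIDTH * HEIGHT * 3)
        source.emit("push-buffer", Gst.Buffer.new_wrapped(data))

    src.connect("need-data", on_need_data)
    pipeline.set_state(Gst.State.PLAYING)
    GLib.MainLoop().run()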