gstreamer

openCV VideoCapture doesn't work with gstreamer x264

…衆ロ難τιáo~ submitted on 2019-12-23 18:08:49
Question: I'd like to display an RTP/VP8 video stream that comes from GStreamer in OpenCV. I already have a working solution, implemented like this:

gst-launch-0.10 udpsrc port=6666 ! "application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)VP8-DRAFT-IETF-01,payload=(int)120" ! rtpvp8depay ! vp8dec ! ffmpegcolorspace ! ffenc_mpeg4 ! filesink location=videoStream

Basically it grabs incoming data from a UDP socket, depacketizes the RTP, decodes the VP8, and passes it to ffmpegcolorspace
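For reference, the receiving side could be sketched as a GStreamer 1.0 pipeline ending in appsink, the element OpenCV's VideoCapture reads from when built with GStreamer support. The port comes from the question; the 1.0-series element names (videoconvert in place of ffmpegcolorspace) and the simplified caps string are assumptions. The command is only echoed here so it can be inspected before running:

```shell
# Sketch only: a GStreamer 1.0 receiver handing decoded frames to appsink,
# which cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER) can then read from.
# Port 6666 is from the question; caps and element names are assumptions.
RECV_PIPELINE='udpsrc port=6666 caps="application/x-rtp,media=video,clock-rate=90000,encoding-name=VP8,payload=120" ! rtpvp8depay ! vp8dec ! videoconvert ! appsink'
echo "$RECV_PIPELINE"
```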

Gstreamer-sharp running on Linux

不问归期 submitted on 2019-12-23 05:40:10
Question: Since I'm currently working on a GStreamer binding for my project and got the information that gstreamer-sharp 0.99.x only works with glib-sharp 2.99.x, I thought it a good idea to create a test project. So I downloaded the MonoDevelop add-in named "GTK# 3 Simple Project Template" and compiled it -> worked. Then I added the gstreamer-sharp 0.99 DLL I had compiled before and added the following code: Gst.Application.Init(); Element music = Parse.Launch("playbin uri=\"file:////media

Use gst-launch to output a video frame to certain position on framebuffer

自作多情 submitted on 2019-12-23 05:29:25
Question: Currently we are using the command below to play a video clip:

gst-launch filesrc location=/media/sda1/mpeg4_640x480.mp4 ! decodebin2 name=dec ! queue ! ffmpegcolorspace ! videoscale ! video/x-raw-rgb,width=320,height=240 ! fbdevsink dec. ! queue ! audioconvert ! autoaudiosink

The video frame is resized to 320x240 and output to the framebuffer. However, we'd like to place the video frame at a certain (x, y). Is that possible?

Answer 1: Try using the "videobox" element. In the top, left, bottom, and
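The videobox suggestion can be sketched like this: negative border values make videobox add padding, which effectively shifts the video to an (x, y) offset inside a larger frame. The offsets below (x=100, y=50) are made-up values for illustration, and the command is echoed rather than executed:

```shell
# Sketch: place the scaled 320x240 video at offset (100, 50) by padding with
# videobox (negative left/top values add a border). Offsets are illustrative,
# not taken from the original question.
PIPELINE='filesrc location=/media/sda1/mpeg4_640x480.mp4 ! decodebin2 ! ffmpegcolorspace ! videoscale ! video/x-raw-rgb,width=320,height=240 ! videobox left=-100 top=-50 ! fbdevsink'
echo "gst-launch $PIPELINE"
```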

GST_DEBUG: How to save logs in a separate file for a thread inside an application

醉酒当歌 submitted on 2019-12-23 04:53:20
Question: I am running a GStreamer sample program which is called from a C++ application as a thread. I have set GST_DEBUG=*:5 to capture all possible scenarios. The application prints lots and lots of logs on stdout, and the GStreamer thread does the same (level 5 adds to the misery). Question: is there a way to separate out the log output of the GStreamer thread into a file, at the given debug level? Supplementary question: set the GST_DEBUG_FILE based on answer
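One approach is the GST_DEBUG_FILE environment variable, which redirects GStreamer's debug log away from stderr into a file of your choosing. Note it redirects all GStreamer logging in the process, not a single thread, but since the application's own logging goes to stdout the two streams end up separated. The path below is a placeholder; a minimal sketch:

```shell
# Route GStreamer's own debug log (at level 5) into a separate file so it no
# longer interleaves with the application's stdout logging. Set these before
# the process that calls gst_init() starts. The log path is hypothetical.
export GST_DEBUG='*:5'
export GST_DEBUG_FILE='/tmp/gstreamer-thread.log'
```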

Recording RTSP stream

我是研究僧i submitted on 2019-12-23 04:31:32
Question: I want to record video data coming from a camera (over RTSP, H.264). Can anybody help me record the RTSP stream using GStreamer? (Please provide gst-launch command-line details.) The recording will be in MPEG4 format. Regards, Kiran

Answer 1: This will stream the video and output it to your screen:

gst-launch rtspsrc location=rtsp://some.server/url ! decodebin ! xvimagesink
gst-launch uridecodebin uri=rtsp://some.server/url ! xvimagesink

To record the stream to your drive using MPEG4: gst-launch rtspsrc
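The recording command in the answer is cut off above. A hypothetical reconstruction of what such a 0.10-era command could look like: depacketize the H.264, decode it, re-encode to MPEG-4 part 2, and mux into AVI. The element choices, URL, and output filename are all assumptions, and the command is echoed for inspection rather than run:

```shell
# Hypothetical sketch (GStreamer 0.10 syntax) of recording an RTSP H.264
# stream as MPEG-4: depacketize, decode, re-encode, mux to AVI on disk.
# rtsp://some.server/url and out.avi are placeholders.
REC='rtspsrc location=rtsp://some.server/url ! rtph264depay ! ffdec_h264 ! ffenc_mpeg4 ! avimux ! filesink location=out.avi'
echo "gst-launch $REC"
```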

Sending eos in gstreamer after pre-defined time using new_single_shot_id

China☆狼群 submitted on 2019-12-23 04:24:29
Question: I have a GStreamer application where I am creating a video from images. I need to create the video for a predefined time, and I would like to send EOS after that time. I know this can be achieved using new_single_shot_id in GstClock, but I could not find any example of how to use new_single_shot_id to create a trigger bound to a function that sends EOS to the pipeline. My simplified pipeline code is like this: class Main(object): def __init__(self, location): self.pipeline =

Unable to play .wav file using gstreamer apis

為{幸葍}努か submitted on 2019-12-23 02:17:32
Question: The following code is written to play a .wav file, but it doesn't seem to work. I would like to know if I am missing something in it. Code:

#include <gst/gst.h>
#include <glib.h>

int main(int argc, char *argv[])
{
    GMainLoop *loop;
    GstElement *source, *audioparser, *sink, *pipeline;
    GstBus *bus;

    gst_init(&argc, &argv);

    // create a pipeline
    loop = g_main_loop_new(NULL, FALSE);
    pipeline = gst_pipeline_new("wav-player");
    source = gst_element_factory_make("filesrc", "file-source");
    audioparser = gst
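For comparison, the usual shape of a .wav playback pipeline on the command line is filesrc ! wavparse ! audioconvert ! autoaudiosink, which is presumably what the C code above is building element by element (the truncated "audioparser" is likely wavparse, though that is an assumption). A sketch with a placeholder file path, echoed rather than run:

```shell
# Sketch of the .wav playback pipeline the C program builds element-by-element.
# wavparse handles the RIFF header; audioconvert adapts the raw PCM to the
# sink's format. /path/to/sound.wav is a placeholder.
WAV_PIPELINE='filesrc location=/path/to/sound.wav ! wavparse ! audioconvert ! autoaudiosink'
echo "gst-launch-1.0 $WAV_PIPELINE"
```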

How to convert raw BGRA image to JPG using GStreamer 1.0?

依然范特西╮ submitted on 2019-12-23 01:46:31
Question: I'm trying to display a raw image (1.8 MB) with gst-launch-1.0. I understand that the data needs to be encoded to JPG before this can be achieved. If the image were already stored as a JPG file, the story would be quite simple:

gst-launch-1.0.exe -v filesrc location=output.jpg ! decodebin ! imagefreeze ! autovideosink

However, I need to assemble a pipeline to display a raw BGRA 800x600 image (which looks the same as the one above) that was dumped to disk by a 3D application. This is what I've done so
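One plausible approach in GStreamer 1.0: read the raw dump with filesrc, describe it with rawvideoparse (format, width, and height must be given explicitly, since a headerless raw file carries no caps), then encode with jpegenc. Note 800x600 BGRA is 800*600*4 = 1,920,000 bytes, about 1.8 MB, which matches the size in the question. The filename is a placeholder and the exact property names should be checked against your GStreamer version; the command is echoed for inspection:

```shell
# Sketch: wrap a headerless 800x600 BGRA dump in caps and encode it to JPEG.
# raw.bgra and output.jpg are placeholder filenames; blocksize matches one
# full frame (800*600*4 = 1920000 bytes).
RAW_PIPELINE='filesrc location=raw.bgra blocksize=1920000 ! rawvideoparse format=bgra width=800 height=600 ! videoconvert ! jpegenc ! filesink location=output.jpg'
echo "gst-launch-1.0 $RAW_PIPELINE"
```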

H264 RTP stream with gstreamer-1.0

孤街浪徒 submitted on 2019-12-22 18:34:41
Question: I am trying to make an H264 RTP stream from a Raspberry Pi 3 with a camera module to a video tag. I use the following command to start the stream:

raspivid -t 0 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | \
gst-launch-1.0 -v fdsrc \
! h264parse \
! rtph264pay \
! gdppay \
! udpsink host="192.168.0.11" port=5000

Then I provide a simple web page with a video tag:

<video id="videoTag" src="h264.sdp" autoplay>
  <p class="warning">Your browser does not support the video tag.</p>
</video>

The src references
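One caveat worth noting: gdppay wraps the RTP packets in GStreamer's own GDP framing, which a browser's <video> tag cannot consume, so an SDP file alone will not make this stream playable. As a first sanity check, a GStreamer receiver mirroring the sender can be sketched (port from the question; avdec_h264 assumes gst-libav is installed). The command is echoed rather than run:

```shell
# Sketch of a receiving pipeline matching the sender above: undo the GDP
# framing, depacketize RTP, parse and decode H.264, and display.
RECV='udpsrc port=5000 ! gdpdepay ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink'
echo "gst-launch-1.0 $RECV"
```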