gstreamer

GStreamer Notes

最后都变了 · submitted on 2019-12-15 09:56:41
I. Introduction to GStreamer

GStreamer is a framework: flexible and lightweight. The first part of the documentation is not hard as long as you can read English. From what I have seen so far, GStreamer really does simplify loading dynamic libraries and making modules cooperate. It still feels a little unfamiliar to use, though, perhaps because I have not yet adapted to the GLib style. GStreamer as a whole is divided into:

- core: the core library
- base plugins: a set of very basic plugins
- good plugins: well-written plugins released under the LGPL
- bad plugins: plugins that still need improvement
- other libraries

1.1 Core library

The core library knows nothing about media. It is only a framework that ties all the elements together; the element is the central concept in GStreamer.

II. Basic concepts

2.1 Element

Elements are the components a pipeline is built from. Each element is in effect a plugin; in GStreamer they are assembled into a pipe, and data flows from the source element to the sink element, completing the whole flow. Elements can be linked together (and must be linked in order to form a pipe).

2.2 Pad

A pad is an element's input or output port; only through pads can two elements be linked. On the input side a pad is a socket; on the output side it is a plug. Pads have their own capabilities, so a pad's capabilities constrain the data it can carry: only pads with compatible capabilities can be linked together.

- The process of agreeing on capabilities is called caps negotiation.
- The data-type description is a GstCaps.

2.3 Bins and pipelines

A bin is a collection of elements
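The caps-negotiation idea above can be sketched with a toy model. This is plain Python with no GStreamer required, and the class and format strings are invented for illustration: two pads can link only if the intersection of their capability sets is non-empty.

```python
# Toy model of GStreamer caps negotiation (illustration only; real
# GStreamer uses GstCaps structures, not Python sets).

class Pad:
    def __init__(self, name, caps):
        self.name = name
        self.caps = set(caps)  # formats this pad can produce or consume

def negotiate(src_pad, sink_pad):
    """Return the agreed formats, or raise if the pads are incompatible."""
    common = src_pad.caps & sink_pad.caps
    if not common:
        raise ValueError(f"not-linked: {src_pad.name} / {sink_pad.name}")
    return common

src = Pad("decoder.src", {"video/x-raw,I420", "video/x-raw,NV12"})
sink = Pad("xvimagesink.sink", {"video/x-raw,I420", "video/x-raw,YUY2"})
print(negotiate(src, sink))  # the one format both pads support
```

An empty intersection is exactly the "not-linked" failure that several of the questions below run into.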

Targeting Qt child widget with gstreamer

社会主义新天地 · submitted on 2019-12-14 04:05:06
Question: I have a GStreamer pipeline which ends with an xvimagesink element. To have the video displayed in a particular window, I can use the x_overlay interface: gst_x_overlay_set_xwindow_id(GST_X_OVERLAY(xvsink), winid); So far, so good. However, it only works if winid is the id of a top-level window, which is not the case for a child widget. Let's say I have: A dialog widget DialogWidget A video widget VideoWidget, which is a child of DialogWidget. If I use DialogWidget->winId(), then the video

Playing an rtp stream on android published with gstreamer

谁说胖子不能爱 · submitted on 2019-12-14 03:44:35
Question: I'm trying to set up an RTP connection between a microphone on a desktop PC and an Android smartphone. I grab the data using GStreamer. Because other applications on the same system use this microphone at the same time, there is a tcpserversink to which the data is published. This is done with this call: gst-launch-0.10 -v alsasrc ! 'audio/x-raw-int, depth=16, width=16, \ endianness=1234, channels=1, rate=16000' ! \ tcpserversink host=localhost port=20000 then I create a second stream, which
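The caps in that pipeline pin down exactly how much data flows into the tcpserversink. A quick sanity check of the implied byte rate (plain arithmetic, no GStreamer needed):

```python
# Raw-audio byte rate implied by the caps in the pipeline above:
# audio/x-raw-int, width=16 (bits per sample), channels=1, rate=16000 Hz.
width_bits = 16
channels = 1
rate_hz = 16000

bytes_per_second = rate_hz * channels * width_bits // 8
print(bytes_per_second)  # 32000 bytes/s flowing into tcpserversink
```

Any client reading from port 20000 must consume at this rate (and agree on the caps out-of-band, since raw TCP carries no format description).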

GStreamer How to extract video frame from the flow?

五迷三道 · submitted on 2019-12-14 03:42:49
Question: This is Python code for capturing streaming video from a server, but I need to write a function to extract one frame from the flow. It will be a button; on click it will show the current frame. I have no idea how. Can anyone help me with this? self.player = gst.Pipeline("player") self.source = gst.element_factory_make("uridecodebin", "video-source") #self.source = gst.element_factory_make("playbin2", "video-source") sink = gst.element_factory_make("xvimagesink", "video-output") colorspace = gst
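In GStreamer the usual answer is a pad probe or the sink's last buffer, but the underlying idea is simple: once video is decoded, a frame is a fixed-size slice of the raw byte stream. A format-level illustration with hypothetical geometry (plain Python, no GStreamer):

```python
# Locating one frame inside a raw (uncompressed) video byte stream.
# Hypothetical geometry: 4x2 pixels, 3 bytes per pixel (RGB).
width, height, bpp = 4, 2, 3
frame_size = width * height * bpp  # 24 bytes per frame

# A fake "flow" of three consecutive identical frames.
stream = bytes(range(frame_size)) * 3

def extract_frame(data, n, frame_size):
    """Return frame n (0-based) as a bytes slice of the stream."""
    start = n * frame_size
    return data[start:start + frame_size]

frame1 = extract_frame(stream, 1, frame_size)
print(len(frame1))  # one complete 24-byte frame
```

In the real pipeline the width, height, and pixel format come from the negotiated caps on the sink pad, not from constants.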

Network streaming using Gstreamer

安稳与你 · submitted on 2019-12-14 02:16:51
Question: I tried the following basic pipelines to play audio over a network: Server: gst-launch-0.10 -v audiotestsrc ! udpsink host=127.0.0.1 port=1234 Client: gst-launch-0.10 -v udpsrc port=1234 ! fakesink dump=1 But I get no output, although the pipeline gets set to the PLAYING state. I looked at other questions such as this one: Webcam streaming using gstreamer over UDP Although it's the same pipeline there too, it doesn't work for me. What am I doing wrong? Source: https://stackoverflow.com/questions
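udpsink and udpsrc are thin wrappers over ordinary UDP sockets, so when a pipeline like this stays silent it helps to confirm the transport itself works on 127.0.0.1. A minimal loopback with Python's standard library (a diagnostic sketch, not a GStreamer fix):

```python
import socket

# Minimal UDP loopback on 127.0.0.1, the transport underneath
# udpsink/udpsrc. If this fails, the problem is below GStreamer.
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))        # let the OS pick a free port
port = recv_sock.getsockname()[1]

send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"\x00" * 64, ("127.0.0.1", port))  # one 64-byte datagram

data, addr = recv_sock.recvfrom(2048)
print(len(data))  # the datagram arrives intact
recv_sock.close()
send_sock.close()
```

If the loopback works, the GStreamer-level suspects are the usual ones: the receiver must be started before judging output, and raw UDP carries no caps, so the receiving pipeline normally needs an explicit caps filter on udpsrc.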

GStreamer RTP packet size

时光怂恿深爱的人放手 · submitted on 2019-12-13 19:31:52
Question: I'm running the following GStreamer command: gst-launch-1.0 -v filesrc location=audiofile.mp3 ! mad ! audioconvert ! rtpL16pay mtu=1024 ! udpsink port=5005 host=127.0.0.1 This sets up an RTP stream with a maximum packet size (Maximum Transmission Unit) of 1024 bytes. When I run this stream, I end up getting a sequence of 4 packets of size 1024 followed by 1 packet of size 572. This sequence is repeated for the duration of the file. Why is this happening, and is there any way to ensure a
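The 4 × 1024 + 1 × 572 pattern follows from how the payloader splits each decoded buffer. Assuming the mad output is 16-bit stereo, one MPEG-1 Layer III frame decodes to 1152 samples × 2 channels × 2 bytes = 4608 bytes; with mtu=1024 and a fixed 12-byte RTP header, each full packet carries 1012 payload bytes:

```python
# Why mtu=1024 on rtpL16pay yields 4 packets of 1024 bytes plus one of 572
# per decoded MP3 frame (assuming a stereo file).
SAMPLES_PER_MP3_FRAME = 1152   # samples per MPEG-1 Layer III frame
CHANNELS = 2                   # assumption: stereo source
BYTES_PER_SAMPLE = 2           # L16 = 16-bit linear PCM
RTP_HEADER = 12                # fixed RTP header, no CSRCs or extensions
MTU = 1024

buffer_size = SAMPLES_PER_MP3_FRAME * CHANNELS * BYTES_PER_SAMPLE  # 4608
payload_per_packet = MTU - RTP_HEADER                              # 1012

full_packets, remainder = divmod(buffer_size, payload_per_packet)
packet_sizes = [MTU] * full_packets + [remainder + RTP_HEADER]
print(packet_sizes)  # [1024, 1024, 1024, 1024, 572]
```

Each decoded buffer is fragmented independently, which is why the short tail packet repeats for the whole file rather than being merged into the next buffer.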

Gstreamer, rtspsrc and payload type

折月煮酒 · submitted on 2019-12-13 15:14:22
Question: I'm having difficulty retrieving an RTSP stream from a specific camera, because the RTP payload type the camera provides is 35 (unassigned), while the payload types accepted by the rtph264depay plugin are in the range [96-127]. The result is that GStreamer displays an error like: <udpsrc0> error: Internal data flow error. <udpsrc0> error: streaming task paused, reason not-linked (-1) Other cameras that I have tested work because they define a proper payload type. FFmpeg, MPlayer and other
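The payload type lives in the low 7 bits of the second RTP header byte (the top bit of that byte is the marker flag). A quick check with made-up header bytes shows why PT 35 falls outside the dynamic range [96, 127] that the depayloader's sink caps advertise:

```python
# Extract the RTP payload type (low 7 bits of header byte 1) and test it
# against the dynamic payload-type range [96, 127].
def payload_type(packet: bytes) -> int:
    return packet[1] & 0x7F  # bit 7 is the marker flag, bits 0-6 the PT

def in_dynamic_range(pt: int) -> bool:
    return 96 <= pt <= 127

camera_hdr = bytes([0x80, 35, 0, 1])   # hypothetical header: PT = 35
normal_hdr = bytes([0x80, 96, 0, 1])   # typical dynamic H.264 PT = 96

print(payload_type(camera_hdr), in_dynamic_range(payload_type(camera_hdr)))
print(payload_type(normal_hdr), in_dynamic_range(payload_type(normal_hdr)))
```

Because the camera's packets never match the depayloader's caps, the pads cannot negotiate, which is exactly the "not-linked" failure in the error above.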

OpenCV error - cannot put pipeline to play in function CvVideoWriter_GStreamer::open

旧城冷巷雨未停 · submitted on 2019-12-13 12:35:31
Question: I am trying to create VideoWriter objects in OpenCV to write frames from my webcam into a file. However, I'm getting the following error on this line: name1 = "videos/cam1.avi" out = cv2.VideoWriter(name1, cv2.cv.CV_FOURCC('M','J','P','G'), 20.0, (1920,1080)) Error output: File "TestWebcam.py", line 14, in takeImage out = cv2.VideoWriter(name1,cv2.cv.CV_FOURCC('M','J','P','G'), 20.0, (1920,1080)) cv2.error: /home/odroid/software/opencv/opencv-2.4.13/modules/highgui/src/cap_gstreamer.cpp:1528:
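For context, CV_FOURCC itself is only bit-packing of four characters into a little-endian 32-bit code; reproducing it without OpenCV shows what codec the writer is being asked for. (This does not by itself fix the error, which comes from the GStreamer backend failing to build a writing pipeline for the requested codec, size, and output path.)

```python
# CV_FOURCC packs four characters into one little-endian 32-bit code.
def fourcc(c1, c2, c3, c4):
    return ord(c1) | (ord(c2) << 8) | (ord(c3) << 16) | (ord(c4) << 24)

mjpg = fourcc('M', 'J', 'P', 'G')
print(hex(mjpg))  # the integer OpenCV passes to its backend
```

Typical causes of the pipeline failure with a correct FOURCC are a missing output directory (here, videos/) or a missing GStreamer encoder plugin on the system.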

image data as source in gstreamer [closed]

做~自己de王妃 · submitted on 2019-12-13 09:49:58
Question: I want to make a GStreamer application which takes image data as its source instead of a file location. My intention is to display an image received over TCP. When an image (a byte array) arrives over TCP, it should be passed to the GStreamer plugin directly, without saving it locally. Answer 1: The multifilesrc should do
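Whatever source element is chosen on the GStreamer side (multifilesrc or an appsrc fed from memory), the bytes arriving over TCP need framing so each image can be recovered whole from the stream. A common scheme, sketched here with Python's standard library only (the function names and fake payloads are invented), is a 4-byte length prefix:

```python
import struct

# Length-prefixed framing for image byte arrays sent over TCP:
# a 4-byte big-endian length, then the payload.
def pack_image(data: bytes) -> bytes:
    return struct.pack(">I", len(data)) + data

def unpack_images(stream: bytes):
    """Yield each complete image payload found in the byte stream."""
    offset = 0
    while offset + 4 <= len(stream):
        (length,) = struct.unpack_from(">I", stream, offset)
        offset += 4
        yield stream[offset:offset + length]
        offset += length

wire = pack_image(b"\xff\xd8fake-jpeg-1") + pack_image(b"\xff\xd8fake-jpeg-2")
print([img[:2] for img in unpack_images(wire)])  # JPEG SOI markers
```

Each recovered payload can then be pushed into the pipeline without ever touching the filesystem.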

Rotate a Video in gstreamer

半城伤御伤魂 · submitted on 2019-12-13 08:27:35
Question: I have this pipeline to record from two webcams simultaneously: gst-launch-1.0 -v v4l2src device=/dev/video0 num-buffers=300\ ! "video/x-raw,width=800,height=600,framerate=30/1" ! videorate\ ! "video/x-raw,framerate=30/1" ! jpegenc ! queue ! mux. \ pulsesrc device="alsa_input.pci-0000_00_1b.0.analog-stereo" \ ! 'audio/x-raw,rate=88200,channels=1,depth=24' ! audioconvert ! \ avenc_aac compliance=experimental ! queue ! mux. matroskamux name="mux"\ ! filesink location=/home/sina/T1.avi v4l2src
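Rotation itself, which a video-flip element performs on every frame in a pipeline like the one above, is a plain index remapping. On a toy pixel grid (plain Python, illustration only):

```python
# 90-degree clockwise rotation as an index remapping, the per-pixel
# operation a rotation element applies to each video frame.
def rotate_cw(frame):
    """Rotate a 2-D list of pixel values 90 degrees clockwise."""
    return [list(row) for row in zip(*frame[::-1])]

frame = [[1, 2, 3],
         [4, 5, 6]]          # a 3x2 "frame"
print(rotate_cw(frame))      # [[4, 1], [5, 2], [6, 3]]
```

Note that rotating by 90 degrees swaps width and height, so any fixed-size caps downstream of the rotation must describe the rotated geometry.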