rtp

Read dumped RTP stream in libav

浪子不回头ぞ Submitted on 2019-12-19 04:51:09
Question: Hi, I need a bit of help/guidance because I'm stuck in my research. The problem: how to convert RTP data using either GStreamer or avlib (ffmpeg), via either the API (programmatically) or the console tools. Data: I have an RTP dump that comes from RTP/RTCP over TCP, so I can get the precise start and stop of each RTP packet in the file. It's an H264 video stream dump. The data is in this form because I need to acquire the RTCP/RTP interleaved stream via libcurl (which I'm currently doing)…
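Since the dump preserves the interleaved TCP byte stream, a first step is recovering packet boundaries from the RTSP interleaved framing (RFC 2326, section 10.12: '$', a 1-byte channel id, a 2-byte big-endian length, then the packet). A minimal Python sketch, assuming the dump is a verbatim capture of that byte stream (the file name is hypothetical):

    import struct

    def iter_interleaved_packets(path):
        """Yield (channel, packet_bytes) from a raw dump of an RTSP
        interleaved TCP stream (RFC 2326 framing)."""
        with open(path, "rb") as f:
            while True:
                header = f.read(4)
                if len(header) < 4:
                    break  # end of dump
                magic, channel, length = struct.unpack(">BBH", header)
                if magic != 0x24:  # ord('$'); framing lost
                    raise ValueError("desynchronized at offset %d" % (f.tell() - 4))
                packet = f.read(length)
                if len(packet) < length:
                    break  # truncated capture
                yield channel, packet

    # Even channels conventionally carry RTP, odd channels the matching RTCP.
    for channel, pkt in iter_interleaved_packets("stream.dump"):  # hypothetical name
        if channel % 2 == 0:
            version = pkt[0] >> 6          # RTP version, should be 2
            payload_type = pkt[1] & 0x7F   # e.g. the dynamic PT from the SDP
            seq = struct.unpack(">H", pkt[2:4])[0]
            print(channel, version, payload_type, seq)

Once the packets are separated like this, they can be handed to an H264 depacketizer or replayed toward ffmpeg/GStreamer over a local socket.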

H264 NAL unit prefixes

你离开我真会死。 Submitted on 2019-12-18 12:39:02
Question: I need some clarification on the H264 NAL unit delimiter prefixes (00 00 00 01 and 00 00 01). I am using the Intel Media SDK to generate H264 and pack it into RTP. The issue is that so far I was looking only for 00 00 00 01 as a unit separator, and I was basically only able to find AUD, SPS, PPS, and SEI units in the bitstream. Looking at the memory, I saw that after the SEI there was a byte sequence 00 00 01 25 that could be the start of an IDR unit, but my search algorithm did not detect it because of a…
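Both prefixes are legal Annex B start codes, so a scanner has to accept the 3-byte form as well as the 4-byte one. A sketch of such a scanner in Python (an illustration of the fix, not the poster's code):

    def iter_nal_units(bitstream: bytes):
        """Yield (nal_type, nal_bytes) for every NAL unit, accepting both
        the 4-byte (00 00 00 01) and 3-byte (00 00 01) start codes."""
        starts = []
        i = 0
        while True:
            i = bitstream.find(b"\x00\x00\x01", i)
            if i == -1:
                break
            starts.append(i + 3)  # first byte after the start code
            i += 3
        for n, start in enumerate(starts):
            end = starts[n + 1] - 3 if n + 1 < len(starts) else len(bitstream)
            # a 4-byte start code begins with an extra 0x00 that would
            # otherwise be counted as the tail of the previous NAL unit
            if end > start and bitstream[end - 1] == 0x00:
                end -= 1
            nal = bitstream[start:end]
            yield nal[0] & 0x1F, nal  # low 5 bits of the NAL header = type

    # 0x25 = 0b0_01_00101: forbidden_zero_bit=0, nal_ref_idc=1,
    # nal_unit_type=5, i.e. an IDR slice, as the poster suspected.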

H264 RTP timestamp

孤街浪徒 Submitted on 2019-12-18 12:33:17
Question: I am confused about the timestamp of an H264 RTP packet. I know the clock rate for video is 90 kHz, which I declared in the SIP SDP. The frame rate of my encoder is not exactly 30 FPS; it is variable, varying from 15 FPS to 30 FPS on the fly, so I cannot use a fixed timestamp increment. Could anyone tell me the timestamp of the following encoded packets? After 0 ms, encoded RTP timestamp = 0 (let the starting timestamp be 0). After 50 ms, encoded RTP timestamp = ? After 40 ms…
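The usual approach is to derive each increment from the real capture interval rather than from a nominal frame rate: at a 90 kHz clock, 1 ms is 90 ticks. A sketch of that arithmetic:

    CLOCK_RATE = 90_000  # H264 RTP clock, ticks per second

    def next_timestamp(prev_ts: int, elapsed_ms: float) -> int:
        """Advance the RTP timestamp by the real capture interval,
        so a variable frame rate needs no fixed per-frame increment."""
        return (prev_ts + round(elapsed_ms * CLOCK_RATE / 1000)) & 0xFFFFFFFF

    ts = 0                       # starting timestamp (any value is fine)
    ts = next_timestamp(ts, 50)  # frame captured 50 ms later -> 4500
    ts = next_timestamp(ts, 40)  # next frame 40 ms later     -> 8100

The receiver reconstructs timing purely from these values, so irregular increments are expected and correct for a variable-frame-rate source.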

Live555 on Android

谁说胖子不能爱 Submitted on 2019-12-18 03:43:54
Question: I'm trying to get an RTSP video stream to play in my Android app using the built-in VideoView/MediaPlayer, but there are always various problems on different ROMs or under different network conditions (UDP packets blocked). It's really annoying, so I want to implement my own RTSP client with the live555 source, GLES, and ffmpeg. I can figure out how to use ffmpeg and GLES to show a video, but I'm not familiar with live555. Is there any compiled version of live555 for Android? Or how could I do that myself…

VoIP RTP streaming from/to server (in Java) to/from Android

℡╲_俬逩灬. Submitted on 2019-12-17 14:44:23
Question: My goal is a push-to-talk chat app on GSM/UMTS/LTE networks. Initially I wanted to use multicast addresses and peer-to-peer traffic so as not to overload the server; unfortunately, after deep investigation, I discovered that multicast is not allowed on GSM/UMTS/LTE networks, so I have to use the server to bounce the VoIP packets. I don't like this solution very much because it loads the server, but I haven't found anything better. If you have an alternative solution…
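The relay ("bounce") pattern itself is small. A minimal sketch in Python (the poster's app is Java, but the idea is identical); the port is hypothetical, and registering clients by the source address of their first datagram is a simplification a real server would replace with explicit session and RTCP handling:

    import socket

    RELAY_PORT = 5004  # hypothetical port

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", RELAY_PORT))

    clients = set()
    while True:
        packet, sender = sock.recvfrom(2048)
        clients.add(sender)              # remember whoever talks to us
        for client in clients:
            if client != sender:         # fan out to everyone else
                sock.sendto(packet, client)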

A roundup of what you need to know to develop a live video streaming system

≯℡__Kan透↙ Submitted on 2019-12-17 11:53:21
Live video streaming system development covers a broad range of software: short-video systems, live-streaming source code, one-on-one social apps, voice streaming, and so on all draw on the same knowledge. With the sustained popularity of the live-streaming industry in recent years, the software business around it has grown too, so here is an overview of the knowledge involved in developing a live video streaming system. First, the areas it touches: whether you build a short-video, one-on-one, or one-to-many system, it will include audio and video features, rely on streaming-media transport, and need good client compatibility and high concurrency. In short: audio/video modules (echo cancellation, noise suppression, automatic gain control, packet-loss concealment, forward error correction, jitter handling); streaming transport protocols (RTMP, WebRTC, HLS, HTTP-FLV, RTP/RTCP); client compatibility (Apple's platform is relatively simple, while full compatibility across Android devices is more complex); support for massive concurrent users (this takes experience, but for engineers who have already built large-scale high-concurrency systems it is not a problem); and the target platforms: MAC, WINDOWS, IOS, ANDROID. Installing this kind of live-streaming source code mainly involves the following steps: 1. the source requires server space that supports PHP + MySQL; 2. the source package must be uploaded to the space in full and unpacked; 3. visit http://<your-domain>/install and follow the prompts to complete the installation; 4. after installation…

Streaming via RTSP or RTP in HTML5

*爱你&永不变心* Submitted on 2019-12-16 20:04:44
Question: I'm building a web app that should play back an RTSP/RTP stream from a server, http://lscube.org/projects/feng . Does the HTML5 video/audio tag support RTSP or RTP? If not, what would the easiest solution be? Perhaps dropping down to a VLC plugin or something like that. Answer 1: Technically "yes" (but not really...). HTML5's <video> tag is protocol-agnostic: it does not care. You place the protocol in the src attribute as part of the URL, e.g.: <video src="rtp://myserver.com/path/to/stream">. Your…

Playing an RTP stream on Android published with GStreamer

谁说胖子不能爱 Submitted on 2019-12-14 03:44:35
Question: I'm trying to set up an RTP connection between a microphone on a desktop PC and an Android smartphone. I grab the data using GStreamer. Because other applications use this microphone at the same time on the same system, there is a tcpserversink to which the data is published. This is done with this call: gst-launch-0.10 -v alsasrc ! 'audio/x-raw-int, depth=16, width=16, endianness=1234, channels=1, rate=16000' ! tcpserversink host=localhost port=20000 . Then I create a second stream, which…
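The second stream presumably pulls from that TCP socket and re-packetizes the audio as RTP. A sketch of what that leg could look like, using the GStreamer 1.0 Python bindings (the question uses 0.10, where the raw-audio caps are named audio/x-raw-int instead of audio/x-raw; the phone's address is hypothetical):

    import gi
    gi.require_version("Gst", "1.0")
    from gi.repository import Gst

    Gst.init(None)

    # Read the raw 16-bit little-endian 16 kHz mono samples published by
    # the tcpserversink above, then payload them as RTP L16 toward the
    # phone. audioconvert handles the byte-order swap rtpL16pay needs.
    pipeline = Gst.parse_launch(
        "tcpclientsrc host=localhost port=20000 ! "
        "rawaudioparse format=pcm pcm-format=s16le sample-rate=16000 num-channels=1 ! "
        "audioconvert ! rtpL16pay ! "
        "udpsink host=192.168.0.2 port=5005"  # hypothetical phone address
    )
    pipeline.set_state(Gst.State.PLAYING)
    bus = pipeline.get_bus()
    bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                           Gst.MessageType.ERROR | Gst.MessageType.EOS)
    pipeline.set_state(Gst.State.NULL)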

RTP AAC Packet Depacketizer

℡╲_俬逩灬. Submitted on 2019-12-14 02:28:14
Question: I asked earlier about H264, in RTP H.264 Packet Depacketizer. My question now is about the audio packets. I noticed via the RTP packets that audio frames like AAC, G.711, G.726 and others all have the marker bit set. I think the frames are independent; am I right? My question is: audio frames are small, but I know that I can have more than one frame per RTP packet. Independent of how many frames I have, are they complete? Or may they be fragmented across RTP packets? Answer 1: The difference between audio and…
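For AAC over RTP (the mpeg4-generic payload of RFC 3640), each packet starts with an AU-headers section giving the size of every access unit it carries, so several complete frames per packet is the normal case, and only an AU larger than the path MTU gets fragmented (the marker bit then marks the final fragment). A sketch of the per-packet split, assuming the common AAC-hbr fmtp parameters (sizeLength=13, indexLength=3):

    import struct

    def split_aac_frames(rtp_payload: bytes):
        """Split an RFC 3640 (mpeg4-generic, AAC-hbr) RTP payload into
        AAC access units, assuming 13-bit sizes and 3-bit indices."""
        au_headers_bits = struct.unpack(">H", rtp_payload[:2])[0]
        n_headers = au_headers_bits // 16   # each AU-header is 13 + 3 = 16 bits
        sizes = []
        for i in range(n_headers):
            header = struct.unpack(">H", rtp_payload[2 + 2 * i: 4 + 2 * i])[0]
            sizes.append(header >> 3)       # top 13 bits = AU-size
        frames, offset = [], 2 + 2 * n_headers
        for size in sizes:
            frames.append(rtp_payload[offset: offset + size])
            offset += size
        return frames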

GStreamer RTP packet size

时光怂恿深爱的人放手 Submitted on 2019-12-13 19:31:52
Question: I'm running the following GStreamer command: gst-launch-1.0 -v filesrc location=audiofile.mp3 ! mad ! audioconvert ! rtpL16pay mtu=1024 ! udpsink port=5005 host=127.0.0.1 . This sets up an RTP stream with a maximum packet size of 1024 bytes (the Maximum Transmission Unit). When I run this stream, I end up getting a sequence of four packets of size 1024 followed by one packet of size 572, and this sequence repeats for the duration of the file. Why is this happening, and is there any way to ensure a…
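A plausible explanation of the 4-plus-1 pattern: mad decodes MPEG-1 Layer III in frames of 1152 samples, and the payloader splits each decoded buffer on its own rather than accumulating samples across buffers, so every buffer ends with one short packet. Assuming 44.1 kHz 16-bit stereo (4 bytes per sample frame), the numbers match exactly:

    # one decoded MP3 frame per packetizing pass (assumed 44.1 kHz stereo)
    samples_per_mp3_frame = 1152
    bytes_per_sample_frame = 2 * 2            # 16-bit * 2 channels
    buffer_size = samples_per_mp3_frame * bytes_per_sample_frame  # 4608

    rtp_header = 12
    payload_per_packet = 1024 - rtp_header    # 1012 bytes, an exact multiple
                                              # of the 4-byte sample frame

    full_packets, remainder = divmod(buffer_size, payload_per_packet)
    print(full_packets, remainder + rtp_header)   # -> 4 572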