rtp

How can I decode an RTP packet of one specific payload type as an RTP packet with another payload type?

最后都变了- submitted on 2019-12-24 23:23:10
Question: I'm receiving RTP packets with a JPEG payload in VLC. When I manually set the type to 26 (JPEG), VLC doesn't try to open the stream; if I define it as 96, VLC opens it and displays it wrong, which is due to malformed encoding. To find out the correct encoding, i.e. to find out which values of the packet headers are correct, I want to compare the RTP packets with a working example. Surprisingly, the example uses payload type 96 instead of 26. I use Wireshark to observe the headers; it works fine with
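As a known-good reference to compare against, a GStreamer receive pipeline can decode an RTP/JPEG stream directly once the payload type is stated in the caps. A minimal sketch, assuming the stream arrives on UDP port 5000 (the port is a placeholder, not taken from the question):

    gst-launch-1.0 udpsrc port=5000 \
        caps="application/x-rtp,media=video,encoding-name=JPEG,clock-rate=90000,payload=26" ! \
        rtpjpegdepay ! jpegdec ! autovideosink

If this plays, the packets themselves are well-formed and the problem is only how the player is told about the payload type; if it does not, Wireshark's RTP analysis of both streams should show where the headers differ.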

OpenCV + GStreamer from an app, getting initial 30s delay

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-24 18:56:42
Question: So my application is exposing an RTP stream using new VideoWriter(pipeline-definition); The pipeline definition is: appsrc is-live=1 do-timestamp=1 format=3 stream-type=0 min-latency=0 max-latency=500000000 ! queue leaky=2 max-size-time=500000000 ! videoconvert ! video/x-raw ! x264enc ! h264parse ! rtph264pay config-interval=10 pt=96 ! udpsink host=127.0.0.1 port=9000 The problem I'm faced with is a 30s delay in the stream when viewing it in VLC. No matter what I do, VLC is always 29-30s behind
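Two settings that commonly account for multi-second latency in a pipeline like this are x264enc's default look-ahead/B-frame behaviour and VLC's network cache. A hedged sketch of the usual adjustments (the property values are illustrative, not taken from the question):

    ... ! x264enc tune=zerolatency speed-preset=ultrafast key-int-max=30 ! h264parse !
        rtph264pay config-interval=1 pt=96 ! udpsink host=127.0.0.1 port=9000 sync=false

and, on the playback side, lowering VLC's cache, e.g. vlc --network-caching=200 stream.sdp. Neither change alters the stream format; they only trade some compression efficiency and buffering for latency.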

Audio/video technology: live-streaming architecture

时光毁灭记忆、已成空白 submitted on 2019-12-24 18:07:03
Live-streaming knowledge, part 1: the basic architecture. I. Introduction: the basic live-streaming architecture. Below is the overall server architecture diagram; the overall flow shown above should be understandable to any developer. It breaks down into roughly four parts: the push (publisher) SDK is responsible for capturing video and audio, encoding it, and transmitting it to the server side (some cloud service); the server-side SDK is responsible for creating live streams, distributing them to the CDN nodes to speed up stream delivery, and for all kinds of stream management and statistics; the pull (player) SDK is responsible for pulling the stream, decoding and parsing it, and playing it back; the business side handles the related business operations, such as authorizing URLs and querying the list of live streams.
Live-streaming glossary:
1. Pushing: the process of pushing live content up to the server.
2. Pulling: the process of fetching live content already on the server from a given address.
3. RTMP: short for Real Time Messaging Protocol. It is based on TCP and is a protocol family that includes the basic RTMP protocol as well as variants such as RTMPT/RTMPS/RTMPE. RTMP is a network protocol designed for real-time data communication, mainly used for audio, video and data exchange between the Flash/AIR platform and streaming/interactive servers that support RTMP.
4. Encoding: H.264 is a high-performance video coding technology; its biggest advantage is a very high compression ratio, which allows video streams to be delivered over IP networks at relatively low data rates.
5. Bitrate: the number of bits transferred per unit of time; the usual unit is kbps, i.e. kilobits per second.
6. FPS: frame rate (Frame rate
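As a rough worked example of the bitrate definition above: a stream encoded at 1200 kbps transfers 1200 × 60 = 72,000 kilobits per minute, which is 72,000 / 8 = 9,000 kilobytes, i.e. roughly 9 MB of video data per minute.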

How can I mux/demux RTP media from one stream?

最后都变了- submitted on 2019-12-24 11:39:09
Question: Currently, I'm looking for a lib able to stream video from multiple sources through one RTP stream (one connection). Anybody have a suggestion? Actually, I figured out that Opal 3.8 is a VoIP lib with RTP/H264 support. But I don't know whether it can mux/demux RTP media from one stream. If not, can you give me some suggestions? Thanks. Answer 1: There are a few RTP stacks around and which one you use depends on which language you are going to be developing in; pjmedia is a good cross-platform
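For context on what demultiplexing several media within one RTP session involves: every RTP packet carries a payload type and an SSRC in its fixed 12-byte header, and a receiver can route packets to the right decoder by reading those two fields. A minimal sketch of that header read (illustrative only, not part of Opal or pjmedia):

    import java.nio.ByteBuffer;

    // Extracts the two fields needed to demultiplex an RTP packet:
    // the payload type and the synchronization source (SSRC).
    final class RtpDemuxInfo {
        final int payloadType;
        final long ssrc;

        RtpDemuxInfo(byte[] packet) {
            if (packet.length < 12) {
                throw new IllegalArgumentException("shorter than the fixed RTP header");
            }
            ByteBuffer buf = ByteBuffer.wrap(packet);
            buf.get();                              // V/P/X/CC byte, not needed here
            payloadType = buf.get() & 0x7F;         // low 7 bits; marker bit stripped
            buf.getShort();                         // sequence number
            buf.getInt();                           // timestamp
            ssrc = buf.getInt() & 0xFFFFFFFFL;      // SSRC identifies the source
        }
    }

A receiver would then keep a map from SSRC (or payload type) to decoder and dispatch each incoming datagram accordingly; libraries such as pjmedia do this bookkeeping internally.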

RTP video stream: is presence of SDP file mandatory?

人盡茶涼 submitted on 2019-12-24 02:59:04
Question: I have implemented a raw RTP stream. I want to play it using VLC or MPlayer, but it seems that these video players cannot play the stream. For example, MPlayer says: Stream not seekable! Stray packet (seq[6]=1013 seq=987, newseq=-26 found at 12) I don't have any idea what I have to do to make the video readable by these video players. Should I add SDP? Or can these players play a raw RTP stream? Thanks Answer 1: SDP is not required as long as the receiver is aware of the format of the streams. Stream not seekable
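In practice, the simplest way to make the receiver "aware of the format" is a small SDP file that names the codec and port, which VLC can open directly. A minimal sketch, assuming the stream is H.264 arriving on UDP port 5004 (both the codec and the port are assumptions, not from the question):

    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=Raw RTP test
    c=IN IP4 127.0.0.1
    t=0 0
    m=video 5004 RTP/AVP 96
    a=rtpmap:96 H264/90000

Saving this as stream.sdp and opening it with vlc stream.sdp is usually enough for the player to pick the right depacketizer and decoder.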

GStreamer: Add dummy audio track to the received RTP stream

﹥>﹥吖頭↗ submitted on 2019-12-24 00:51:58
Question: I'm initiating an RTP stream from my Raspberry camera using: raspivid -n -vf -fl -t 0 -w 640 -h 480 -b 1200000 -fps 20 -pf baseline -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay pt=96 config-interval=10 ! udpsink host=192.168.2.3 port=5000 On the client side, I'm converting it to HLS and serving it from a web server: gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,payload=96 ! rtph264depay ! mpegtsmux ! hlssink max-files=5 target-duration=5 location=C:/xampp/htdocs/live/segment%%05d
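The usual way to add a dummy audio track on the receiving side is to feed a silent test source into the same muxer through a named pad. A hedged sketch of the client pipeline (one command, wrapped here for readability; it assumes an AAC encoder such as voaacenc is installed, and the HLS options are carried over from the question):

    gst-launch-1.0 mpegtsmux name=mux ! hlssink max-files=5 target-duration=5
        udpsrc port=5000 ! application/x-rtp,payload=96 ! rtph264depay ! h264parse ! mux.
        audiotestsrc wave=silence is-live=true ! audioconvert ! voaacenc ! aacparse ! mux.

The silent AAC track costs almost no bitrate but gives the resulting HLS segments the audio stream that some players expect.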

How to encode an H.264 live stream to RTP packets with Java

可紊 submitted on 2019-12-23 22:21:52
Question: I am developing an application for Android OS, and I need to take the real-time video stream from the camera, which is encoded with the H.264 codec, convert the frame data to RTP packets and send the packets to a server. To start, I may try to implement it on a PC, reading video from a pre-recorded file (MP4 with H.264) on the HDD, to simplify development and debugging. Is there a ready-made solution? Any ideas? Thanks! Answer 1: See Spydroid. It pipes the camera input into the H.264 encoder and turns the output into
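For a sense of what "converting frame data to RTP packets" amounts to at the byte level, here is a minimal sketch of building one RTP packet around a small H.264 NAL unit and sending it over UDP. It is an illustration only (Spydroid and similar libraries handle this, plus FU-A fragmentation for large NAL units, for you); the address, port, SSRC and payload type are placeholders:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.nio.ByteBuffer;

    final class SimpleRtpSender {
        private int sequence = 0;
        private final long ssrc = 0x12345678L;           // arbitrary illustrative SSRC

        byte[] buildPacket(byte[] nalUnit, long timestamp90kHz, boolean lastOfFrame) {
            ByteBuffer buf = ByteBuffer.allocate(12 + nalUnit.length);
            buf.put((byte) 0x80);                             // V=2, no padding/extension/CSRC
            buf.put((byte) ((lastOfFrame ? 0x80 : 0) | 96));  // marker bit + payload type 96
            buf.putShort((short) (sequence++ & 0xFFFF));      // sequence number
            buf.putInt((int) timestamp90kHz);                 // 90 kHz timestamp for video
            buf.putInt((int) ssrc);                           // synchronization source
            buf.put(nalUnit);                                 // NAL unit, without start code
            return buf.array();
        }

        void send(byte[] nalUnit, long ts) throws Exception {
            byte[] pkt = buildPacket(nalUnit, ts, true);
            try (DatagramSocket socket = new DatagramSocket()) {  // one socket per call: sketch only
                socket.send(new DatagramPacket(pkt, pkt.length,
                        InetAddress.getByName("127.0.0.1"), 5004));
            }
        }
    }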

Viewing the audio/video codec types in a PS stream

白昼怎懂夜的黑 submitted on 2019-12-23 10:58:18
Integration platform: Hikvision 8700.
Steps: 1. Capture the RTP packets. 2. Analyze the RTP stream. 3. Locate the program stream map section of the PS stream.
1. Capture the RTP packets. Sometimes we can only capture UDP packets: when the display filter is set to "rtp", the view shows nothing and no packets are captured. In that case, right-click a UDP packet -> Decode As... -> RTP; the RTP packets can then be captured and analyzed.
2. Analyze the RTP stream. Find the PS stream and click Analyze; right-click and choose Go to Packet to jump to the very beginning and find the position shown in the figure.
3. Locate the program stream map of the PS stream. The map stream identifier field, map_stream_id, is an 8-bit field whose value is 0xBC.
Source: oschina. Link: https://my.oschina.net/u/4430469/blog/3146104
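To relate step 3 to the actual bytes: the program stream map begins with the start code prefix 0x000001 followed by map_stream_id 0xBC, so once the RTP payloads are reassembled into a PS byte stream it can be found with a simple scan. A minimal sketch (illustrative only, not tied to the Hikvision platform):

    // Scans a reassembled MPEG program stream buffer for the program stream map,
    // which starts with the prefix 0x00 0x00 0x01 followed by map_stream_id 0xBC.
    final class PsMapFinder {
        static int findProgramStreamMap(byte[] ps) {
            for (int i = 0; i + 3 < ps.length; i++) {
                if (ps[i] == 0x00 && ps[i + 1] == 0x00
                        && ps[i + 2] == 0x01 && (ps[i + 3] & 0xFF) == 0xBC) {
                    return i;   // offset of the start code; the elementary stream map follows
                }
            }
            return -1;          // no program stream map found
        }
    }

The stream_type values listed inside the program stream map then identify the codecs (for example, 0x1B indicates H.264).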

How to convert H.264 UDP packets to playable media stream or file (defragmentation)

允我心安 submitted on 2019-12-23 10:54:08
Question: I am missing something fundamental in translating a UDP stream of an SDP session into a decodable H.264 stream. I am testing with an H.264-capable camera and can play its stream with a player directly. When I try to play the translated stream, it is not recognized by the player (missing header error). However, I have to decode the UDP stream to be able to integrate this into a Java application, for which there are some decoders around. I have seen very good answers to the following questions
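The "missing header" symptom usually means the RTP payloads were written out as-is: a decoder expects Annex B byte-stream format, i.e. each NAL unit prefixed with a 0x00000001 start code, with fragmented NAL units (FU-A) reassembled first. A minimal sketch of that defragmentation step, following RFC 6184 (illustrative only; error handling and aggregation packets omitted):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;

    // Turns RTP payloads carrying H.264 (RFC 6184) back into an Annex B byte stream.
    final class H264Depacketizer {
        private static final byte[] START_CODE = {0, 0, 0, 1};
        private final ByteArrayOutputStream out = new ByteArrayOutputStream();

        void onRtpPayload(byte[] p) throws IOException {
            int nalType = p[0] & 0x1F;
            if (nalType >= 1 && nalType <= 23) {          // single NAL unit packet
                out.write(START_CODE);
                out.write(p);
            } else if (nalType == 28) {                   // FU-A fragment
                boolean start = (p[1] & 0x80) != 0;
                if (start) {
                    out.write(START_CODE);
                    // rebuild the original NAL header from the FU indicator + FU header
                    out.write((p[0] & 0xE0) | (p[1] & 0x1F));
                }
                out.write(p, 2, p.length - 2);            // fragment body
            }
            // STAP-A (type 24) and other aggregation packets are not handled here
        }

        byte[] annexBStream() {
            return out.toByteArray();                     // feed this to the decoder
        }
    }

Note that the decoder also needs the SPS and PPS NAL units (delivered in-band or via the SDP sprop-parameter-sets attribute) before the first picture can be decoded, which is another common cause of a "missing header" error.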

Does Android support APIs for implementing RTP/RTSP for a VoIP and PTT project?

自古美人都是妖i submitted on 2019-12-23 10:39:31
Question: I am going to build a PTT project on Android. Could you tell me how deeply Android supports the voice and multimedia APIs (such as RTP, RTSP, VoIP) for developers? Answer 1: MediaPlayer supports playing rtsp://.. URLs. Audio and video are supported. Check the media format support to see which codecs are supported. MediaPlayer internally and automatically handles RTSP and RTP, so there is not much you need to handle. On the other hand, it does not give any low-level control over this process. About VoIP: Android only consumes
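To illustrate the MediaPlayer route the answer describes, here is a minimal sketch of opening an RTSP URL on Android; the URL is a placeholder, and MediaPlayer negotiates the RTSP/RTP signalling internally:

    import android.media.MediaPlayer;
    import java.io.IOException;

    final class RtspPlayback {
        // MediaPlayer handles RTSP and RTP itself; no manual packet handling is needed.
        static MediaPlayer playRtsp() throws IOException {
            MediaPlayer player = new MediaPlayer();
            player.setDataSource("rtsp://example.com/live/stream1"); // placeholder URL
            player.setOnPreparedListener(MediaPlayer::start);        // start once prepared
            player.prepareAsync();                                   // prepare without blocking
            return player;
        }
    }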