rtp

Receiving RTP packets after RTSP setup

微笑、不失礼 · submitted on 2019-11-29 20:21:14
I'm trying to stream RTP packets from an IP camera using Python. I am able to send the DESCRIBE, SETUP, and PLAY commands using the RTSP protocol; however, I am unable to start receiving the actual video stream over RTP. Here is the code:

import socket

def printrec(recst):
    recs = recst.split('\r\n')
    for rec in recs:
        print rec

dest = "DESCRIBE rtsp://admin:12345@192.168.1.74 RTSP/1.0\r\nCSeq: 2\r\nUser-Agent: python\r\nAccept: application/sdp\r\n\r\n"
setu = "SETUP rtsp://admin:12345@192.168.1.74/trackID=1 RTSP/1.0\r\nCSeq: 3\r\nUser-Agent: python\r\nTransport: RTP/AVP;unicast;client_port=60784-60785\r\n\r
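The missing piece after PLAY is usually an open UDP socket: the camera pushes RTP to the client_port pair announced in the SETUP's Transport header, so nothing arrives unless something is already bound there. A minimal Python 3 sketch — port 60784 is taken from the SETUP request above; the header parsing follows the fixed RTP header of RFC 3550 and, for simplicity, ignores CSRC lists and header extensions:

```python
import socket
import struct

def parse_rtp_header(packet):
    """Parse the fixed 12-byte RTP header (RFC 3550, section 5.1)."""
    if len(packet) < 12:
        raise ValueError("packet too short for an RTP header")
    b0, b1, seq, timestamp, ssrc = struct.unpack('!BBHII', packet[:12])
    return {
        'version': b0 >> 6,            # should be 2
        'marker': (b1 >> 7) & 1,
        'payload_type': b1 & 0x7F,
        'sequence': seq,
        'timestamp': timestamp,
        'ssrc': ssrc,
        'payload': packet[12:],        # CSRC/extension handling omitted
    }

def receive_rtp(port=60784, count=10):
    """Bind the client_port from SETUP and read a few RTP packets."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('', port))              # must match client_port in SETUP
    sock.settimeout(5.0)
    try:
        for _ in range(count):
            data, addr = sock.recvfrom(65536)
            hdr = parse_rtp_header(data)
            print(hdr['sequence'], hdr['timestamp'], len(hdr['payload']))
    finally:
        sock.close()
```

RTCP arrives on the odd port of the pair (60785 here), and many cameras also require periodic RTSP keep-alives (e.g. GET_PARAMETER) or they stop sending after the session times out.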

RTP on Android MediaPlayer

廉价感情. · submitted on 2019-11-29 13:06:41
Question: I've implemented RTSP playback on the Android MediaPlayer, using VLC as the RTSP server with this command:

# vlc -vvv /home/marco/Videos/pippo.mp4 --sout #rtp{dst=192.168.100.246,port=6024-6025,sdp=rtsp://192.168.100.243:8080/test.sdp}

and in the Android project:

Uri videoUri = Uri.parse("rtsp://192.168.100.242:8080/test.sdp");
videoView.setVideoURI(videoUri);
videoView.start();

This works fine, but I'd also like to play a live RTP stream, so I copied the SDP file to the sdcard (/mnt/sdcard/test.sdp) and setting

Android 4.1 - RTSP using VideoView and MediaController

蹲街弑〆低调 · submitted on 2019-11-29 12:35:06
Question: I am developing a simple app to play an RTSP stream on Android 4.1, but I am unable to do so. Update: I am able to play BigBuckBunny_115k.mov:

Uri video = Uri.parse("rtsp://184.72.239.149/vod/mp4:BigBuckBunny_115k.mov");

but I tried many of the RTSP streams mentioned here and here, and none of them worked. Problem: I could not see any stream on my phone; only a black screen is visible. After some time, a dialog box appears: "Can't play this video". I tried many RTSP streams, but got the same result,

A summary of the SDP protocol

a 夏天 · submitted on 2019-11-29 08:21:35
1. Overview: The purpose of SDP is to convey media-stream information within a media session, so that recipients of the session description can participate in the session. SDP basically operates over the Internet. It defines a uniform format for session descriptions, but it does not define the allocation of multicast addresses or the transport of SDP messages, nor does it support negotiation of media encodings; these functions are all handled by the underlying transport protocols. Typical session transport protocols include SAP (Session Announcement Protocol), SIP (Session Initiation Protocol), RTSP, HTTP, and e-mail using MIME.

SDP covers the following:
(1) the name and purpose of the session
(2) the lifetime of the session
(3) the media making up the session, including:
media type (video, audio, etc.)
transport protocol (RTP/UDP/IP, H.320, etc.)
media format (H.261 video, MPEG video, etc.)
multicast or remote (unicast) address and port
(4) the information needed to receive the media (addresses, ports, formats, and so on)
(5) the bandwidth to be used
(6) contact information

2. SDP format: An SDP session description consists of multiple lines of the form <type>=<value>, where <type> is a single character and <value> is a string whose format depends on <type>. The whole protocol is case-sensitive, and no whitespace is allowed on either side of the "=". An SDP session description consists of a session-level description (session_level
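The <type>=<value> lines above can be illustrated with a minimal session description (all values are made up for illustration):

```
v=0
o=alice 2890844526 2890844526 IN IP4 10.0.0.1
s=Seminar
c=IN IP4 224.2.17.12/127
t=2873397496 2873404696
m=audio 49170 RTP/AVP 0
a=rtpmap:0 PCMU/8000
```

Here v= gives the protocol version, o= the origin, s= the session name, c= the connection data (a multicast address with TTL 127), t= the active time, m= the media line (type, port, transport, format list), and a= an attribute mapping payload type 0 to G.711 µ-law audio.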

Stream RTP to FFMPEG using SDP

♀尐吖头ヾ · submitted on 2019-11-29 07:58:30
I get an RTP stream from a WebRTC server (I used mediasoup) using node.js, and I get the decrypted raw RTP packets from the stream. I want to forward this RTP data to ffmpeg, and from there save it to a file or push it as an RTMP stream to other media servers. I guess the best way would be to create an SDP file that describes both the audio and video streams and send the packets through new sockets. The ffmpeg command is:

ffmpeg -loglevel debug -protocol_whitelist file,crypto,udp,rtp -re -vcodec libvpx -acodec opus -i test.sdp -vcodec libx264 -acodec aac -y output.mp4

I tried to send the
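A test.sdp for that command needs one m= line per stream. A minimal sketch, assuming VP8 and Opus payloads forwarded to ports 5002 and 5004 on localhost — the ports and payload type numbers are illustrative and must match the sockets the RTP packets are actually sent to:

```
v=0
o=- 0 0 IN IP4 127.0.0.1
s=WebRTC forward
c=IN IP4 127.0.0.1
t=0 0
m=audio 5004 RTP/AVP 111
a=rtpmap:111 opus/48000/2
m=video 5002 RTP/AVP 96
a=rtpmap:96 VP8/90000
```

With -protocol_whitelist allowing udp and rtp, ffmpeg reads the SDP as input and listens on those UDP ports for the forwarded packets.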

An overview of the iOS live-streaming app development workflow

纵然是瞬间 · submitted on 2019-11-29 03:42:21
1. The general audio/video processing pipeline: capture → encode → transmit (streaming media server) → decode → play/display.

1. Capture: the camera and microphone collect video and audio; at this stage the data is raw.
Related technologies or protocols: cameras (CCD, CMOS); microphones (electret capsule, audio amplifier circuit).

2. Encoding: hardware or software encodes (digitizes) and processes the raw audio/video data (e.g. audio/video mixing, packaging/muxing), producing usable audio/video data.
Related technologies or protocols:
Rate-control modes: CBR, VBR.
Encoding formats — video: H.265, H.264, MPEG-4, etc., with containers such as TS, MKV, AVI, and MP4; audio: G.711µ, AAC, Opus, etc., with containers such as MP3, OGG, and AAC.

3. Transmission: the encoded audio/video data is transmitted. Early systems carried audio/video over coaxial and similar cables; after IP networks matured, transmission over IP became the preferred option.
Related technologies or protocols:
Transport protocols: RTP with RTCP, RTSP, RTMP, HTTP, HLS (HTTP Live Streaming), etc.
Control signalling: SIP with SDP, SNMP, etc.

4. Decoding: hardware or software decodes the received encoded audio/video data into images/sound that can be displayed directly.
Related technologies or protocols: encoders generally ship with matching decoders; third-party decoder plugins also exist.

5. Playback: the corresponding images or sound are presented on a display (TV, monitor, etc.) or speaker (headphones, loudspeaker, etc.).
Related technologies or protocols:

How to fragment H264 Packets in RTP compliant with RFC3984

橙三吉。 · submitted on 2019-11-29 02:52:28
Question: I have FFMPEG streaming baseline H.264 video, which I have to encapsulate in RTP and send to SIP phones for decoding. I am using Linphone with the H.264 plugin on Windows, and Mirial, to test the decoding. However, I sometimes get a huge frame (3 KB ~ 9 KB) from FFMPEG, which obviously doesn't fit in the MTU. If I send these frames as-is and trust the IP fragmentation feature, some phones are able to play the stream well enough, but others choke and can't decode it. I
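RFC 3984 handles exactly this case with FU-A fragmentation units (NAL unit type 28): the oversized NAL unit is split across several RTP payloads, each carrying a FU indicator plus a FU header with start/end bits. A sketch of the sender side in Python — the 1400-byte budget is an assumption (roughly an Ethernet MTU minus IP/UDP/RTP overhead), not something fixed by the RFC:

```python
FU_A = 28  # NAL unit type reserved for FU-A fragmentation units

def fragment_nal(nal, max_payload=1400):
    """Yield RTP payloads for one H.264 NAL unit (without start code).

    Small NAL units are sent as-is (single NAL unit packet); large ones
    are split into FU-A fragments per RFC 3984, section 5.8.
    """
    if len(nal) <= max_payload:
        yield nal
        return
    header = nal[0]
    fu_indicator = (header & 0xE0) | FU_A   # keep F and NRI bits, type = 28
    nal_type = header & 0x1F                # carried in every FU header
    data = nal[1:]                          # original header byte is dropped;
                                            # the receiver reconstructs it
    step = max_payload - 2                  # room for FU indicator + FU header
    for i in range(0, len(data), step):
        chunk = data[i:i + step]
        start = 0x80 if i == 0 else 0       # S bit on the first fragment
        end = 0x40 if i + step >= len(data) else 0  # E bit on the last
        fu_header = start | end | nal_type
        yield bytes([fu_indicator, fu_header]) + chunk
```

Each yielded payload goes into its own RTP packet with the same timestamp; the RTP marker bit is set only on the packet carrying the last fragment of the frame.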

Live555 on Android

南笙酒味 · submitted on 2019-11-29 02:30:16
I'm trying to get an RTSP video stream to play in my Android app using the built-in VideoView/MediaPlayer, but there are always various problems on different ROMs or under different network conditions (UDP packets blocked). It's really annoying, so I want to implement my own RTSP client with the live555 source, GLES, and ffmpeg. I can figure out how to use ffmpeg and GLES to show a video, but I'm not familiar with live555. Is there any compiled version of live555 for Android, or how could I build it myself? Thanks.

I think I found sample code on github; it works for me.

Bad news: I think you won't find

Sending WebRTC MediaStream over Websocket (RTP over HTTP/Websocket)

风格不统一 · submitted on 2019-11-29 02:14:47
WebRTC is, among other things, meant for real-time browser-to-browser media communication, but in my case it will be used for browser-to-server audio communication. From the information I've gathered, the MediaStream is transferred using RTP over UDP. This requires at least two additional ports apart from the one used for signalling, something I would like to avoid. Within WebRTC, is there any possibility of using RTP over WebSocket instead of RTP over UDP, so that I only need port 80 or 443?

No, that will not be possible using WebRTC. WebRTC was built to give browsers three main

Why Does RTP use UDP instead of TCP?

蹲街弑〆低调 · submitted on 2019-11-28 22:16:36
Question: I wanted to know why RTP uses UDP rather than TCP. Major VoIP tools use only UDP, as I found when digging into some open-source VoIP software.

Answer 1: As DJ pointed out, TCP is about providing a reliable data stream, and will slow down transmission and retransmit corrupted packets in order to achieve that. UDP does not care about the reliability of the communication, and will not slow down or retransmit data. If your application needs a reliable data stream, for example to retrieve a file from a webserver, you
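The trade-off can be made concrete: an RTP receiver tracks the 16-bit sequence numbers, counts (or conceals) any gaps, and keeps playing, whereas a TCP-style transport would stall the whole stream until the missing data had been retransmitted. A minimal illustration, not tied to any particular library:

```python
def count_lost(sequences):
    """Count RTP packets missing from a stream of 16-bit sequence numbers.

    A real-time receiver just records the gap and plays on; it never asks
    the sender to resend, because the data would arrive too late to play.
    """
    lost = 0
    expected = None
    for seq in sequences:
        if expected is not None and seq != expected:
            # Gap: packets were dropped in flight; concealed, not re-requested.
            lost += (seq - expected) % 65536
        expected = (seq + 1) % 65536   # sequence numbers wrap at 2**16
    return lost
```

Note the modular arithmetic: RTP sequence numbers wrap around at 65536, so a jump from 65535 to 0 is normal continuation, not a loss.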