rtp

Wowza: Need to stream rtp-live to iphone

Submitted by 吃可爱长大的小学妹 on 2020-01-04 07:15:43
Question: I need to stream rtp-live to iPhone. My channels are configured in SDP files. When I request a channel from the Flash player, I have a plug-in that starts streaming data to Wowza once it detects the request, so I don't need to publish the streams. When I try to watch from the iPhone, I get an error since the stream is not published. If I watch a channel from the Flash player and then try to watch from the iPhone, it works because the stream is automatically published. If I stop watching from Flash
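One common fix is a small server-side module that force-publishes the SDP-based streams when the application starts, so HLS (iPhone) clients do not depend on a Flash viewer having triggered publishing first. A minimal sketch, assuming Wowza's Java module API (ModuleBase.onAppStart and IApplicationInstance.startMediaCasterStream, which should be verified against your Wowza version); the stream name mystream.sdp is a placeholder for one of the configured SDP files:

    import com.wowza.wms.application.IApplicationInstance;
    import com.wowza.wms.module.ModuleBase;

    // Hypothetical module: publish the RTP/SDP stream at application start
    // instead of waiting for a Flash viewer to trigger it.
    public class AutoPublishRtpStreams extends ModuleBase {
        public void onAppStart(IApplicationInstance appInstance) {
            // "mystream.sdp" is a placeholder for an SDP file in the content directory.
            appInstance.startMediaCasterStream("mystream.sdp", "rtp");
        }
    }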

Cannot play RTSP video in VideoView in Samsung Galaxy S2

Submitted by ╄→尐↘猪︶ㄣ on 2020-01-03 12:05:46
Question: I'm trying to play a live RTSP video (from rtsp://media2.tripsmarter.com/LiveTV/BTV/ ) using VideoView, and here's my code: public class ViewTheVideo extends Activity { VideoView vv; @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.main); vv = (VideoView) this.findViewById(R.id.VideoView); Uri videoUri = Uri.parse("rtsp://media2.tripsmarter.com/LiveTV/BTV/"); vv.setMediaController(new MediaController(this)); vv
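For reference, a self-contained version of the truncated snippet above, with an error listener added so codec or profile failures surface instead of failing silently; the layout IDs and the URL are taken from the question:

    import android.app.Activity;
    import android.media.MediaPlayer;
    import android.net.Uri;
    import android.os.Bundle;
    import android.widget.MediaController;
    import android.widget.VideoView;

    public class ViewTheVideo extends Activity {
        private VideoView vv;

        @Override
        public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.main);

            vv = (VideoView) findViewById(R.id.VideoView);
            vv.setMediaController(new MediaController(this));
            vv.setVideoURI(Uri.parse("rtsp://media2.tripsmarter.com/LiveTV/BTV/"));

            // Surface errors instead of failing silently; many devices only
            // support a narrow set of RTSP codecs/profiles in VideoView.
            vv.setOnErrorListener(new MediaPlayer.OnErrorListener() {
                @Override
                public boolean onError(MediaPlayer mp, int what, int extra) {
                    // what/extra carry the framework error codes, e.g. MEDIA_ERROR_UNKNOWN
                    return false; // let the default error dialog show
                }
            });
            vv.requestFocus();
            vv.start();
        }
    }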

Android: mpeg4/H.264 packetization example

Submitted by 让人想犯罪 __ on 2020-01-02 09:59:36
Question: I need to split an MPEG-4 video stream (actually from the Android video camera) to send it through RTP. The specification is a little large for quick reference. I wonder if there is any example/open-source code for MPEG-4 packetization? Thanks for any help! Answer 1: The MPEG-4 file format is also called ISO/IEC 14496-14. Google it and you will find the specifications. However, what you are trying to do (an RTP publisher) will be hard for the following reasons: MPEG-4 has its header at the end of the file. Which means header
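A minimal, library-free sketch of the packetization step: it builds the 12-byte RTP header from RFC 3550 and fragments one encoded frame at an assumed MTU. The payload type (96) and the SSRC are placeholders that would normally be described in your SDP:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;

    // Illustrative RTP packetizer: splits one encoded frame into MTU-sized
    // payloads and prepends a fixed 12-byte RTP header (RFC 3550).
    public class SimpleRtpPacketizer {
        private static final int MTU_PAYLOAD = 1400; // assumed payload size per packet
        private final DatagramSocket socket;
        private final InetAddress dest;
        private final int port;
        private final int ssrc = 0x12345678;  // placeholder SSRC
        private final int payloadType = 96;   // dynamic PT, defined in your SDP
        private int seq = 0;

        public SimpleRtpPacketizer(DatagramSocket socket, InetAddress dest, int port) {
            this.socket = socket;
            this.dest = dest;
            this.port = port;
        }

        /** Sends one encoded frame; timestamp is in the RTP clock (e.g. 90 kHz for video). */
        public void sendFrame(byte[] frame, long timestamp) throws Exception {
            for (int offset = 0; offset < frame.length; offset += MTU_PAYLOAD) {
                int len = Math.min(MTU_PAYLOAD, frame.length - offset);
                boolean last = offset + len >= frame.length;
                byte[] pkt = new byte[12 + len];
                pkt[0] = (byte) 0x80;                                   // V=2, P=0, X=0, CC=0
                pkt[1] = (byte) ((last ? 0x80 : 0x00) | payloadType);   // marker bit on last fragment
                pkt[2] = (byte) (seq >> 8);
                pkt[3] = (byte) seq;
                pkt[4] = (byte) (timestamp >> 24);
                pkt[5] = (byte) (timestamp >> 16);
                pkt[6] = (byte) (timestamp >> 8);
                pkt[7] = (byte) timestamp;
                pkt[8] = (byte) (ssrc >> 24);
                pkt[9] = (byte) (ssrc >> 16);
                pkt[10] = (byte) (ssrc >> 8);
                pkt[11] = (byte) ssrc;
                System.arraycopy(frame, offset, pkt, 12, len);
                socket.send(new DatagramPacket(pkt, pkt.length, dest, port));
                seq = (seq + 1) & 0xFFFF;
            }
        }
    }

Note that this only illustrates the header and fragmentation mechanics. For H.264, RFC 6184 requires NAL-unit-aware payloads (single NAL unit, STAP-A, or FU-A fragments), so a real packetizer must split on NAL boundaries rather than blindly at the MTU; for MPEG-4 elementary streams the relevant payload format is RFC 3016.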

MPEG4 extract from RTP payload

Submitted by 穿精又带淫゛_ on 2020-01-02 09:19:37
Question: I'm trying to extract MPEG-4 from an RTP payload. The format of the RTSP media (video) is MP4V-ES, but I'm not able to extract the MP4 from the payload. When I dump the extract into a raw file and use ffmpeg to convert it into .avi or .mpg, it doesn't work. I don't know what I'm missing here. The code is written in Java. I want to extract each video frame from the RTP and save it to a file or retransmit it. Thanks. Question UPDATED..... Thanks for the inputs; actually I'm able to extract bytes
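A sketch of the receive side, assuming plain MP4V-ES payloads (RFC 3016): strip the RTP fixed header (plus any CSRCs and header extension) and append the payload bytes, in sequence-number order, to the elementary stream. The resulting dump is a raw MPEG-4 video elementary stream (.m4v), not an .mp4/.avi container, which is one reason a straight conversion attempt can fail:

    import java.io.OutputStream;

    // Illustrative RTP de-packetizer for MP4V-ES: the payload after the RTP
    // header is raw MPEG-4 elementary-stream bytes; concatenating them in
    // sequence-number order yields a raw .m4v stream.
    public class RtpMp4vDepacketizer {
        private final OutputStream elementaryStream;

        public RtpMp4vDepacketizer(OutputStream elementaryStream) {
            this.elementaryStream = elementaryStream;
        }

        /** Processes one UDP datagram containing an RTP packet. */
        public void onRtpPacket(byte[] pkt, int length) throws Exception {
            if (length < 12) return;                       // shorter than the fixed header
            int version = (pkt[0] >> 6) & 0x03;
            if (version != 2) return;                      // not RTP
            int csrcCount = pkt[0] & 0x0F;
            boolean hasExtension = (pkt[0] & 0x10) != 0;
            int headerLen = 12 + 4 * csrcCount;
            if (hasExtension) {
                int extWords = ((pkt[headerLen + 2] & 0xFF) << 8) | (pkt[headerLen + 3] & 0xFF);
                headerLen += 4 + 4 * extWords;             // extension header + extension payload
            }
            // Padding (P bit) and reordering by sequence number are ignored in this sketch.
            elementaryStream.write(pkt, headerLen, length - headerLen);
        }
    }

A container can then be added with something like ffmpeg -f m4v -i dump.m4v -c copy out.mp4, where -f m4v tells ffmpeg that the input is a raw MPEG-4 video elementary stream.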

Breakdown of the development workflow for an iOS live-streaming app:

Submitted by 北战南征 on 2019-12-31 00:39:05
1. The general audio/video processing flow: data capture → data encoding → data transmission (streaming media server) → data decoding → playback/display

1. Data capture: cameras and microphones collect the video and audio; at this point the data is raw.
   Related technologies/protocols: cameras: CCD, CMOS; microphones: acoustic-to-electric transducers (mic capsules), audio amplifier circuits.

2. Data encoding: hardware or software encodes (digitizes) the raw audio/video and processes it further (e.g. audio/video mixing, packaging/muxing) to produce usable audio/video data (a concrete encoder-configuration sketch follows after this list).
   Related technologies/protocols: rate control: CBR, VBR. Codecs: video: H.265, H.264, MPEG-4, etc., with container formats such as TS, MKV, AVI, MP4; audio: G.711μ, AAC, Opus, etc., with containers such as MP3, OGG, AAC.

3. Data transmission: the encoded audio/video data is delivered to viewers. Early systems transmitted audio/video over cables such as coaxial; once IP networks developed, transmission moved onto IP networks.
   Related technologies/protocols: transport: RTP and RTCP, RTSP, RTMP, HTTP, HLS (HTTP Live Streaming), etc.; control signaling: SIP and SDP, SNMP, etc.

4. Data decoding: hardware or software decodes the received, encoded audio/video into images/sound that can be displayed directly.
   Related technologies/protocols: encoders generally come with matching decoders; third-party decoder plug-ins are also available.

5. Playback/display: the corresponding picture or sound is presented on a display (TV, monitor, etc.) or through a speaker (headphones, loudspeaker, etc.).
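As a concrete illustration of step 2 (data encoding), a minimal hardware-encoder configuration sketch. It uses Android's MediaCodec (Java, to match the other snippets on this page) rather than an iOS API such as VideoToolbox, and the resolution, bitrate, and keyframe interval are placeholder values:

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;

    // Configure a hardware H.264 encoder: raw frames in, compressed AVC out.
    public class H264EncoderSetup {
        public static MediaCodec createEncoder() throws Exception {
            int width = 1280, height = 720;                           // placeholder resolution
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
            format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);   // ~2 Mbit/s target
            format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
            format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);   // keyframe every 2 s
            format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                    MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);

            MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
            encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
            // encoder.createInputSurface() would then be fed by the camera; the
            // compressed output buffers are what gets packetized for RTP/RTMP (step 3).
            return encoder;
        }
    }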

H264 profile-iop explained

Submitted by 跟風遠走 on 2019-12-30 19:58:33
Question: How do I identify the H.264 profile and level from the profile-level-id in SDP? How does one identify what the constraints actually mean? For example, I have a profile-level-id of 42801e that translates to: How am I to relate that to the features defined in the table here? The above reference identified that Constraint_set0_flag: 1 means that it's the Constrained Baseline Profile. But how do I relate the flag to the three different NO's (from the table) that differentiate the Baseline profile from the
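A small worked example, assuming the standard layout of profile-level-id (RFC 6184): two hex digits of profile_idc, two of profile-iop (the six constraint_setX flags plus two reserved bits), two of level_idc. So 42801e gives profile_idc 0x42 = 66 (Baseline), profile-iop 0x80 (constraint_set0_flag = 1, all others 0), and level_idc 0x1e = 30 (Level 3.0):

    // Decode an H.264 profile-level-id such as "42801e".
    public class ProfileLevelId {
        public static void main(String[] args) {
            String id = "42801e";
            int profileIdc = Integer.parseInt(id.substring(0, 2), 16); // 0x42 = 66 -> Baseline
            int profileIop = Integer.parseInt(id.substring(2, 4), 16); // 0x80
            int levelIdc   = Integer.parseInt(id.substring(4, 6), 16); // 0x1e = 30 -> Level 3.0

            // constraint_set0_flag is the most significant bit of profile-iop,
            // constraint_set1_flag the next one, and so on.
            for (int i = 0; i <= 5; i++) {
                int flag = (profileIop >> (7 - i)) & 1;
                System.out.println("constraint_set" + i + "_flag = " + flag);
            }
            System.out.println("profile_idc = " + profileIdc + ", level_idc = " + levelIdc);
        }
    }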

Reading RTCP packets from an IP camera using FFMPEG

Submitted by 眉间皱痕 on 2019-12-30 05:28:10
Question: I am using the ffmpeg C library. I need to intercept RTCP packets from the camera in order to get the timestamp from the Sender Report. Is there any method or structure in ffmpeg that gives me this information? I am completely stuck and have not been able to solve the problem. Any help will be appreciated. Thanks in advance. Answer 1: Finally I had to hack into the ffmpeg library like this: // Patch for retrieving inner ffmpeg private data RTSPState* rtsp_state = (RTSPState*) context->priv_data;
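If reaching into ffmpeg's private RTSPState is too fragile, another option (sketched here in Java, not the ffmpeg API) is to read the RTCP traffic yourself and parse the Sender Report directly; the field offsets follow RFC 3550 (packet type 200 = SR, NTP timestamp at bytes 8-15, RTP timestamp at bytes 16-19). The port number is a placeholder:

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;

    // Illustrative RTCP Sender Report reader (RFC 3550, section 6.4.1).
    public class RtcpSenderReportReader {
        public static void main(String[] args) throws Exception {
            int rtcpPort = 5005; // placeholder: the RTCP port negotiated for the camera
            try (DatagramSocket socket = new DatagramSocket(rtcpPort)) {
                byte[] buf = new byte[1500];
                while (true) {
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    socket.receive(packet);
                    if (packet.getLength() < 28) continue;   // SR fixed part is 28 bytes
                    int packetType = buf[1] & 0xFF;
                    if (packetType != 200) continue;         // 200 = Sender Report (SR)

                    long ntpSeconds   = readUint32(buf, 8);  // NTP timestamp, integer part
                    long ntpFraction  = readUint32(buf, 12); // NTP timestamp, fractional part
                    long rtpTimestamp = readUint32(buf, 16);

                    // NTP epoch is 1900-01-01; subtract 2208988800 s to get Unix time.
                    double wallClock = (ntpSeconds - 2208988800L) + ntpFraction / 4294967296.0;
                    System.out.println("SR wall clock = " + wallClock + " s, RTP ts = " + rtpTimestamp);
                }
            }
        }

        private static long readUint32(byte[] b, int off) {
            return ((long) (b[off] & 0xFF) << 24) | ((b[off + 1] & 0xFF) << 16)
                 | ((b[off + 2] & 0xFF) << 8) | (b[off + 3] & 0xFF);
        }
    }

In practice ffmpeg already owns the RTP/RTCP sockets for the RTSP session, so this only works if you receive or mirror the UDP traffic yourself; it is meant to show the SR layout rather than replace the priv_data hack in the answer.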

ffmpeg: RTP to RTMP

Submitted by 偶尔善良 on 2019-12-29 17:42:02
Publishing (push)

// Push a local MP4 file over RTP with no audio stream (-an drops audio);
// the SDP that ffmpeg prints to stdout is redirected into ffmpeg.sdp.
ffmpeg -re -i cece_1.mp4 -an -c copy -f rtp rtp://10.0.4.134:11111 > ffmpeg.sdp

// With both audio and video streams: video to one port, audio to another.
ffmpeg -re -i cece_1.mp4 -vcodec copy -an -f rtp rtp://10.0.4.134:11111 -vn -acodec copy -f rtp rtp://10.0.4.134:11122 > ffmpeg.sdp

Playback

ffplay -i ffmpeg.sdp -protocol_whitelist file,udp,rtp

Pulling (re-publish)

// Re-publish the RTP streams described by ffmpeg.sdp as RTMP.
ffmpeg -protocol_whitelist file,udp,rtp -i ffmpeg.sdp -vcodec copy -acodec copy -f flv rtmp://pili-publish.xxwolo.com/cece/1111111MjJBNlhZZTBpS?key=cb461460-48ce-46fa-a01f-82f74c395ffd

Creating RTP Packets from Android Camera to Send

Submitted by 早过忘川 on 2019-12-28 03:34:25
Question: I'm new to Android and socket programming. I want to create an Android application that transfers live video from the device camera to a PC. The first thing I do is get the raw video data from the PreviewCallback arguments and convert it into an RTP packet. I was just using JLibRTP to do this. Regarding transferring the packet, I think there are some related classes: RtpPkt, RtpSession, and RtpSocket. Here is my code at a glance: DatagramSocket rtpSocket = new DatagramSocket(); DatagramSocket rtcpSocket = new
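A sketch of how those pieces are usually wired together. The RTPSession constructor, addParticipant, payloadType, and sendData calls are assumed from JLibRTP 0.2.x and should be checked against the library version you actually ship (older releases use different class names such as RtpSession/RtpSocket); the ports and address are placeholders:

    import java.net.DatagramSocket;
    import jlibrtp.Participant;
    import jlibrtp.RTPSession;

    // Assumed JLibRTP wiring: one socket for RTP, one for RTCP, one remote PC.
    public class CameraRtpSender {
        private final RTPSession session;

        public CameraRtpSender(String pcAddress) throws Exception {
            DatagramSocket rtpSocket = new DatagramSocket(5004);
            DatagramSocket rtcpSocket = new DatagramSocket(5005);
            session = new RTPSession(rtpSocket, rtcpSocket);
            session.addParticipant(new Participant(pcAddress, 5004, 5005));
            session.payloadType(96); // dynamic payload type, described in your SDP
            // Depending on the JLibRTP version, RTPSessionRegister(...) must be
            // called to start the session threads before sending data.
        }

        /** Called with an already-encoded frame; JLibRTP adds the RTP header. */
        public void onEncodedFrame(byte[] frame) {
            session.sendData(frame);
        }
    }

Note that PreviewCallback delivers raw YUV frames; they normally need to go through an encoder before packetization, as discussed in the MPEG-4/H.264 packetization question above.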

Convert video Input Stream to RTMP

Submitted by 喜欢而已 on 2019-12-28 03:18:09
Question: I want to stream video recorded on my Android phone to a network media server. The first problem is that when setting the MediaRecorder output to a socket, the stream is missing some mdat size headers. This can be fixed by preprocessing the stream locally and adding the missing data in order to produce a valid output stream. The question is how to proceed from there: how can I go about outputting that stream as an RTMP stream? Answer 1: First, let's unwind your question. As you've surmised, RTMP isn't
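A sketch of the "record into a socket" setup the question describes, using ParcelFileDescriptor.fromSocket; the host and port are placeholders, and the caveat from the question still applies: the MP4 that MediaRecorder writes to a non-seekable descriptor has unresolved moov/mdat sizes, so a repackaging step (for example extracting the H.264 NAL units and re-muxing to FLV for RTMP) is still needed downstream:

    import java.net.Socket;
    import android.media.MediaRecorder;
    import android.os.ParcelFileDescriptor;

    // Illustrative: point MediaRecorder at a TCP socket instead of a file.
    public class SocketRecorder {
        public MediaRecorder startStreaming(String host, int port) throws Exception {
            Socket socket = new Socket(host, port);              // placeholder endpoint
            ParcelFileDescriptor pfd = ParcelFileDescriptor.fromSocket(socket);

            MediaRecorder recorder = new MediaRecorder();
            recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
            recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
            recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
            recorder.setOutputFile(pfd.getFileDescriptor());     // output goes to the socket
            recorder.prepare();
            recorder.start();
            // The bytes arriving at the server are an MP4 with placeholder
            // mdat/moov sizes; they must be rewritten or re-muxed (e.g. to FLV
            // for RTMP) before a standard player or server will accept them.
            return recorder;
        }
    }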