rtp

Detect MPEG4/H264 I-Frame (IDR) in RTP stream

坚强是说给别人听的谎言 · submitted 2019-11-27 17:13:28
I need to detect an MPEG-4 I-frame in an RTP packet. I know how to remove the RTP header and get the MPEG-4 frame inside it, but I can't figure out how to identify the I-frame. Does it have a specific signature/header? OK, so I figured it out for the H264 stream. How to detect an I-frame: remove the RTP header, then check the value of the first byte of the H264 payload; if the value is 124 (0x7C), it is an I-frame. I can't figure it out for the MPEG4-ES stream... any suggestions? EDIT: H264 IDR. This works for my H264 stream ( fmtp:96 packetization-mode=1; profile-level-id=420029; ). You just pass byte array that represents the h264
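A sketch of both checks (assuming a bare 12-byte RTP header with no CSRC list or extension; H.264 packetization per RFC 6184, MPEG-4 ES per RFC 3016). The 0x7C first byte observed above is in fact a FU-A fragmentation-unit indicator (0x7C & 0x1F == 28), so the sketch checks the FU header's start bit and fragment type rather than the literal byte value; for MPEG-4 ES it scans for the VOP start code 00 00 01 B6 and reads the two vop_coding_type bits that follow (00 = I-VOP):

```java
class FrameTypeDetector {
    static final int RTP_HEADER_SIZE = 12; // assumes no CSRC entries or extensions

    // H.264 (RFC 6184): true if the payload carries (the start of) an IDR slice.
    static boolean isH264Idr(byte[] packet) {
        int off = RTP_HEADER_SIZE;
        int nalType = packet[off] & 0x1F;
        if (nalType == 5) return true;                     // single NAL unit, IDR
        if (nalType == 28) {                               // FU-A fragment
            boolean start = (packet[off + 1] & 0x80) != 0; // FU header start bit
            int fragType = packet[off + 1] & 0x1F;         // fragmented NAL type
            return start && fragType == 5;                 // first fragment of an IDR
        }
        return false;
    }

    // MPEG-4 ES (RFC 3016): scan for a VOP start code (00 00 01 B6);
    // the two bits that follow are vop_coding_type, 00 = I-VOP.
    static boolean isMpeg4IVop(byte[] packet) {
        for (int i = RTP_HEADER_SIZE; i + 4 < packet.length; i++) {
            if (packet[i] == 0 && packet[i + 1] == 0
                    && packet[i + 2] == 1 && (packet[i + 3] & 0xFF) == 0xB6) {
                return ((packet[i + 4] & 0xC0) >> 6) == 0; // 0 = I-VOP
            }
        }
        return false;
    }
}
```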

Sending WebRTC MediaStream over Websocket (RTP over HTTP/Websocket)

大城市里の小女人 · submitted 2019-11-27 16:30:20
Question: WebRTC is, among other things, meant for real-time browser-to-browser media communication, but in my case it will be used for browser-to-server audio communication. From the information I've gathered, the MediaStream is transferred using RTP over UDP. This will require at least two additional ports apart from the one used for signalling, something I would like to avoid. Within WebRTC, is there any possibility to use RTP over WebSocket instead of RTP over UDP, so that I only need to use

VoIP RTP Streaming from/to server (in Java) to/from android

ε祈祈猫儿з · submitted 2019-11-27 16:09:49
My target is to have a push-to-talk chat app on GSM/UMTS/LTE networks. Initially I wanted to use multicast addresses and peer-to-peer, without overloading the server; unfortunately, after deep investigation, I discovered that multicast is not allowed in GSM/UMTS/LTE networks, so I have to use the server to bounce the VoIP packets. I don't like this solution very much because it loads the server, but I didn't find anything better. Any alternative solution would be very much appreciated... Therefore I have to send VoIP from an Android client to a server (PC), and
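A hypothetical sketch of the server-side bounce described above (the class, method names, and learn-clients-as-they-talk registration scheme are illustrative, not from the question): the server forwards each incoming RTP packet to every other client it has heard from. The target selection is factored into its own method so the socket loop stays trivial.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetSocketAddress;
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

class RtpBounce {
    // Everyone except the sender gets a copy of the packet.
    static List<InetSocketAddress> forwardTargets(
            InetSocketAddress sender, Set<InetSocketAddress> clients) {
        List<InetSocketAddress> targets = new ArrayList<>();
        for (InetSocketAddress c : clients) {
            if (!c.equals(sender)) targets.add(c);
        }
        return targets;
    }

    // Receive loop: learn clients as they talk, rebounce to everyone else.
    static void serve(int port) throws Exception {
        try (DatagramSocket socket = new DatagramSocket(port)) {
            Set<InetSocketAddress> clients = new LinkedHashSet<>();
            byte[] buf = new byte[2048];
            while (true) {
                DatagramPacket p = new DatagramPacket(buf, buf.length);
                socket.receive(p);
                InetSocketAddress sender =
                        new InetSocketAddress(p.getAddress(), p.getPort());
                clients.add(sender);
                for (InetSocketAddress t : forwardTargets(sender, clients)) {
                    socket.send(new DatagramPacket(
                            p.getData(), p.getLength(), t.getAddress(), t.getPort()));
                }
            }
        }
    }
}
```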

Creating RTP Packets from Android Camera to Send

点点圈 · submitted 2019-11-27 11:43:16
I'm new to Android and socket programming. I want to create an Android application that transfers live video from the device camera to a PC. The first thing I do is get the raw video data from the PreviewCallback arguments and convert it into an RTP packet. I was using JLibRTP to do this. Regarding transferring the packets, I think there are some relevant classes: RtpPkt, RtpSession, and RtpSocket. Here is my code at a glance: DatagramSocket rtpSocket = new DatagramSocket(); DatagramSocket rtcpSocket = new DatagramSocket(); RtpSession rtpSession = new RtpSession(rtpSocket, rtcpSocket); public void
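Since the JLibRTP snippet is truncated, here is a library-free sketch of what a minimal RTP packetizer has to produce, following the RFC 3550 header layout (payload type, sequence number, timestamp, and SSRC are caller-supplied; no padding, extension, or CSRC entries are assumed):

```java
import java.nio.ByteBuffer;

// Illustrative only, not JLibRTP: build a raw RTP packet by hand.
class RtpPacketizer {
    static byte[] buildPacket(int payloadType, int seq, long timestamp,
                              long ssrc, byte[] payload) {
        ByteBuffer buf = ByteBuffer.allocate(12 + payload.length);
        buf.put((byte) 0x80);                 // V=2, P=0, X=0, CC=0
        buf.put((byte) (payloadType & 0x7F)); // M=0, payload type
        buf.putShort((short) seq);            // sequence number
        buf.putInt((int) timestamp);          // timestamp
        buf.putInt((int) ssrc);               // SSRC
        buf.put(payload);                     // media data
        return buf.array();
    }
}
```

The resulting array can then be sent to the PC with a plain `DatagramSocket.send(new DatagramPacket(pkt, pkt.length, addr, port))`, which is essentially what the RTP library does under the hood.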

Convert video Input Stream to RTMP

我的未来我决定 · submitted 2019-11-27 10:16:46
I want to stream video recorded on my Android phone to a network media server. The first problem is that when setting the MediaRecorder output to a socket, the stream is missing some mdat size headers. This can be fixed by preprocessing the stream locally and adding the missing data in order to produce a valid output stream. The question is how to proceed from there: how can I output that stream as an RTMP stream? MrGomez: First, let's unwind your question. As you've surmised, RTMP isn't currently supported by Android. You can use a few side libraries to add support, but these may not

H.264 over RTP/RTSP (iPhone)

守給你的承諾、 · submitted 2019-11-27 10:11:33
Question: Is it possible to view a video stream (H.264 live feed) over RTP/RTSP on the iPhone natively? If not, is it possible to write an application for this purpose? Answer 1: Sure it is possible; in fact, others have done it already, see e.g. IPVision from TTrix. It seems that the Live555 libraries can be compiled for iOS. See this SO question: How to configure the live555 framework for iPhone app development? Answer 2: Sure it is possible now. Previously, RTSP was not allowed in the App Store, though someone ported VLC to

Problem to Decode H264 video over RTP with ffmpeg (libavcodec)

旧街凉风 · submitted 2019-11-27 09:38:19
Question: I set profile_idc, level_idc, extradata and extradata_size of the AVCodecContext with the profile-level-id and sprop-parameter-sets from the SDP. I separate the decoding of Coded Slice, SPS, PPS and NAL_IDR_SLICE packets. Init: uint8_t start_sequence[]= {0, 0, 1}; int size= recv(id_de_la_socket,(char*) rtpReceive,65535,0); Coded Slice: char *z = new char[size-16+sizeof(start_sequence)]; memcpy(z,&start_sequence,sizeof(start_sequence)); memcpy(z+sizeof(start_sequence),rtpReceive+16,size-16);
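The same depacketization step the C++ code performs can be sketched in plain Java (not the author's code; it assumes, as the question's code does, a fixed 16-byte offset before the NAL data, i.e. the 12-byte RTP header plus 4 extra bytes such as one CSRC entry): strip the header and prepend the Annex-B start code so the decoder sees a valid NAL unit.

```java
// Illustrative sketch: turn an RTP packet carrying a single NAL unit
// into an Annex-B byte stream chunk for an H.264 decoder.
class AnnexB {
    static final byte[] START_CODE = {0, 0, 1};

    static byte[] toAnnexB(byte[] rtpPacket, int headerSize) {
        int nalLength = rtpPacket.length - headerSize;
        byte[] out = new byte[START_CODE.length + nalLength];
        System.arraycopy(START_CODE, 0, out, 0, START_CODE.length);
        System.arraycopy(rtpPacket, headerSize, out, START_CODE.length, nalLength);
        return out;
    }
}
```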

Display RTP MJPEG

我只是一个虾纸丫 · submitted 2019-11-27 07:24:08
Question: I'm looking for a solution to display an RTP JPEG stream with JavaFX. I can display a JPEG from a file, and I can receive the RTP JPEG stream and split it to identify all the parameters and data as specified in RFC 2435, but I don't know how to convert my JPEG arrays to a displayable Image. I don't want to implement a JPEG decoder myself. Any ideas? Answer 1: Leverage JavaFX's built-in JPEG decoder, which should be able to decode the JPEG images in the Image constructor. class MJPEGViewer extends ImageView {

Minimum SDP for making a H264 RTP stream?

随声附和 · submitted 2019-11-27 01:57:49
Question: I'm looking for an example of the minimum SDP necessary for setting up an H264 video stream. The assumption is that the receiver can play H264 as long as it gets the required parameters through SDP. I have found a related document here; however, it uses lots of optional parameters in its examples, and I'm looking for the bare required minimum. Answer 1: Here is the bare minimum SDP. It is a file called test.sdp which has the following content: c=IN IP4 10.5.110.117 m=video 5004 RTP/AVP 96 a=rtpmap:96
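Laid out line by line, a complete SDP of this minimal shape typically looks like the following (the IP address and port come from the answer; the rtpmap line must end with the codec name and clock rate, and 96 is just a dynamic payload type number):

```
c=IN IP4 10.5.110.117
m=video 5004 RTP/AVP 96
a=rtpmap:96 H264/90000
```

Note that a receiver which cannot pick SPS/PPS out of the stream itself may additionally need an a=fmtp:96 line carrying sprop-parameter-sets.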

How to use AVSampleBufferDisplayLayer in iOS 8 for RTP H264 Streams with GStreamer?

天大地大妈咪最大 · submitted 2019-11-26 19:15:53
Question: After getting notice of the hardware H264 decoder being made available to programmers in iOS 8, I want to use it now. There is a nice introduction, 'Direct Access to Video Encoding and Decoding', from WWDC 2014 out there. You can take a look here. Based on Case 1 there, I started to develop an application that should be able to get an H264 RTP UDP stream from GStreamer, sink it into an 'appsink' element to get direct access to the NAL units, and do the conversion to create CMSampleBuffers, which my