rtp

How to implement a VoIP application using the android.net.rtp package

耗尽温柔 submitted on 2019-11-28 19:20:05
Question: I am trying to implement a VoIP application using the AudioGroup and AudioStream classes of the android.net.rtp package, but my application does not work properly. After joining the AudioStream object to the AudioGroup object, it sends UDP packets successfully (I checked this with a packet analyzer), but no voice can be heard on the phone. I run my application on two phones and try to exchange voice between them. Below I include my source code. public class MainActivity extends

Stream H.264 video over RTP using GStreamer

主宰稳场 submitted on 2019-11-28 18:09:44
Question: I am a newbie with GStreamer and I am trying to get used to it. My first goal is to create a simple RTP stream of H.264 video between two devices. I am using these two pipelines: Sender: gst-launch-1.0 -v filesrc location=c:\\tmp\\sample_h264.mov ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=5000 Receiver: gst-launch-1.0 -v udpsrc port=5000 ! rtpmp2tdepay ! decodebin ! autovideosink But with the first one (the sender) I get the following error: Setting pipeline to PAUSED ... Pipeline
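Two things commonly go wrong with pipelines like these: x264enc expects raw video, so a .mov file has to be demuxed and decoded first (e.g. qtdemux or decodebin before x264enc), and the receiver pairs rtph264pay with rtpmp2tdepay, which depayloads MPEG-TS rather than H.264. Below is a rough sketch of a matching receiver written with PyGObject instead of gst-launch; the caps and element names are the usual ones for H.264 over RTP, but they depend on the plugins installed on your system.

```python
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

Gst.init(None)

# udpsrc needs explicit caps because raw RTP carries no self-describing stream type.
pipeline = Gst.parse_launch(
    'udpsrc port=5000 caps="application/x-rtp,media=video,'
    'clock-rate=90000,encoding-name=H264,payload=96" '
    "! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink"
)
pipeline.set_state(Gst.State.PLAYING)

loop = GLib.MainLoop()
try:
    loop.run()
except KeyboardInterrupt:
    pass
finally:
    pipeline.set_state(Gst.State.NULL)
```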

H.264 over RTP/RTSP (iPhone)

ぃ、小莉子 submitted on 2019-11-28 17:07:05
Is it possible to view a video stream (H.264 live feed) over RTP/RTSP on the iPhone natively? If not, is it possible to write an application for this purpose? Grodriguez Sure it is possible; in fact, others have done it already, see e.g. IPVision from TTrix. It seems that the Live555 libraries can be compiled for iOS. See this SO question: How to configure live555 framework for iphone app development? Sure, it is possible now. Previously, RTSP was not allowed in the App Store; someone did port VLC to the iPhone, but that app was removed from the App Store and could only be installed via jailbreak. Now Apple doesn't

Receiving RTP packets after RTSP setup

不问归期 submitted on 2019-11-28 16:53:28
Question: I'm trying to stream RTP packets from an IP camera using Python. I am able to send the DESCRIBE, SETUP & PLAY commands using the RTSP protocol; however, I am unable to start receiving the actual video stream over RTP. Here is the code: import socket def printrec(recst): recs=recst.split('\r\n') for rec in recs: print rec dest="DESCRIBE rtsp://admin:12345@192.168.1.74 RTSP/1.0\r\nCSeq: 2\r\nUser-Agent: python\r\nAccept: application/sdp\r\n\r\n" setu="SETUP rtsp://admin:12345@192.168.1.74/trackID=1
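What typically makes this work is binding a UDP socket on the client_port advertised in the SETUP Transport header before sending PLAY, and echoing the server's session ID back in the PLAY request. A rough Python sketch of the receiving side follows; the port number is an assumption and must match whatever the SETUP request announced.

```python
import socket

RTP_PORT = 5004  # must match the client_port range announced in the SETUP Transport header

# Bind the RTP socket before SETUP/PLAY so the camera's packets are not dropped.
rtp_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rtp_sock.bind(("", RTP_PORT))

# ... DESCRIBE / SETUP / PLAY are exchanged on the TCP control connection as above;
# the SETUP reply carries a "Session:" header whose value the PLAY request must echo.

while True:
    packet, addr = rtp_sock.recvfrom(65535)
    # 12-byte fixed RTP header assumed here (no CSRC entries or header extension).
    payload = packet[12:]
    print("received %d payload bytes from %s" % (len(payload), addr[0]))
```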

Problem decoding H.264 video over RTP with ffmpeg (libavcodec)

删除回忆录丶 submitted on 2019-11-28 16:15:52
I set profile_idc, level_idc, extradata and extradata_size of the AVCodecContext from the profile-level-id and sprop-parameter-sets of the SDP. I decode Coded Slice, SPS, PPS and NAL_IDR_SLICE packets separately: Init: uint8_t start_sequence[]= {0, 0, 1}; int size= recv(id_de_la_socket,(char*) rtpReceive,65535,0); Coded Slice: char *z = new char[size-16+sizeof(start_sequence)]; memcpy(z,&start_sequence,sizeof(start_sequence)); memcpy(z+sizeof(start_sequence),rtpReceive+16,size-16); ConsumedBytes = avcodec_decode_video(codecContext,pFrame,&GotPicture,(uint8_t*)z,size-16+sizeof(start_sequence))
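A frequent pitfall with this approach is that the RTP header is not a fixed 16 bytes (it is 12 bytes plus 4 per CSRC entry), and that fragmented NAL units (FU-A, RFC 6184) must be reassembled and given an Annex-B start code before the decoder sees them. The following is a language-agnostic illustration in Python of that depacketization step; the function names are mine and STAP-A handling is omitted.

```python
START_CODE = b"\x00\x00\x01"

def rtp_header_len(packet):
    """Length of the RTP header: 12 bytes plus 4 per CSRC (header extensions ignored)."""
    csrc_count = packet[0] & 0x0F
    return 12 + 4 * csrc_count

def h264_payload_to_nals(payload, fua_buffer):
    """Turn one RTP payload into zero or more Annex-B NAL units (RFC 6184 subset).
    fua_buffer is a bytearray kept across calls to accumulate FU-A fragments."""
    nal_type = payload[0] & 0x1F
    if nal_type <= 23:                        # single NAL unit packet
        return [START_CODE + payload]
    if nal_type == 28:                        # FU-A fragment
        indicator, header = payload[0], payload[1]
        start, end = header & 0x80, header & 0x40
        if start:
            # Rebuild the original NAL header from the FU indicator/header.
            fua_buffer[:] = bytes([(indicator & 0xE0) | (header & 0x1F)])
        fua_buffer.extend(payload[2:])
        if end:
            nal = START_CODE + bytes(fua_buffer)
            del fua_buffer[:]
            return [nal]
        return []
    return []                                  # STAP-A and other types left out of this sketch
```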

What is the difference between RTP and RTSP in a streaming server?

↘锁芯ラ submitted on 2019-11-28 14:43:56
Question: I'm thinking about developing a streaming server and I have the following question: should I do it over RTSP (example URL: rtsp://192.168.0.184/myvideo.mpg) or RTP (example URL: rtp://192.168.0.184)? As I understand it, an RTSP server is mainly used for streaming files that already exist, i.e., not live, while an RTP server is used to broadcast. Somebody correct me if I'm wrong; am I right? What I want is to develop a server to broadcast live content from the computer screen, that is, which is displayed at the

Minimum SDP for making an H.264 RTP stream?

只谈情不闲聊 submitted on 2019-11-28 09:22:19
I'm looking for an example of the minimum necessary SDP for setting up an H.264 video stream. The assumption is that the receiver can play H.264 as long as it gets the required parameters through SDP. I have found a related document here; however, it uses lots of optional parameters in its examples, and I'm looking for the bare required minimum. TheMeaningfulEngineer Here is the bare minimum SDP. It is a file called test.sdp which has the following content:
c=IN IP4 10.5.110.117
m=video 5004 RTP/AVP 96
a=rtpmap:96 H264/90000
I've started the stream on a virtual machine using VLC. (No SDP sent here)
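If VLC is not at hand, a rough Python sketch of writing that file and opening it with ffplay is shown below; the -protocol_whitelist flag is needed on reasonably recent ffmpeg builds, and the address and ports are taken from the answer above.

```python
import subprocess

# The three-line SDP quoted in the answer above.
sdp = "\n".join([
    "c=IN IP4 10.5.110.117",
    "m=video 5004 RTP/AVP 96",
    "a=rtpmap:96 H264/90000",
]) + "\n"

with open("test.sdp", "w") as f:
    f.write(sdp)

# ffplay (like VLC) can open the SDP file directly and then listens for RTP on port 5004.
subprocess.run(["ffplay", "-protocol_whitelist", "file,udp,rtp", "test.sdp"])
```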

Stream RTP to FFMPEG using SDP

妖精的绣舞 submitted on 2019-11-28 01:36:39
Question: I get an RTP stream from a WebRTC server (I used mediasoup) using node.js, and I get the decrypted raw RTP packet data from the stream. I want to forward this RTP data to ffmpeg, and from there I can save it to a file or push it as an RTMP stream to other media servers. I guess the best way would be to create an SDP file that describes both the audio and video streams and send the packets through new sockets. The ffmpeg command is: ffmpeg -loglevel debug -protocol_whitelist file,crypto,udp,rtp -re
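A rough Python stand-in for the node.js forwarding side is sketched below. The port numbers, payload types, and codec names are illustrative and must match whatever mediasoup actually negotiated; ffmpeg reads the SDP as its input and then listens on the ports named in the m= lines.

```python
import socket

# Illustrative SDP describing the two forwarded streams (assumed Opus audio and VP8 video).
SDP = """v=0
o=- 0 0 IN IP4 127.0.0.1
s=webrtc-forward
c=IN IP4 127.0.0.1
t=0 0
m=audio 5006 RTP/AVP 111
a=rtpmap:111 opus/48000/2
m=video 5004 RTP/AVP 96
a=rtpmap:96 VP8/90000
"""

with open("input.sdp", "w") as f:
    f.write(SDP)

out = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def forward_rtp(packet, port):
    """Relay one decrypted RTP packet to the local port named in the SDP for that stream."""
    out.sendto(packet, ("127.0.0.1", port))
```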

How to force client to switch RTP transport from UDP to TCP?

不想你离开。 submitted on 2019-11-27 21:29:19
Question: If a client wants to watch a stream on my RTSP server, it first tries to set up the stream over UDP. How can I tell it that my server only supports RTP/AVP/TCP and that it should switch transports? I want to drop UDP support on my server, but all the clients first try to SETUP the session over UDP and only later over TCP... and I want to switch them to TCP as early as possible in the RTSP exchange. How can I do that? Answer 1: As far as I know, there is no control
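One mechanism servers sometimes use, sketched here in Python, is to reject the UDP SETUP so that the client retries with an interleaved TCP transport; whether a given client actually falls back is up to the client.

```python
def reject_udp_setup(cseq):
    """A response an RTSP server might send to a SETUP that requests a UDP transport.
    Status 461 is 'Unsupported Transport' (RFC 2326); many clients then retry the
    SETUP with Transport: RTP/AVP/TCP;interleaved=0-1."""
    return (
        "RTSP/1.0 461 Unsupported Transport\r\n"
        "CSeq: %d\r\n"
        "\r\n" % cseq
    )
```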

How to use AVSampleBufferDisplayLayer in iOS 8 for RTP H264 Streams with GStreamer?

大憨熊 submitted on 2019-11-27 17:54:48
After learning that the hardware H.264 decoder is available to programmers in iOS 8, I want to use it now. There is a nice introduction, 'Direct Access to Video Encoding and Decoding', from WWDC 2014 out there. You can take a look here. Based on Case 1 there, I started to develop an application that should be able to get an H.264 RTP/UDP stream from GStreamer, sink it into an 'appsink' element to get direct access to the NAL units, and do the conversion to create CMSampleBuffers, which my AVSampleBufferDisplayLayer can then display. The interesting piece of code doing all that is the
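The conversion step that usually trips people up is that CMSampleBuffers built for AVSampleBufferDisplayLayer expect length-prefixed (AVCC-style) NAL units rather than the Annex-B start codes GStreamer emits. The snippet below illustrates that repackaging language-agnostically in Python, assuming a 4-byte NAL length header in the format description; in the actual iOS app this would of course be done in Objective-C or Swift.

```python
import struct

def annexb_to_avcc(nal_units):
    """Convert raw NAL units (without start codes) into the length-prefixed layout
    expected when the CMVideoFormatDescription uses 4-byte NAL unit length headers."""
    return b"".join(struct.pack(">I", len(nal)) + nal for nal in nal_units)
```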