rtsp

android stream real time video to streaming server

Submitted by 瘦欲@ on 2021-02-10 20:37:56
Question: I am trying to develop a half-duplex chat application in which one Android device captures video from its camera and sends the real-time stream to a server, while another Android device pulls the same stream from the streaming server and renders it on screen. I managed to do this with the WebRTC API, the Vitamio API, and the Wowza streaming server, but this method results in very poor video quality and huge latency at the receiver's end, even on …

C++ Video Stream detect FPS

Submitted by 核能气质少年 on 2021-02-10 18:47:06
Question: I am trying to get the correct FPS of a video stream from an Axis or Eneo camera (rtsp://192.168.0.1:554/axis-media/media.amp). I use cv::VideoCapture::get(CV_CAP_PROP_FPS); but with some cameras the result is invalid: for example, the result is 180000 while the correct value is 25. I have checked it with Wireshark and the value in the SDP is correct: Media Attribute (a): framerate:25.000000. Which information does cv::VideoCapture::get read? Answer 1: OpenCV isn't very good at this sort of thing, and …
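Since `CAP_PROP_FPS` just reports whatever metadata the capture backend hands back, and some RTSP backends return nonsense such as 180000, a common workaround is to time the frames yourself and compute an effective frame rate. A minimal sketch (the sample count in the commented usage is an arbitrary choice, and the URL is the one from the question):

```python
def effective_fps(timestamps):
    """Estimate frames-per-second from a list of per-frame capture
    times in seconds. Uses (frames - 1) intervals over the total span."""
    if len(timestamps) < 2:
        raise ValueError("need at least two frame timestamps")
    span = timestamps[-1] - timestamps[0]
    return (len(timestamps) - 1) / span

# Hypothetical usage against the camera, assuming OpenCV is available:
#   import cv2, time
#   cap = cv2.VideoCapture("rtsp://192.168.0.1:554/axis-media/media.amp")
#   stamps = []
#   while len(stamps) < 100 and cap.grab():   # sample 100 frames
#       stamps.append(time.monotonic())
#   print(effective_fps(stamps))
```

Measuring over a window of frames averages out network jitter, which single-interval measurements would not.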

What would cause a ConnectionReset on a UDP socket?

Submitted by 两盒软妹~` on 2021-02-08 03:34:10
Question: I'm trying to work with the Managed Media Aggregation C# library (http://net7mma.codeplex.com) to handle an RTSP/RTP stream from a Freebox set-top box. Although the lib works fine with the sample RTSP feed, when working with the feed from my set-top box, the RTP listener socket (a simple UDP socket listening for incoming data on a specific port) throws a SocketException: ConnectionReset, and of course no data shows while receiving (the data does show in Wireshark). Suppressing E_CONNRESET via the …
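On Windows, a UDP socket starts reporting ConnectionReset after one of its own outgoing datagrams triggers an ICMP Port Unreachable; the error describes that earlier send, not a broken socket, since UDP has no connection to reset. The usual fixes are to disable the behaviour with the Winsock `SIO_UDP_CONNRESET` ioctl (in C#, via `Socket.IOControl`) or simply to catch the error and keep reading. A sketch of the second approach in Python (the retry limit is an arbitrary choice):

```python
def recv_ignoring_resets(sock, bufsize=2048, max_retries=8):
    """recvfrom() wrapper that swallows ConnectionResetError.

    On Windows a UDP socket raises this after a previous send hit a
    closed port (ICMP Port Unreachable); the socket itself is still
    usable, so retrying the receive is safe.
    """
    for _ in range(max_retries):
        try:
            return sock.recvfrom(bufsize)
        except ConnectionResetError:
            continue  # ICMP-induced reset, not a real error on UDP
    raise TimeoutError("no datagram received after repeated resets")
```

Anything with a compatible `recvfrom` works here, which also makes the behaviour easy to exercise without a network.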

Write image frames into gstreamer rtp pipeline

Submitted by 生来就可爱ヽ(ⅴ<●) on 2021-01-29 17:38:24
Question: I am trying to use a GStreamer pipeline to view an RTP stream in VLC on my computer. I mostly looked into this thread. My end result is something like this:

#!/usr/bin/env python
import gi
import numpy as np
gi.require_version('Gst', '1.0')
from gi.repository import Gst, GObject
import time

class RtpPipeline(object):
    def __init__(self):
        self.number_frames = 0
        self.fps = 30
        self.duration = 1 / self.fps * Gst.SECOND  # duration of a frame in nanoseconds
        self.launch_string = 'appsrc name=source …
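One detail worth checking in that snippet: `Gst.SECOND` is one second expressed in nanoseconds, and the expression `1 / self.fps * Gst.SECOND` truncates to 0 on Python 2 (integer division), silently breaking every buffer timestamp; on Python 3 it yields a float where GStreamer wants integer nanoseconds. A sketch of the same computation with integer math (`GST_SECOND` is hard-coded here only so the example runs without GStreamer installed):

```python
GST_SECOND = 1_000_000_000  # Gst.SECOND: one second in nanoseconds

def frame_duration_ns(fps):
    """Per-frame buffer duration in integer nanoseconds.

    Integer floor division keeps the result exact-typed and non-zero,
    unlike `1 / fps * GST_SECOND` under Python 2.
    """
    return GST_SECOND // fps

# Each pushed appsrc buffer would then be stamped along the lines of:
#   buf.duration = frame_duration_ns(self.fps)
#   buf.pts = buf.dts = self.number_frames * buf.duration
```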

Handling errors with gst-rtsp-server Python bindings

Submitted by ⅰ亾dé卋堺 on 2021-01-29 11:30:41
Question: I have a simple Python program that creates an RTSP stream using gst-rtsp-server. It works, but as-is there is no error handling: if the pipeline has a typo or there is some issue connecting to the video source, I don't see a stack trace or any logging. Where would I hook in code to handle problems like this? I should mention that I'm a complete beginner in the GObject world. I suspect there is a standard way for these libraries to report errors, but I haven't been able to find anything in the …

What is the max RTSP (over TCP) packet size?

Submitted by 只谈情不闲聊 on 2021-01-29 05:53:00
Question: I didn't see anything about it in the Real Time Streaming Protocol (RTSP) spec, but when sniffing I saw that the max RTSP packet size is 1440. And as you can see here (RTSP - RTP over TCP): "RTP Data: after the SETUP, RTP data will be sent through the TCP socket that is used for RTSP commands. The RTP data will be encapsulated in the following format: | magic number | channel number | embedded data length | data |. magic number: 1-byte value of hex 0x24; channel number: 1-byte value to denote the channel …"
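The framing quoted above is specified in RFC 2326, section 10.12 ("Embedded (Interleaved) Binary Data"): a `$` byte (0x24), a one-byte channel id, and a 16-bit big-endian payload length. So the protocol-level cap on one interleaved frame is 65535 bytes; the 1440 seen on the wire is not an RTSP limit but the sender keeping each frame within a single TCP segment (roughly a 1500-byte Ethernet MTU minus IP and TCP headers). A small parser for that framing, as a sketch:

```python
import struct

RTSP_INTERLEAVED_MAGIC = 0x24  # ASCII '$'

def parse_interleaved_frame(buf):
    """Parse one RTSP-interleaved frame (RFC 2326 section 10.12).

    Layout: 1-byte magic 0x24, 1-byte channel id, 2-byte big-endian
    payload length, then the payload. Returns (channel, payload,
    remaining_bytes), or None if more bytes are needed.
    """
    if len(buf) < 4:
        return None  # header incomplete
    magic, channel, length = struct.unpack("!BBH", buf[:4])
    if magic != RTSP_INTERLEAVED_MAGIC:
        raise ValueError("not an interleaved frame")
    if len(buf) < 4 + length:
        return None  # payload incomplete
    return channel, buf[4:4 + length], buf[4 + length:]
```

Returning the leftover bytes makes the function easy to call in a loop over a TCP receive buffer, where frames rarely align with read boundaries.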

GStreamer - RTSP to HLS / mp4

Submitted by 穿精又带淫゛_ on 2021-01-28 08:06:42
Question: I am trying to save an RTSP H.264 stream to HLS mp4 files: gst-launch-1.0 rtspsrc location="rtsp://....." ! rtph264depay ! h264parse ! matroskamux ! hlssink max-files=0 playlist-length=0 location="/home/user/ch%05d.mp4" playlist-location="/home/user/list.m3u8" target-duration=15. As a result, there is only one file, ch00000.mp4, which contains the whole video stream (3 min instead of the 15 s set by target-duration). If I save via mpegtsmux to .ts files, all is OK with the same command. What is wrong? Thanks in …
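The likely culprit is the muxer: hlssink segments MPEG-TS, which is why the same command behaves correctly with mpegtsmux, while a Matroska/mp4 stream cannot be cut at arbitrary byte boundaries the way TS can. Two untested pipeline sketches (paths and durations taken from the question, everything else an assumption to adapt):

```shell
# HLS with proper TS segments: swap matroskamux for mpegtsmux
gst-launch-1.0 rtspsrc location="rtsp://....." ! rtph264depay ! h264parse ! \
    mpegtsmux ! hlssink max-files=0 playlist-length=0 \
    location="/home/user/ch%05d.ts" \
    playlist-location="/home/user/list.m3u8" target-duration=15

# Fixed-duration mp4 files instead: split at the muxer with splitmuxsink
# (max-size-time is in nanoseconds; 15 s = 15000000000)
gst-launch-1.0 rtspsrc location="rtsp://....." ! rtph264depay ! h264parse ! \
    splitmuxsink muxer=mp4mux location="/home/user/ch%05d.mp4" \
    max-size-time=15000000000
```

Note the second variant produces plain mp4 files but no .m3u8 playlist; HLS itself expects TS (or fragmented mp4) segments.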

ffmpeg multiple rtsp cameras into single stream to youtube

Submitted by 北城余情 on 2021-01-27 15:01:27
Question: I have two RTSP IP cameras (D-Link) and I want to combine (merge) the two streams into one video output and push it to YouTube (live streaming). My first step is OK, and my command is: ffmpeg -i "rtsp://xxxxxx:xxxxxx@192.168.1.164/live2.sdp" -i "rtsp://xxxxxx:xxxxxx@192.168.1.164/live2.sdp" -filter_complex " nullsrc=size=1600x448 [base]; [0:v] setpts=PTS-STARTPTS, scale=800x448 [upperleft]; [1:v] setpts=PTS-STARTPTS, scale=800x448 [upperright]; [base][upperleft] overlay=shortest=1 [base]; [base][upperright] …
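A hedged completion of that command: finish the second overlay by offsetting the right-hand input with `x=800`, use distinct intermediate labels, add a silent audio track (YouTube's RTMP ingest expects an audio stream), and encode to FLV. The question uses the same URL for both inputs, so substitute the second camera's address; the encoder settings, ingest URL, and STREAM_KEY below are all assumptions to adapt, not verified values:

```shell
ffmpeg \
  -i "rtsp://xxxxxx:xxxxxx@192.168.1.164/live2.sdp" \
  -i "rtsp://xxxxxx:xxxxxx@192.168.1.164/live2.sdp" \
  -f lavfi -i anullsrc=r=44100:cl=stereo \
  -filter_complex "nullsrc=size=1600x448 [base]; \
    [0:v] setpts=PTS-STARTPTS, scale=800x448 [upperleft]; \
    [1:v] setpts=PTS-STARTPTS, scale=800x448 [upperright]; \
    [base][upperleft] overlay=shortest=1 [tmp1]; \
    [tmp1][upperright] overlay=shortest=1:x=800" \
  -c:v libx264 -preset veryfast -g 50 -c:a aac \
  -f flv "rtmp://a.rtmp.youtube.com/live2/STREAM_KEY"
```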

rtsp:// liveStream with AVPlayer

Submitted by 两盒软妹~` on 2021-01-27 10:33:05
Question: I want to play a live stream on an iPhone device with AVPlayer. I also want to get a CVPixelBufferRef from this stream for later use. I used the Apple guide for creating the player. Currently, with locally stored video files this player works just fine, and when I try to play the Apple sample stream URL (http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8) it works fine too. Problems appear when I want to play an rtsp:// stream like this one: rtsp://192.192.168.1:8227/TTLS/Streaming/channels/2 …