video-streaming

My Android MediaPlayer synchronization problems

主宰稳场 · submitted on 2019-12-06 04:59:03
Question: I have a simple Android media player that can play multiple videos simultaneously on a single screen. The player screen is divided into 4 parts, with 4 MediaPlayer instances glued together, and each part plays a given video. It works almost OK when my video files are stored locally on the device; there are synchronization problems, but minor ones. But when I supply a URL for HTTP streaming, the synchronization problems become significant. What is the problem? Generally, how can …

ONVIF : How to form the device web service address from the IP address of an NVT

断了今生、忘了曾经 · submitted on 2019-12-06 04:08:09
Question: My question is about the ONVIF specification. http://www.onvif.org/imwp/download.asp?ContentID=18006 In section 5.10, it says: "A service is a collection of related ports. This specification does not mandate any service naming principles." Let's say that I have the IP address of an NVT (Network Video Transmitter, like an IP camera for example) — how do I form the address of the device management web service? This service is the entry point of the whole system. Thank you. Answer 1: According to the …
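The answer above is cut off, but the ONVIF Core Specification fixes the entry point: the device management service of an NVT is reachable at the well-known path /onvif/device_service on the device's IP. A minimal sketch of forming that address (the helper name and default port are illustrative assumptions, not part of the spec text quoted above):

```python
# Sketch: build the ONVIF device management service URL from an NVT's IP.
# The path /onvif/device_service is the well-known entry point defined by
# the ONVIF Core Specification; port 80 is an assumed default here.

def device_service_url(ip: str, port: int = 80) -> str:
    """Return the device management web service address for an NVT."""
    host = ip if port == 80 else f"{ip}:{port}"
    return f"http://{host}/onvif/device_service"
```

From there, a GetCapabilities (or GetServices) SOAP call against this URL returns the addresses of the other services (media, PTZ, events), so only this one address has to be derived by convention.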

Where to get live video streaming examples ( GStreamer )? [closed]

孤街醉人 · submitted on 2019-12-06 02:26:36
Question (closed 7 years ago as not a good fit for the Q&A format): Where can I get live video + audio streaming examples (GStreamer)? For example, streaming from a file or a web camera to some web address …

How to do rtmp streaming in kurento media server?

回眸只為那壹抹淺笑 · submitted on 2019-12-06 00:42:56
I was just wondering if there is any feature like RTMP in Kurento Media Server. I need it to stream my VOD content. Any ideas? Could RTP be used for it? Thanks. Answer (Pawan): There isn't an RTMP endpoint in Kurento, at least not yet. But we have streamed content to a Wowza media server using an RTP endpoint with the latest Kurento development version. Maybe this can also work for you. Source: https://stackoverflow.com/questions/27203318/how-to-do-rtmp-streaming-in-kurento-media-server

How to play youtube video without UIWebView or detect video player when youtube video start playing using webview?

帅比萌擦擦* · submitted on 2019-12-06 00:35:25
Question: I need to play a YouTube video in my iOS application, and while the video is playing I need to add an overlay on it. 1. How can I run a YouTube video in the native player? 2. If I play the video in a UIWebView, how can I detect that the video is playing, and how do I add an overlay on it? Answer 1: MPMoviePlayerViewController cannot directly play YouTube video URLs. You need to extract the URL before loading MPMoviePlayerViewController with it. You can see a working demo in my GitHub repo. Try this: https …

Is it possible to create new mp4 file from a single streaming byte range chunk?

女生的网名这么多〃 · submitted on 2019-12-06 00:24:55
Question: If I have a remote mp4 file on a server that supports byte ranges, is it possible to retrieve a single byte range and create a new, self-contained mp4 from that range data? If I try to write the returned byte-range data directly to an mp4 file using fs.createWriteStream(remoteFilename), it doesn't get the video metadata (duration, dimensions, etc.) that it needs to be playable. When I get a byte range that starts at 0 and ends at XX, the output mp4 is playable, but will have the duration meta …
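The behavior described above comes down to where the mp4's moov box lives: it holds the metadata (duration, track tables, dimensions), and it usually sits at the very start or the very end of the file, which is why a range starting at byte 0 often works while a mid-file range doesn't. A small sketch of checking a fetched chunk for a top-level moov box, simplified under the assumption of 32-bit box sizes (real files may use 64-bit "largesize" boxes):

```python
import struct

def top_level_boxes(data: bytes):
    """Yield (box_type, size) for each top-level MP4 box in the chunk.
    Simplified sketch: assumes 32-bit size fields (no 64-bit 'largesize')."""
    offset = 0
    while offset + 8 <= len(data):
        size, = struct.unpack(">I", data[offset:offset + 4])
        box_type = data[offset + 4:offset + 8].decode("ascii", "replace")
        if size < 8:  # malformed or unsupported size field; stop scanning
            break
        yield box_type, size
        offset += size

def has_moov(data: bytes) -> bool:
    """True if the chunk starts with complete top-level boxes including 'moov'."""
    return any(t == "moov" for t, _ in top_level_boxes(data))
```

If has_moov() is false for a mid-file range, the chunk is just raw mdat payload and cannot be made playable without re-muxing (or without separately fetching the moov box and remapping its sample offsets).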

gStreamer Video Recording Memory Leak

有些话、适合烂在心里 · submitted on 2019-12-06 00:22:43
Hi, I am trying to record an RTSP stream coming from a camera (H264 format). I am using the following gst command to record in MPEG4 format:

gst-launch -e rtspsrc location=rtsp://10.17.8.136/mediainput/h264 latency=100 ! decodebin ! ffenc_mpeg4 ! avimux ! filesink location=test.mp4

and this one for H264 format:

gst-launch-0.10 -e rtspsrc location="rtsp://10.17.8.136/mediainput/h264" latency=100 ! rtph264depay byte-stream=false ! capsfilter caps="video/x-h264,width=1920,height=1080,framerate=(fraction)25/1" ! mp4mux ! filesink location=testh264.mp4

Both record successfully, but I have observed that there is RAM …

Playing webm chunks as standalone video

一笑奈何 · submitted on 2019-12-05 22:36:34
I've built some code that uses the MediaRecorder API to capture audio and video, and then uses the ondataavailable callback to send the resulting webm blobs up to a server via websockets. The server then sends those blobs to a client via websockets, which assembles the video in a buffer using the Media Source Extensions API. This works well, except that if I want to start a stream partway through, I can't just send the latest blob, because a blob by itself is unplayable. Also, if I send the blobs out of order, the browser usually complains that the audio encoding doesn't match up …
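The reason a later blob is unplayable on its own is that MediaRecorder emits the WebM initialization segment (EBML header plus track info) only in its first blob; every subsequent blob is a bare cluster. A common workaround is to cache that first blob server-side and prepend it to whichever cluster a late joiner should start from. A byte-level sketch of the idea (function names are illustrative; only the EBML magic check is from the WebM/Matroska format):

```python
# WebM/Matroska streams start with the 4-byte EBML magic. MediaRecorder
# emits it only in the first blob (the initialization segment); later
# blobs are bare clusters and are unplayable until it is prepended.
EBML_MAGIC = b"\x1a\x45\xdf\xa3"

def is_init_segment(blob: bytes) -> bool:
    """Heuristic: does this blob begin a WebM stream?"""
    return blob.startswith(EBML_MAGIC)

def make_standalone(init_segment: bytes, cluster_blob: bytes) -> bytes:
    """Prepend the cached init segment so a mid-stream cluster can be
    fed to MSE (or saved to disk) as a self-contained WebM stream."""
    if not is_init_segment(init_segment):
        raise ValueError("first argument must be the initialization segment")
    return init_segment + cluster_blob
```

Note that clusters still reference stream-global timestamps, so a player may show a nonzero start time; for seamless mid-stream joins, restarting MediaRecorder periodically (so every segment is self-initializing) is the other common approach.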

Decode of live RTSP stream: large video lag using MediaPlayer on Android

僤鯓⒐⒋嵵緔 · submitted on 2019-12-05 22:06:48
Question: I'm playing a live RTSP stream from VLC on a PC to the Android MediaPlayer class (both on the same local network). It plays smoothly with no errors; the problem is that the decoded video on screen is around 5 to 7 seconds behind live. From debug output and callbacks I can see that the live data arrives on the device less than 1 s after starting mMediaPlayer.prepareAsync(). This is when the MediaPlayer class begins to work out the stream's format, dimensions, etc. Then just before video …

Video Streaming Over Websockets

非 Y 不嫁゛ · submitted on 2019-12-05 21:33:44
I am trying to build a mobile app that can stream video in both directions (i.e. something like video calling). I looked into WebRTC, but it's not yet ready for native mobile apps as such; in any case, what WebRTC does is allow the browser to capture camera and audio directly without requiring plugins. But in native mobile apps, capturing camera and audio isn't an issue; what's needed is basically a very low-latency, two-way transport layer. In many articles I have read about using WebRTC over websockets, so I thought I could stream the video using websockets. Is that correct, or am I missing …