video-streaming

How to get an MJPG video stream from Android IP Webcam using OpenCV

Submitted by 落花浮王杯 · 2019-12-04 08:42:39
Question: I am using the IP Webcam program on Android and receiving its stream on my PC over WiFi. What I want is to use OpenCV in Visual Studio (C++) to get that video stream. There is an option to get an MJPG stream at the following URL: http://MyIP:port/videofeed. How do I get it using OpenCV? Answer 1: Old question, but I hope this can help someone (same as my answer here). OpenCV expects a filename extension for its VideoCapture argument, even though one isn't always necessary (as in your case). You can "trick" it by
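The workaround the answer hints at is appending a dummy query parameter that ends in `.mjpg` so VideoCapture accepts the URL. A minimal Python sketch (the original question targets C++, but the same VideoCapture trick applies there; the host and port are placeholders):

```python
# Sketch of the "dummy extension" trick for opening an IP Webcam MJPG feed.
# "192.168.1.10" and 8080 are placeholder values for MyIP:port.

def mjpg_stream_url(host: str, port: int) -> str:
    """Build the IP Webcam video feed URL with a dummy .mjpg suffix,
    so OpenCV's VideoCapture sees a recognizable extension."""
    return f"http://{host}:{port}/videofeed?dummy=param.mjpg"

def open_stream(url: str):
    """Open the stream with OpenCV (requires opencv-python and a live camera)."""
    import cv2  # imported here so the URL helper works without OpenCV installed
    cap = cv2.VideoCapture(url)
    if not cap.isOpened():
        raise RuntimeError(f"could not open {url}")
    return cap

print(mjpg_stream_url("192.168.1.10", 8080))
```

In C++ the equivalent would be passing the same URL string to `cv::VideoCapture`.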

Where to get live video streaming examples (GStreamer)? [closed]

Submitted by 徘徊边缘 · 2019-12-04 08:35:35
As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance. Closed 7 years ago. Where to get live video + audio streaming examples (GStreamer)? For example, streaming from a file or a web camera to some web address. This page contains a few samples of how to do RTP streaming with GStreamer. It's not clear from your
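For a concrete starting point, the kind of RTP sample the answer refers to usually boils down to a `gst-launch-1.0` pipeline. A hedged sketch that assembles such a command in Python (the element names — videotestsrc, x264enc, rtph264pay, udpsink — are standard GStreamer 1.x plugins; host and port are placeholders):

```python
# Assemble a gst-launch-1.0 command line that streams an H.264 test
# pattern over RTP/UDP. This only builds the argument list; running it
# requires GStreamer and its plugins to be installed.
import shlex

def rtp_test_pipeline(host: str = "127.0.0.1", port: int = 5000) -> list:
    cmd = (
        f"gst-launch-1.0 videotestsrc ! x264enc tune=zerolatency "
        f"! rtph264pay ! udpsink host={host} port={port}"
    )
    return shlex.split(cmd)

print(" ".join(rtp_test_pipeline()))
```

Replacing `videotestsrc` with `v4l2src` (Linux) or a file source covers the "from file or web camera" case the question asks about.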

ONVIF : How to form the device web service address from the IP address of an NVT

Submitted by 元气小坏坏 · 2019-12-04 08:31:18
My question is about the ONVIF specification (http://www.onvif.org/imwp/download.asp?ContentID=18006). In section 5.10 it says: "A service is a collection of related ports. This specification does not mandate any service naming principles." Let's say that I have the IP address of an NVT (Network Video Transmitter, like an IP camera, for example). How do I form the address of the device management web service? This service is the entry point of the whole system. Thank you. Answer (Şafak): According to the official document (section 5.1.1), you can access the service at http://<IP address>/onvif/device_service
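So the entry-point address is formed by appending a fixed path to the device's IP. A trivial sketch of that, per the section 5.1.1 rule quoted in the answer:

```python
def device_service_url(ip: str) -> str:
    """Form the ONVIF device management service address from an NVT's IP,
    per the fixed path given in ONVIF Core Spec section 5.1.1."""
    return f"http://{ip}/onvif/device_service"

print(device_service_url("192.168.1.20"))
```

From this entry point, a GetCapabilities request returns the addresses of the device's other services (media, events, etc.).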

Streaming Video From Android

Submitted by 故事扮演 · 2019-12-04 07:57:02
Question: I'm trying to stream video from an Android phone so that it can be watched in a media player. I've been looking at http://www.mattakis.com/blog/kisg/20090708/broadcasting-video-with-android-without-writing-to-the-file-system, which seems to be a dead end since it sends the raw file data, not a streamable format. Then I tried using some code from SipDroid, more specifically parts of VideoCamera.java, RtpPacket.java and RtpSocket.java, which gives a stream over UDP; however, this is not playable
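Classes like SipDroid's RtpPacket.java build the 12-byte RTP fixed header (RFC 3550) in front of each media payload; a stream like this typically also needs an SDP description before a player will accept it. A hedged Python sketch of just the header layout (field values here are illustrative; payload type 96 is a common dynamic type for H.263/H.264):

```python
# Pack the 12-byte RTP fixed header: version/flags, marker+payload type,
# sequence number, timestamp, and SSRC (RFC 3550 layout).
import struct

def rtp_header(seq: int, timestamp: int, ssrc: int,
               payload_type: int = 96, marker: bool = False) -> bytes:
    version = 2
    byte0 = version << 6                      # no padding, no extension, no CSRCs
    byte1 = (int(marker) << 7) | payload_type  # marker bit + payload type
    return struct.pack("!BBHII", byte0, byte1, seq, timestamp, ssrc)

print(len(rtp_header(1, 1000, 0x12345678)))  # header is 12 bytes
```

Each outgoing UDP datagram would be this header followed by the encoded video payload.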

Multiple Video Files Simultaneously Playing in Android

Submitted by 无人久伴 · 2019-12-04 07:55:25
I had asked the same question for iOS on the iPad, but now I am trying to see if it's possible on Android. The response I received so far is that it is not possible on iOS. If it's possible on Android, please explain which API is used. Here's my original question for reference: Original Posting on iOS for Multiple Videos Playing Simultaneously on an iPad. I tried to do so (2 VideoViews), but only one video played. This is because of the Linux decoder, which may be used as a single instance only (from the stack-trace info). For now, to achieve playback of multiple videos, I am trying to use FFmpeg as the video decoder and OpenGL for rendering

Live streaming video latency

Submitted by 匆匆过客 · 2019-12-04 07:03:01
Trying to determine what's "most" responsible for latency: the round trip my video makes from my encoder, to my server, and back down to the player in my browser. I'm at about 12 seconds right now with a player I like. Is it buffering in my player? Buffering on the way out by FMLE? The reason I ask is that I feel I've eliminated other culprits with my little test scenario outlined below. Also, all else being equal, swapping in other players produces the greatest variance in latency: one takes it down to 4 seconds, but I can't get any lower than that. Eliminating other culprits: - Bad network?

Stream and transcode video on the fly with Django

Submitted by 随声附和 · 2019-12-04 06:30:47
Question: I have a model that uses models.FileField(), which I then display back to the user so they may click the link and have the file rendered in their browser. The user can upload various types of files. The problem is, I'd like to handle large AVIs differently and have the file streamed to the user. My requirement is simply to stream/transcode video files from the media_root directory to an end user's browser, preferably in a Mac-friendly format. It would be for a couple of users at most. I've searched
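The usual Django approach for this is to wrap a chunked file iterator in a StreamingHttpResponse rather than loading the whole file into memory. A hedged sketch; the iterator below is framework-agnostic so it runs without Django, and the Django-specific part is shown as comments:

```python
# Chunked iteration over a binary file-like object, the pattern Django's
# StreamingHttpResponse expects for serving large files.
import io

CHUNK_SIZE = 8192

def iter_chunks(fobj, chunk_size: int = CHUNK_SIZE):
    """Yield fixed-size chunks from a binary file-like object."""
    while True:
        data = fobj.read(chunk_size)
        if not data:
            break
        yield data

# In a Django view this would look roughly like (illustrative, needs Django):
# from django.http import StreamingHttpResponse
# def stream_video(request, path):
#     resp = StreamingHttpResponse(iter_chunks(open(path, "rb")),
#                                  content_type="video/mp4")
#     return resp

demo = io.BytesIO(b"x" * 20000)
print(len(list(iter_chunks(demo))))  # 20000 bytes -> 3 chunks of <= 8192
```

Actual on-the-fly transcoding would mean piping the iterator through an external encoder such as ffmpeg rather than reading the file directly.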

Converting HTML5 video - what software to use?

Submitted by 雨燕双飞 · 2019-12-04 06:23:43
Question: We're planning to use a fullscreen HTML5 video on a website. I've read that MPEG-4/H.264 might be the best format at the moment. I have the video file available as 1080p MP4; it's 41.2 MB in size. Since the video should play in relatively good quality and stream really fast, how can I optimize the video file? Any tips or tricks for me? Is 1080p needed for a fullscreen video on desktop, or is 720p enough? What should the output size of a fullscreen desktop video be? Regards, Matt. Answer 1: a lot
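A typical optimization for fast-starting web video is an H.264 re-encode at 720p with the MP4 `moov` atom moved to the front (`-movflags +faststart`), so playback can begin before the download finishes. A hedged sketch that builds such an ffmpeg command line (the CRF and preset values are illustrative starting points, not rules; filenames are placeholders):

```python
# Build an ffmpeg command for a web-friendly H.264/AAC MP4 encode.
# This only assembles the argument list; running it requires ffmpeg.
import shlex

def web_encode_cmd(src: str, dst: str, height: int = 720, crf: int = 23) -> list:
    cmd = (
        f"ffmpeg -i {shlex.quote(src)} -c:v libx264 -preset slow -crf {crf} "
        f"-vf scale=-2:{height} -c:a aac -b:a 128k "
        f"-movflags +faststart {shlex.quote(dst)}"
    )
    return shlex.split(cmd)

print(" ".join(web_encode_cmd("input-1080p.mp4", "output-720p.mp4")))
```

Raising the CRF value shrinks the file at the cost of quality; `scale=-2:720` keeps the aspect ratio with an even width, as libx264 requires.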

How to play a YouTube video without UIWebView, or detect the video player when a YouTube video starts playing in a web view?

Submitted by 爷，独闯天下 · 2019-12-04 06:01:15
I need to play a YouTube video in my iOS application, and while the video is playing I need to add an overlay on top of it. 1. How can I play a YouTube video in the native player? 2. If I play the video in a UIWebView, how can I detect that the video is playing, and how do I add an overlay on the video? MPMoviePlayerViewController cannot directly play YouTube video URLs; you need to extract the URL before loading MPMoviePlayerViewController with it. You can see a working demo in my GitHub repo: https://github.com/DpzAtMicRO/IOSYoutubePlayer. LBYoutubeExtractor extracts the YouTube URL with JSON. Please do

Is it possible to create a new MP4 file from a single streamed byte-range chunk?

Submitted by 旧城冷巷雨未停 · 2019-12-04 05:42:31
If I have a remote MP4 file on a server that supports byte ranges, is it possible to retrieve a single byte range and create a new, self-contained MP4 from that range's data? If I try to write the returned byte-range data directly to an MP4 file using fs.createWriteStream(remoteFilename), it doesn't get the video metadata (duration, dimensions, etc.) that it needs to be playable. When I get a byte range that starts at 0 and ends at XX, the output MP4 is playable, but it will have the duration metadata of the entire video and will freeze the screen when the byte range is done for the
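The underlying issue is that a mid-file byte range lacks the MP4 `moov` atom (the index carrying duration and dimensions), so raw byte slicing cannot yield a self-contained file; a clip must be re-muxed instead. A hedged Python sketch of the Range-request side (the question itself uses Node.js; the header format is standard HTTP/1.1):

```python
# Build an HTTP/1.1 Range header for an inclusive byte interval.
# Writing the returned bytes straight to disk reproduces the problem
# described above: without the 'moov' atom the file is unplayable.

def range_header(start: int, end: int) -> dict:
    """Return headers requesting bytes start..end (inclusive) of a resource."""
    return {"Range": f"bytes={start}-{end}"}

print(range_header(0, 1023))

# To get a genuinely self-contained clip, re-mux rather than slice bytes,
# e.g. with an illustrative ffmpeg invocation (not part of the question):
#   ffmpeg -ss 10 -i http://server/file.mp4 -t 5 -c copy clip.mp4
```

A range starting at byte 0 happens to include the file header (and often the `moov` atom), which is why that case plays back, just with the wrong duration.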