video-streaming

DXGI Desktop Duplication: encoding frames to send them over the network

为君一笑 submitted on 2019-12-03 01:32:59
I'm trying to write an app which will capture a video stream of the screen and send it to a remote client. I've found that the best way to capture the screen on Windows is to use the DXGI Desktop Duplication API (available since Windows 8). Microsoft provides a neat sample which renders the duplicated frames to screen. Now, I've been wondering: what is the easiest, but still relatively fast, way to encode those frames and send them over the network? The frames come from AcquireNextFrame with a surface that contains the desktop bitmap, plus metadata describing the dirty and move regions that were updated
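The dirty-rect metadata is what makes this affordable: instead of re-encoding the whole desktop on every frame, you can restrict work to the regions AcquireNextFrame reports as changed. A minimal sketch of that bookkeeping, written in Python for clarity (the rect format mirrors DXGI's RECT of left/top/right/bottom; the function name is my own, not part of the API):

```python
def bounding_box(dirty_rects):
    """Merge DXGI-style dirty rects (left, top, right, bottom) into one
    bounding box, so only that sub-region of the surface needs re-encoding.
    Returns None when nothing changed (the frame can be skipped entirely)."""
    if not dirty_rects:
        return None
    left = min(r[0] for r in dirty_rects)
    top = min(r[1] for r in dirty_rects)
    right = max(r[2] for r in dirty_rects)
    bottom = max(r[3] for r in dirty_rects)
    return (left, top, right, bottom)

# Example: two dirty regions reported in the frame metadata
print(bounding_box([(10, 10, 50, 50), (40, 30, 120, 90)]))  # (10, 10, 120, 90)
```

In the real C++ capture loop the same logic would run over the DXGI_OUTDUPL_MOVE_RECT / dirty-rect arrays before handing the cropped region to the encoder.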

Stream video while downloading iOS

拈花ヽ惹草 submitted on 2019-12-03 01:32:47
Question: I am using iOS 7 and I have a .mp4 video that I need to download in my app. The video is large (~1 GB), which is why it is not included as part of the app. I want the user to be able to start watching the video as soon as it starts downloading. I also want the video to be cached on the iOS device so the user doesn't need to download it again later. Neither of the normal methods of playing videos (progressive download and live streaming) seems to let you cache the video, so I have
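The usual approach is a small local layer (e.g. an AVAssetResourceLoaderDelegate, or a loopback HTTP server) that answers the player's byte-range requests out of the partially downloaded cache file, stalling only when a requested range hasn't arrived yet. A toy sketch of that range bookkeeping, in Python with hypothetical names just to show the idea:

```python
def serve_range(downloaded_bytes, total_length, range_start, range_len):
    """Decide how much of a player's byte-range request can be answered
    from a partially downloaded cache file right now.
    Returns (bytes available immediately, bytes still pending download)."""
    end = min(range_start + range_len, downloaded_bytes, total_length)
    available = max(0, end - range_start)
    return available, range_len - available

# 500 of 1000 bytes downloaded; player asks for bytes 400-599:
# the first 100 can be served now, the rest once the download catches up.
print(serve_range(500, 1000, 400, 200))  # (100, 100)
```

The delegate would hand over the available bytes immediately and complete the loading request as the rest of the file arrives, which gives both play-while-downloading and a persistent cached copy.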

Video files in opencv

我只是一个虾纸丫 submitted on 2019-12-03 00:50:19
I want to read a video file (.avi or .mov) and detect motion and edges using OpenCV. Can you help me with code? I want to create a GUI in which we can select the video file, then carry out image-processing functions in OpenCV. karlphillip: How to read a video file: Read video file and display it on a window (C API); Read video file and display it on a window (C++ API); Read video file, convert it to grayscale, then display it on a window (C API). How to track/detect motion: OpenCV motion detection with tracking; How to do motion tracking of an object using video? The OpenCV Video Surveillance /
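For the motion-detection part, the simplest technique behind those links is frame differencing: compare consecutive grayscale frames and flag motion when enough pixels changed. A sketch using bare NumPy arrays (with OpenCV, the frames would come from cv2.VideoCapture / cv2.cvtColor; the thresholds here are arbitrary starting points, not canonical values):

```python
import numpy as np

def detect_motion(prev_frame, curr_frame, threshold=25, min_changed=0.01):
    """Frame-differencing motion detector.

    Flags motion when the fraction of pixels whose absolute grayscale
    difference exceeds `threshold` is larger than `min_changed`.
    Frames are 2-D uint8 arrays (grayscale images)."""
    # Widen to int16 so the subtraction cannot wrap around at 0/255
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed_fraction = np.count_nonzero(diff > threshold) / diff.size
    return changed_fraction > min_changed
```

In a real loop you would read frames with `ok, frame = cap.read()` and call this on each consecutive pair; edge detection would typically use cv2.Canny on the same grayscale frames.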

NodeJS, OpenCV and Streaming Images Using Net Socket

那年仲夏 submitted on 2019-12-03 00:39:57
My end goal is to stream video from my laptop to a server. I'm trying to accomplish this by using Node.js on both the laptop and the server. I use the OpenCV library to capture the video on the laptop and save it to a jpg file. I then read the file and convert it to base64 so that I can transport it over a net.Socket in Node. This is a continuous process: capture, encode, and send. Here is the server code for transmitting just one jpg file: var cv = require('opencv'); var fs = require('fs'); var net = require('net'); var camera = new cv.VideoCapture(0); var server = net.createServer(); server
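One pitfall with pushing images over a raw socket is framing: TCP is a byte stream, so the receiver needs a way to tell where one base64-encoded jpg ends and the next begins. A common fix is a length prefix before each payload. Sketched here in Python for clarity; the same logic ports directly to Node's Buffer API (the function names are my own):

```python
import base64
import struct

def frame_message(jpeg_bytes):
    """Length-prefix a base64 payload so the receiver knows where one
    image ends and the next begins on a raw TCP stream."""
    payload = base64.b64encode(jpeg_bytes)
    return struct.pack(">I", len(payload)) + payload

def unframe_messages(buffer):
    """Split a receive buffer into complete decoded payloads.
    Returns (list of jpeg byte strings, leftover partial bytes)."""
    frames = []
    while len(buffer) >= 4:
        (length,) = struct.unpack(">I", buffer[:4])
        if len(buffer) < 4 + length:
            break  # wait for more data before decoding this frame
        frames.append(base64.b64decode(buffer[4:4 + length]))
        buffer = buffer[4 + length:]
    return frames, buffer
```

Without framing, the receiver's 'data' events may contain half an image or two images glued together, which shows up as intermittently corrupt jpgs.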

What support for live streaming does the HTML5 video element have?

拜拜、爱过 submitted on 2019-12-03 00:37:15
Does the HTML5 video element support live-streaming protocols such as HLS or Smooth Streaming? Does it support the RTP/RTSP streaming protocols? Does it support RTMP? Are there specific browsers that do or don't support them? Jaruba: The HTML5 <video> tag has very limited support for video sources, and which sources work also depends on the browser your visitors use. Please see http://www.w3schools.com/html/html5_video.asp for a table of supported formats by browser. To sum it up, HTML5 video supports MP4 in all browsers, and Ogg and WebM in Firefox, Opera and Chrome. With that

Possible to stream videos using Amazon S3/CloudFront with HTML5 player?

本秂侑毒 submitted on 2019-12-03 00:29:51
Question: I want to use an HTML5 video player and stream videos. Is this possible with S3/CloudFront? I understand Amazon uses the RTMP streaming protocol and HTML5's video tag does not support RTMP. Is there any way to stream videos with HTML5 players? Answer 1: Much of what @Wayne Koorts posted provides the basis for a good answer. The disconnect, it seems, is that you can "stream" video via progressive download. This works with any HTML5-compatible video file, as he illustrated. In order to get the best

What Techniques Are Best To Live Stream iPhone Video Camera Data To a Computer?

有些话、适合烂在心里 submitted on 2019-12-03 00:12:24
Question: I would like to stream video from an iPhone camera to an app running on a Mac. Think sort of like video chat, but only one way: from the device to a receiver app (and it's not video chat). My basic understanding so far: you can use AVFoundation to get 'live' video camera data without saving to a file, but it is uncompressed data, and thus I'd have to handle compression on my own. There's no built-in AVCaptureOutput support for sending to a network location; I'd have to work this bit out on my own.

how to run a video clip using asp.net?

烂漫一生 submitted on 2019-12-02 23:37:06
Question: I want to embed a video player in ASP.NET which can support multiple video formats and can play the same clip in different browsers, i.e. an MP4/Ogg clip can be played in Firefox as well as in Chrome. Answer 1: Nothing to do with ASP.NET (this is pure HTML stuff); just write out an HTML video tag: html5rocks, diveintohtml5. Answer 2: May I suggest this simple code: <video width="320" height="240" controls> <source src="videoSource.mp4" type="video/mp4"> <source src="movie.ogg" type="video/ogg"> Your browser does

Simulate poor bandwidth in a testing environment (Mac OS X)?

一个人想着一个人 submitted on 2019-12-02 22:39:36
We have a customized Flash/HTML5 video player we use for users on our site. I'm currently fleshing out the experience for users who have 'suboptimal' bandwidth--basically we'd like the client side code to be able to detect poor user experience due to excessive buffering. I would like to test this "poor bandwidth" handling code in my local development environment. Does anyone know of good techniques for simulating "poor bandwidth" in a local environment for testing purposes? More specifically I have my local browser connecting to a virtual machine with instances of uWSGI, nginx, and python
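On OS X the usual system-level tools are dummynet (via ipfw) or Apple's Network Link Conditioner, but you can also throttle inside the dev stack itself. A token-bucket sketch in Python showing the core of a bandwidth-limiting shim you could put in front of the nginx/uWSGI VM (class and method names are my own, and the clock is passed in explicitly to keep the example deterministic):

```python
class TokenBucket:
    """Toy token-bucket rate limiter: a local proxy would delay each write
    until enough bytes-per-second budget has accumulated, which makes the
    player buffer exactly as it would on a genuinely slow link."""

    def __init__(self, rate_bytes_per_sec, now=0.0):
        self.rate = rate_bytes_per_sec
        self.tokens = 0.0   # bytes we are currently allowed to send
        self.last = now

    def seconds_until_sendable(self, nbytes, now):
        """Return 0.0 and consume budget if nbytes can be sent now,
        otherwise return how long the caller should sleep before retrying."""
        # Refill, capped at one second's worth of burst
        self.tokens = min(self.rate, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return 0.0
        return (nbytes - self.tokens) / self.rate
```

Wrapping each chunked response write in `time.sleep(bucket.seconds_until_sendable(len(chunk), time.monotonic()))` gives a crude but controllable "poor bandwidth" mode without touching the OS network stack.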

Decoding h264 in iOS 8 with video tool box

蓝咒 submitted on 2019-12-02 21:15:25
I need to decode an h264 stream and get the pixel buffers. I know it's possible with Video Toolbox on iOS 8. 1. How do I convert the h264 stream to a CMSampleBufferRef? 2. How do I use Video Toolbox to decode? I assume you get the stream in Annex B format; if it is already in AVCC format (read: MP4), then you can use the AssetReader and do not need to do much. For an Annex B stream (this is what people often call a raw h264 stream): extract the SPS/PPS NAL units and create a parameter set from them. You receive them periodically. They contain the information the decoder needs about how a frame is supposed to be decoded.
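The Annex B to AVCC step itself is pure byte shuffling: find the 00 00 01 / 00 00 00 01 start codes, split out the NAL units, and re-emit each one behind a 4-byte big-endian length, which is the framing CMBlockBuffer/CMSampleBuffer expect. A sketch in Python of that conversion (in the real app the same loop would run over NSData/CMBlockBuffer contents):

```python
import struct

def split_nals(stream):
    """Split an Annex B h264 stream on 3- or 4-byte start codes,
    returning the raw NAL units without the start codes."""
    nals = []
    i, start = 0, None
    n = len(stream)
    while i < n:
        if stream[i:i + 4] == b"\x00\x00\x00\x01":
            if start is not None:
                nals.append(stream[start:i])
            i += 4
            start = i
        elif stream[i:i + 3] == b"\x00\x00\x01":
            if start is not None:
                nals.append(stream[start:i])
            i += 3
            start = i
        else:
            i += 1
    if start is not None:
        nals.append(stream[start:])
    return nals

def annexb_to_avcc(stream):
    """Re-frame each NAL unit behind a 4-byte big-endian length prefix."""
    out = bytearray()
    for nal in split_nals(stream):
        out += struct.pack(">I", len(nal)) + nal
    return bytes(out)
```

The SPS (NAL type 7) and PPS (NAL type 8) units pulled out by `split_nals` are what you would feed to CMVideoFormatDescriptionCreateFromH264ParameterSets before creating the decompression session.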