mediarecorder-api

Getting the mimeType from a MediaRecorder that wasn't initialized with a mimeType

旧巷老猫 submitted on 2019-12-11 06:14:46
Question: I'm using the MediaRecorder API to record some media on a page. In my MediaRecorder initialization, I'm not specifying a content type, as I do not require anything in particular; the browser can choose what it wants. var mediaRecorder = new MediaRecorder(stream); However, when it comes time to save that recording, I need to know the mimeType for the blob so that I can determine a reasonable file name extension. The MediaRecorder.mimeType property is what I want, but it is an empty string. It
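
A minimal sketch of one way around this, assuming stream is an existing MediaStream (e.g. from getUserMedia): once recording has started and data is flowing, both the recorder's mimeType property and the type of the delivered Blob are typically populated, even without passing a mimeType to the constructor.

var mediaRecorder = new MediaRecorder(stream);

mediaRecorder.ondataavailable = function (event) {
  // Both of these usually report the container the browser picked,
  // e.g. "video/webm;codecs=vp8,opus":
  console.log(mediaRecorder.mimeType);
  console.log(event.data.type); // the Blob carries the same type

  // Derive a reasonable file extension from whichever is non-empty:
  var type = event.data.type || mediaRecorder.mimeType;
  var extension = type.indexOf('webm') !== -1 ? 'webm'
                : type.indexOf('ogg') !== -1 ? 'ogg'
                : 'bin';
  // ...use extension when naming the saved file
};

mediaRecorder.start();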

Record the Firefox tab as a video like screencastify on chrome

旧街凉风 submitted on 2019-12-06 15:56:35
Question: I would like to record a Firefox browser tab through a browser extension, like the Screencastify extension does in Chrome. Regarding the recording session of the Chrome extension, the chrome.tabCapture API is used to get the stream of the currently active tab, and RecordRTC.js from the WebRTC Experiment project is used to record that stream. Likewise, is there any API in Mozilla Firefox to get the stream of a tab in the Firefox browser? P.S.: I am asking about recording the Firefox tab, not recording the screen or window or
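
For reference, a minimal sketch of the Chrome-side approach the question describes, but feeding chrome.tabCapture straight into a MediaRecorder instead of RecordRTC; the manifest is assumed to declare the "tabCapture" permission, and the handling of the finished blob is a placeholder.

// Chrome extension background script (sketch)
chrome.tabCapture.capture({ audio: true, video: true }, function (stream) {
  if (!stream) {
    console.error(chrome.runtime.lastError);
    return;
  }
  var recorder = new MediaRecorder(stream);
  var chunks = [];
  recorder.ondataavailable = function (e) { chunks.push(e.data); };
  recorder.onstop = function () {
    var blob = new Blob(chunks, { type: recorder.mimeType });
    // ...upload the blob or create a download URL for it
  };
  recorder.start(1000); // deliver a chunk roughly every second
});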

Record the Firefox tab as a video like screencastify on chrome

徘徊边缘 submitted on 2019-12-04 22:23:20
I would like to record a Firefox browser tab through a browser extension, like the Screencastify extension does in Chrome. Regarding the recording session of the Chrome extension, the chrome.tabCapture API is used to get the stream of the currently active tab, and RecordRTC.js from the WebRTC Experiment project is used to record that stream. Likewise, is there any API in Mozilla Firefox to get the stream of a tab in the Firefox browser? P.S.: I am asking about recording the Firefox tab, not recording the screen or window or through a cam. There are several privileged APIs that let you capture parts of windows or XUL elements
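
Separately from the privileged APIs the answer goes on to describe, one unprivileged workaround sketch (an assumption on my part, not taken from the answer) is to snapshot the active tab repeatedly with tabs.captureVisibleTab, paint the frames onto a canvas, and record the canvas stream with MediaRecorder; the frame rate, canvas size, and required permissions are all assumptions.

// Firefox WebExtension background script (sketch only)
var canvas = document.createElement('canvas');
canvas.width = 1280;
canvas.height = 720;
var ctx = canvas.getContext('2d');

var stream = canvas.captureStream(10); // ~10 fps canvas stream
var recorder = new MediaRecorder(stream);
recorder.ondataavailable = function (e) { /* collect or upload e.data */ };
recorder.start(1000);

setInterval(function () {
  browser.tabs.captureVisibleTab().then(function (dataUrl) {
    var img = new Image();
    img.onload = function () { ctx.drawImage(img, 0, 0, canvas.width, canvas.height); };
    img.src = dataUrl;
  });
}, 100);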

Live-Streaming webcam webm stream (using getUserMedia) by recording chunks with MediaRecorder over WEB API with WebSockets and MediaSource

寵の児 submitted on 2019-12-04 20:19:52
Question: I'm trying to broadcast a webcam's video to other clients in real time, but I encounter some problems when viewers start watching in the middle. For this purpose, I get the webcam's stream using getUserMedia (and all its siblings). Then, on a button click, I start recording the stream and send each segment/chunk/whatever you call it to the broadcaster's WebSocket backend: var mediaRecorder = new MediaRecorder(stream); mediaRecorder.start(1000); mediaRecorder.ondataavailable = function
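
A minimal broadcaster-side sketch of this setup, with the WebSocket URL and the server protocol as assumptions; each chunk handed to ondataavailable is sent to the backend as-is.

var socket = new WebSocket('wss://example.com/broadcast'); // placeholder URL

navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then(function (stream) {
  var mediaRecorder = new MediaRecorder(stream);
  mediaRecorder.ondataavailable = function (event) {
    if (event.data.size > 0 && socket.readyState === WebSocket.OPEN) {
      socket.send(event.data); // the server is assumed to relay this Blob to viewers
    }
  };
  mediaRecorder.start(1000); // request a chunk roughly every second
});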

How can I use a MediaRecorder object in an Angular2 application?

跟風遠走 submitted on 2019-12-04 02:58:20
Question: I'm building a small Angular2 app and I'm trying to use a MediaRecorder object (https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder) like so: var mediaRecorder = new MediaRecorder(stream); However, TypeScript is telling me it cannot find name 'MediaRecorder'. I'm guessing this is down to my TypeScript configuration, which I pulled directly from the QuickStart guide (https://angular.io/docs/ts/latest/cookbook/visual-studio-2015.html). The configuration looks like this: {
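
A common workaround, sketched under the assumption that the project has no MediaRecorder typings: add an ambient declaration (for example in a media-recorder.d.ts file) so the compiler accepts the name, and the original code then compiles unchanged.

declare var MediaRecorder: any;

var mediaRecorder = new MediaRecorder(stream);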

Live-Streaming webcam webm stream (using getUserMedia) by recording chunks with MediaRecorder over WEB API with WebSockets and MediaSource

我们两清 submitted on 2019-12-03 13:05:36
I'm trying to broadcast a webcam's video to other clients in real time, but I encounter some problems when viewers start watching in the middle. For this purpose, I get the webcam's stream using getUserMedia (and all its siblings). Then, on a button click, I start recording the stream and send each segment/chunk/whatever you call it to the broadcaster's WebSocket backend: var mediaRecorder = new MediaRecorder(stream); mediaRecorder.start(1000); mediaRecorder.ondataavailable = function (event) { uploadVideoSegment(event); //wrap with a blob and call socket.send(...) } On the server side (Web
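
A matching viewer-side sketch using MediaSource, with the WebSocket URL, codec string, and framing as assumptions. Note that chunks appended to the SourceBuffer are only decodable if the viewer also received the initialization segment MediaRecorder places at the very start of the stream, which is the likely reason viewers who join mid-stream have problems.

var video = document.querySelector('video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
  var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  var queue = [];

  sourceBuffer.addEventListener('updateend', function () {
    if (queue.length && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });

  var socket = new WebSocket('wss://example.com/watch'); // placeholder URL
  socket.binaryType = 'arraybuffer';
  socket.onmessage = function (event) {
    if (sourceBuffer.updating || queue.length) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };
});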

How can I use a MediaRecorder object in an Angular2 application?

雨燕双飞 submitted on 2019-12-01 15:36:38
I'm building a small Angular2 app and I'm trying to use a MediaRecorder object (https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder) like so: var mediaRecorder = new MediaRecorder(stream); However, TypeScript is telling me it cannot find name 'MediaRecorder'. I'm guessing this is down to my TypeScript configuration, which I pulled directly from the QuickStart guide (https://angular.io/docs/ts/latest/cookbook/visual-studio-2015.html). The configuration looks like this: { "compilerOptions": { "target": "es5", "module": "commonjs", "moduleResolution": "node", "sourceMap": true,
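
An alternative sketch that avoids a separate declaration file: read the constructor off window and cast it to any, which keeps the TypeScript compiler quiet while using the same runtime object (the cast style is an assumption; adjust to the project's TypeScript version).

const RecorderCtor = (window as any).MediaRecorder;
if (RecorderCtor) {
  const mediaRecorder = new RecorderCtor(stream);
  mediaRecorder.start();
}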

How can we mix canvas stream with audio stream using mediaRecorder [duplicate]

血红的双手。 submitted on 2019-11-29 14:18:38
This question already has an answer here: MediaStream Capture Canvas and Audio Simultaneously (2 answers). I have a canvas stream obtained via canvas.captureStream(). I have another video stream from a WebRTC video call. Now I want to mix the canvas stream with the audio tracks of the video stream. How can I do that? Use the MediaStream constructor, available in Firefox and Chrome 56, to combine tracks into a new stream: let stream = new MediaStream([videoTrack, audioTrack]); The following works for me in Firefox (use an https fiddle in Chrome, though it errors on recording): navigator.mediaDevices.getUserMedia({audio
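
A slightly fuller sketch of the combination step the answer shows, assuming canvasStream comes from canvas.captureStream() and remoteStream is the WebRTC call stream mentioned in the question.

var videoTrack = canvasStream.getVideoTracks()[0];
var audioTrack = remoteStream.getAudioTracks()[0];

var mixed = new MediaStream([videoTrack, audioTrack]);

var recorder = new MediaRecorder(mixed);
recorder.ondataavailable = function (e) { /* collect e.data blobs */ };
recorder.start();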

Specifying codecs with MediaRecorder

匆匆过客 submitted on 2019-11-29 04:34:15
How can I specify the codecs used with the MediaRecorder API? The only option I see is mimeType, which isn't really sufficient. Cramming the codecs into the mimeType option doesn't seem to work. var mediaRecorder = new MediaRecorder(outputMediaStream, { mimeType: 'video/webm; codecs="opus,vp8"' }); This results in a WebM stream with Vorbis and VP8: FFMPEG STDERR: Input #0, matroska,webm, from 'pipe:': Metadata: encoder : QTmuxingAppLibWebM-0.0.1 Duration: N/A, start: 0.000000, bitrate: N/A Stream #0:0(eng): Video: vp8, yuv420p, 640x360, SAR 1:1 DAR 16:9, 30 fps, 30 tbr, 1k tbn, 1k tbc
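
A hedged sketch of how this is commonly handled in current browsers: pass the codecs inside the mimeType option, but probe support up front with MediaRecorder.isTypeSupported and fall back to the browser default (then inspect mediaRecorder.mimeType afterwards) when the combination is rejected.

var mimeType = 'video/webm; codecs="vp8,opus"';
var mediaRecorder;

if (typeof MediaRecorder.isTypeSupported === 'function' && MediaRecorder.isTypeSupported(mimeType)) {
  mediaRecorder = new MediaRecorder(outputMediaStream, { mimeType: mimeType });
} else {
  mediaRecorder = new MediaRecorder(outputMediaStream); // let the browser choose
}

mediaRecorder.start();
console.log(mediaRecorder.mimeType); // what was actually selected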

How to convert array of png image data into video file

你。 submitted on 2019-11-27 09:08:47
I am getting frames from a canvas through canvas.toDataURL(). However, now I have an array of PNG images, but I want a video file. How do I do this? var canvas = document.getElementById("mycanvaselementforvideocapturing"); var pngimages = []; ... setInterval(function(){pngimages.push(canvas.toDataURL())}, 1000); For full browser support, you'll have to send your image batch to the server and then use some server-side program to do the encoding; FFmpeg might be able to do it. But in newer browsers the canvas.captureStream method has been implemented. It will convert your canvas drawings to
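
A minimal sketch of that captureStream route, reusing the canvas id from the question; the recording duration and what is done with the resulting blob are placeholders.

var canvas = document.getElementById("mycanvaselementforvideocapturing");
var stream = canvas.captureStream(25); // ~25 fps, adjust as needed
var recorder = new MediaRecorder(stream);
var chunks = [];

recorder.ondataavailable = function (e) { chunks.push(e.data); };
recorder.onstop = function () {
  var blob = new Blob(chunks, { type: recorder.mimeType || 'video/webm' });
  var url = URL.createObjectURL(blob);
  // e.g. set url as the src of a <video> element or offer it as a download
};

recorder.start();
// ...keep drawing on the canvas while recording...
setTimeout(function () { recorder.stop(); }, 5000); // stop after 5 seconds, for example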