web-mediarecorder

Combine audio and video streams into one file with MediaRecorder [duplicate]

Posted by 和自甴很熟 on 2021-02-18 06:24:47
Question: This question already has answers here: MediaStream Capture Canvas and Audio Simultaneously (2 answers). Closed 2 years ago. I am making a small interactive animation/game (on canvas with PixiJS) and wish to give users an option to save the rendered animation. After doing my research, MediaRecorder appears to be the API I should use to record and render the video. However, the MediaRecorder constructor only accepts a single stream as its source. How can I merge additional streams (audio…
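
The usual fix is to build a new MediaStream out of the individual tracks and record that. A minimal sketch, assuming a canvas element and microphone permission; the function name, the 30 fps capture rate, and the mime type are illustrative choices, not requirements:

```javascript
// Combine the canvas's video track with the microphone's audio track
// into ONE MediaStream, then hand that single stream to MediaRecorder.
async function recordCanvasWithAudio(canvas) {
  const videoStream = canvas.captureStream(30); // 30 fps is an arbitrary choice
  const audioStream = await navigator.mediaDevices.getUserMedia({ audio: true });

  // A MediaStream can be constructed from any set of existing tracks:
  const combined = new MediaStream([
    ...videoStream.getVideoTracks(),
    ...audioStream.getAudioTracks(),
  ]);

  const recorder = new MediaRecorder(combined, { mimeType: 'video/webm' });
  const chunks = [];
  recorder.ondataavailable = (e) => chunks.push(e.data);
  recorder.onstop = () => {
    const blob = new Blob(chunks, { type: recorder.mimeType });
    // download or play back `blob` here
  };
  recorder.start();
  return recorder;
}
```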

MediaRecorder switch video tracks

Posted by 你说的曾经没有我的故事 on 2020-07-09 10:18:31
Question: I am using the MediaRecorder API to record videos in web applications. The application has an option to switch between the camera and the screen. I am using a canvas to augment stream recording. The logic involves capturing the stream from the camera and redirecting it to a video element. This video is then rendered on the canvas, and the stream from the canvas is passed to MediaRecorder. What I noticed is that switching from screen to video (and vice versa) works fine as long as the user doesn't switch…
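
A common way to make this setup robust is to keep MediaRecorder attached to one never-changing canvas stream and only swap which video element gets drawn onto the canvas, so the recorder never observes a track change. A sketch under that assumption; `startCanvasRecorder` and `switchSource` are made-up names:

```javascript
// MediaRecorder records the canvas stream for the whole session; switching
// camera <-> screen only changes what the draw loop copies onto the canvas.
function startCanvasRecorder(canvas) {
  const ctx = canvas.getContext('2d');
  let currentVideo = null; // the <video> element currently being drawn

  function switchSource(videoEl) {
    // call with the camera video or the screen-capture video
    currentVideo = videoEl;
  }

  function drawLoop() {
    if (currentVideo) {
      ctx.drawImage(currentVideo, 0, 0, canvas.width, canvas.height);
    }
    requestAnimationFrame(drawLoop);
  }
  drawLoop();

  const recorder = new MediaRecorder(canvas.captureStream(30));
  recorder.start(1000); // emit a chunk every second
  return { recorder, switchSource };
}
```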

PCM support in WebM, and Chrome's WebM implementation

Posted by 家住魔仙堡 on 2020-07-03 07:30:25
Question: Does WebM support PCM as the audio codec? I didn't think it did, but I see on the WebM documentation page that there is support for a BitDepth field with the following comment: "BitDepth: Bits per sample, mostly used for PCM." If WebM does support PCM, does Chrome's implementation? And if it does, what is the appropriate content type for use with MediaRecorder? These all return false: MediaRecorder.isTypeSupported('video/webm;codecs=h264,pcm'); MediaRecorder.isTypeSupported('video/webm;codecs…
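
A quick way to answer the Chrome half of the question is to probe isTypeSupported with several codec strings. A sketch; `probeRecorderTypes` is an illustrative helper, and the comment about Chrome reflects typical observed behavior, not a spec guarantee:

```javascript
// Report which container/codec strings the current browser will record.
// Chrome records WebM with opus or vorbis audio; in practice it rejects
// 'pcm' in a WebM container even though Matroska itself can carry PCM.
function probeRecorderTypes(types) {
  return types.map((t) => [t, MediaRecorder.isTypeSupported(t)]);
}

// Example (run in a browser console):
// probeRecorderTypes([
//   'video/webm;codecs=vp8,opus',
//   'video/webm;codecs=vp9,opus',
//   'video/webm;codecs=h264,pcm',
//   'audio/webm;codecs=pcm',
// ]);
```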

How to play WEBM files individually which are created by MediaRecorder

Posted by 霸气de小男生 on 2020-06-22 03:36:50
Question: For recording audio and video, I am creating webm files in the ondataavailable handler of the MediaRecorder API. I have to play each created webm file individually. The MediaRecorder API inserts header information into the first chunk (webm file) only, so the rest of the chunks do not play individually without that header information. As suggested in link 1 and link 2, I have extracted the header information from the first chunk: // for the most regular webm files, the header information exists // between 0 to 189 Uint8…
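
The repair the excerpt describes can be sketched as follows. The 189-byte header length comes from the question's own comment and is not a WebM invariant; robust code would instead parse the EBML structure to find where the first Cluster element begins:

```javascript
// Take the initialization header from the first chunk and prepend it to
// every later chunk so each one becomes a standalone, playable webm file.
const HEADER_END = 189; // assumed header length (bytes), per the question

function splitHeader(firstChunkBytes) {
  // keep only the header portion of the first chunk
  return firstChunkBytes.slice(0, HEADER_END);
}

function makePlayable(headerBytes, chunkBytes) {
  // header + chunk concatenated into one byte buffer
  const out = new Uint8Array(headerBytes.length + chunkBytes.length);
  out.set(headerBytes, 0);
  out.set(chunkBytes, headerBytes.length);
  return out;
}
```

Each later chunk must still begin with a keyframe-led cluster to decode on its own, which MediaRecorder does not guarantee by default.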

Change playout delay in WebRTC stream

Posted by 元气小坏坏 on 2020-06-10 03:27:30
Question: I'm trying to cast a live MediaStream (eventually from the camera) from peerA to peerB, and I want peerB to receive the live stream in real time and then replay it with an added delay. Unfortunately, it isn't possible to simply pause the stream and resume with play, since it jumps forward to the live moment. So I have figured out that I can use MediaRecorder + SourceBuffer to rewatch the live stream: record the stream, append the buffers to MSE (SourceBuffer), and play it 5 seconds later. This…
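
One way to realize the MediaRecorder + SourceBuffer idea is to record the incoming stream in one-second chunks, append each chunk to a SourceBuffer, and delay the call to play(). A sketch assuming the remote stream is recordable as vp8/opus; `playDelayed` is an illustrative name:

```javascript
// Buffer the live stream through MSE and start playback `delayMs` late.
function playDelayed(remoteStream, videoEl, delayMs = 5000) {
  const mediaSource = new MediaSource();
  videoEl.src = URL.createObjectURL(mediaSource);

  mediaSource.addEventListener('sourceopen', () => {
    const sb = mediaSource.addSourceBuffer('video/webm;codecs=vp8,opus');
    const queue = []; // SourceBuffer accepts one append at a time
    sb.addEventListener('updateend', () => {
      if (queue.length && !sb.updating) sb.appendBuffer(queue.shift());
    });

    const recorder = new MediaRecorder(remoteStream, {
      mimeType: 'video/webm;codecs=vp8,opus',
    });
    recorder.ondataavailable = async (e) => {
      const buf = await e.data.arrayBuffer();
      if (sb.updating || queue.length) queue.push(buf);
      else sb.appendBuffer(buf);
    };
    recorder.start(1000); // one chunk per second
    setTimeout(() => videoEl.play(), delayMs);
  });
}
```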

MediaRecorder API simple example / “hello world”

Posted by 时间秒杀一切 on 2019-12-25 14:58:09
Question: Here's a simple example for the MediaRecorder API:

(async function() {
  let chunks = [];
  let stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: false });
  let mediaRecorder = new MediaRecorder(stream);

  // record for 3 seconds:
  mediaRecorder.start();
  setTimeout(() => { mediaRecorder.stop(); }, 3000);

  mediaRecorder.ondataavailable = function(e) { chunks.push(e.data); };
  mediaRecorder.onstop = async function() {
    let blob = new Blob(chunks, { type: mediaRecorder.mimeType });
    let …
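
The excerpt cuts off inside the onstop handler. A completed version of the same idea, wrapped in a named function here so it can be triggered from a user gesture (autoplay policies typically require one); the playback-via-Audio step is one reasonable ending, not the asker's original code:

```javascript
// Record 3 seconds of microphone audio, then play it back.
async function recordThreeSeconds() {
  const chunks = [];
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: false });
  const mediaRecorder = new MediaRecorder(stream);

  mediaRecorder.ondataavailable = (e) => chunks.push(e.data);
  mediaRecorder.onstop = () => {
    const blob = new Blob(chunks, { type: mediaRecorder.mimeType });
    const audio = new Audio(URL.createObjectURL(blob));
    audio.play();                                // play back the recording
    stream.getTracks().forEach((t) => t.stop()); // release the microphone
  };

  mediaRecorder.start();
  setTimeout(() => mediaRecorder.stop(), 3000); // record for 3 seconds
}
```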

Playing webm chunks as standalone video

Posted by 情到浓时终转凉″ on 2019-12-22 11:02:32
Question: I've built some code that gets the MediaRecorder API to capture audio and video, and then uses the ondataavailable handler to send the corresponding webm file blobs up to a server via WebSockets. The server then sends those blobs to a client via WebSockets, which puts the video together in a buffer using the Media Source Extensions API. This works well, except that if I want to start a stream partway through, I can't just send the latest blob, because a blob by itself is unplayable. Also,…
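
One common approach is for the server to cache the first blob, which carries the WebM initialization segment, and send it ahead of wherever a late joiner starts. A minimal sketch (`ChunkRelay` is an illustrative name); note it assumes the chosen resume chunk begins with a keyframe-led cluster, which MediaRecorder does not guarantee:

```javascript
// Server-side helper: remember the stream's first chunk so mid-stream
// joiners can be sent [init chunk, latest chunk] instead of an
// unplayable lone chunk.
class ChunkRelay {
  constructor() {
    this.initChunk = null; // first blob emitted by MediaRecorder
  }
  accept(chunk) {
    if (!this.initChunk) this.initChunk = chunk;
    return chunk; // forward to already-connected clients as usual
  }
  // what to send a client that joins partway through:
  catchUpSequence(latestChunk) {
    return this.initChunk ? [this.initChunk, latestChunk] : [latestChunk];
  }
}
```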