getUserMedia

NotReadableError: Could not start source

半世苍凉 submitted on 2019-11-26 20:26:57
Question: I have added this piece of code to my project:

    if (navigator.mediaDevices === undefined) {
      navigator.mediaDevices = {};
    }
    if (navigator.mediaDevices.getUserMedia === undefined) {
      navigator.mediaDevices.getUserMedia = function (constraints) {
        var getUserMedia = (
          navigator.getUserMedia ||
          navigator.webkitGetUserMedia ||
          navigator.mozGetUserMedia
        );
        if (!getUserMedia) {
          return Promise.reject(new Error('getUserMedia is not implemented in this browser'));
        }
        return new Promise(function (resolve, reject) {
          getUserMedia.call(navigator, constraints, resolve, reject);
        });
      };
    }
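The polyfill above only makes getUserMedia callable; it does not explain the NotReadableError itself, which typically means the camera or microphone could not be started (most often because another application or tab is already using it). A minimal sketch of how the error names can be mapped to readable messages when calling the API (the helper names are my own, not from the question):

    // Hypothetical helper: map common getUserMedia failure modes to
    // human-readable messages. NotReadableError usually means the
    // device is already in use or blocked at the OS level.
    function describeGetUserMediaError(err) {
      switch (err.name) {
        case 'NotReadableError':
          return 'Could not start source: the device is in use by another application or blocked by the OS.';
        case 'NotAllowedError':
          return 'Permission was denied by the user or browser policy.';
        case 'NotFoundError':
          return 'No matching capture device was found.';
        default:
          return 'getUserMedia failed: ' + err.name;
      }
    }

    function startCamera(videoElement) {
      return navigator.mediaDevices.getUserMedia({ video: true, audio: false })
        .then(function (stream) {
          videoElement.srcObject = stream; // preferred over createObjectURL(stream)
          return stream;
        })
        .catch(function (err) {
          console.error(describeGetUserMediaError(err));
          throw err;
        });
    }

Closing other applications that hold the camera, or stopping a previously obtained stream's tracks before requesting a new one, is the usual fix for NotReadableError.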

What constraints should I pass to getUserMedia() in order to get two video mediaStreamTracks?

不羁的心 submitted on 2019-11-26 18:30:28
Question: I can get media devices of kind 'videoinput' via the navigator.mediaDevices.enumerateDevices() promise, and a MediaStream via the navigator.mediaDevices.getUserMedia(constraints) promise. What should the constraints look like so that the resulting stream contains two video tracks?

Answer 1: You can get at most one video track and one audio track per call to getUserMedia(), but you can call it multiple times. This may prompt the user more than once, though, depending on HTTPS, the browser, and what the user does.
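The approach in the answer can be sketched as follows: call getUserMedia() once per camera, selecting each device by deviceId, then merge the resulting video tracks into a single MediaStream. This is my own illustration, not code from the answer:

    // Sketch: request each of the first two video-input devices
    // separately, then combine their tracks into one MediaStream.
    // Note: deviceId labels may be empty until permission is granted.
    async function getTwoCameraStream() {
      const devices = await navigator.mediaDevices.enumerateDevices();
      const cams = devices.filter(d => d.kind === 'videoinput').slice(0, 2);
      const streams = await Promise.all(cams.map(cam =>
        navigator.mediaDevices.getUserMedia({
          video: { deviceId: { exact: cam.deviceId } },
          audio: false
        })
      ));
      // The MediaStream constructor accepts an array of tracks.
      return new MediaStream(streams.flatMap(s => s.getVideoTracks()));
    }

Each getUserMedia() call may trigger its own permission prompt, as the answer notes.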

GetUserMedia - facingMode

不问归期 submitted on 2019-11-26 15:27:14
I am currently using an Android tablet and getUserMedia to take pictures in my program. Apparently the default camera used by getUserMedia is the front camera. How do I make the rear camera the default? Here is my code for getUserMedia:

    navigator.getUserMedia({
      "audio": false,
      "video": {
        mandatory: {
          minWidth: this.params.dest_width,
          minHeight: this.params.dest_height,
          //facingMode: "environment",
        },
      }
    }, function (stream) {
      // got access, attach stream to video
      video.src = window.URL.createObjectURL(stream) || stream;
      Webcam.stream = stream;
      Webcam.loaded = true;
      Webcam.live = true;
      Webcam ...
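The mandatory/minWidth style above is the legacy constraint syntax; in the standard API, facingMode is a plain constraint on the video track, where 'user' selects the front camera and 'environment' the rear one. A minimal sketch using the promise-based API (the function name is my own):

    // Constraints requesting the rear camera. Using { exact: 'environment' }
    // instead of { ideal: ... } makes the call fail rather than silently
    // fall back to the front camera.
    var rearCameraConstraints = {
      audio: false,
      video: { facingMode: { ideal: 'environment' } }
    };

    function startRearCamera(videoElement) {
      return navigator.mediaDevices.getUserMedia(rearCameraConstraints)
        .then(function (stream) {
          videoElement.srcObject = stream; // preferred over createObjectURL(stream)
          return stream;
        });
    }

In the legacy code above, uncommenting facingMode inside mandatory is not enough, because the legacy mandatory syntax expects min/max keys; moving to the standard constraint shape shown here is the more reliable route.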

Sending a MediaStream to host Server with WebRTC after it is captured by getUserMedia

三世轮回 submitted on 2019-11-26 12:38:39
Question: I am capturing audio data using getUserMedia() and I want to send it to my server so I can save it as a BLOB in a MySQL field. That is all I am trying to do. I have made several attempts using WebRTC, but at this point I don't even know whether that is the right, or best, way to do this. Can anybody help me? Here is the code I am using to capture audio from the microphone:

    navigator.getUserMedia({
      video: false,
      audio: true,
    }, function (mediaStream) {
      // output mediaStream to speakers:
      ...
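For this use case, WebRTC peer connections are not required: MediaRecorder can encode the stream to a Blob in the browser, which can then be uploaded with an ordinary HTTP request. A sketch under the assumption of a hypothetical /upload-audio endpoint on the server (endpoint name and function name are mine, not from the question):

    // Sketch: record the given stream for durationMs milliseconds with
    // MediaRecorder, then POST the encoded Blob to the server.
    function recordAndUpload(stream, durationMs) {
      return new Promise(function (resolve, reject) {
        const chunks = [];
        const recorder = new MediaRecorder(stream);
        recorder.ondataavailable = function (e) {
          if (e.data.size > 0) chunks.push(e.data);
        };
        recorder.onstop = function () {
          const blob = new Blob(chunks, { type: recorder.mimeType });
          // '/upload-audio' is a hypothetical server route.
          fetch('/upload-audio', { method: 'POST', body: blob })
            .then(resolve, reject);
        };
        recorder.onerror = function (e) { reject(e.error); };
        recorder.start();
        setTimeout(function () { recorder.stop(); }, durationMs);
      });
    }

On the server side the request body can be stored directly in a BLOB column; no WebRTC signaling or peer connection is involved.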

HTML5 getUserMedia record webcam, both audio and video

北城以北 submitted on 2019-11-26 12:09:08
Question: Is it possible to use Chrome to capture video (webcam) and audio (microphone) from the browser and then save the stream as a video file? I would like to use this to create a video/photo-booth-like application that allows users to record a simple (30-second) message (both video and audio) to files that can be watched later. I have read the documentation but have not (yet) seen any examples of capturing both audio and video, nor have I found a way to store the result in a video file.
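In current Chrome this is possible without plugins: getUserMedia() can capture audio and video in one call, and MediaRecorder encodes the combined stream (typically to WebM). A sketch of a 30-second photo-booth recording that ends with a client-side download (function name and filename are my own illustration):

    // Sketch: capture webcam + microphone, record up to maxMs
    // milliseconds, and offer the result as a downloadable WebM file.
    async function recordBoothMessage(maxMs) {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      const chunks = [];
      const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
      const stopped = new Promise(function (resolve) { recorder.onstop = resolve; });
      recorder.ondataavailable = function (e) { chunks.push(e.data); };
      recorder.start();
      setTimeout(function () { recorder.stop(); }, maxMs);
      await stopped;
      stream.getTracks().forEach(function (t) { t.stop(); }); // release camera and mic
      const blob = new Blob(chunks, { type: 'video/webm' });
      const a = document.createElement('a');
      a.href = URL.createObjectURL(blob);
      a.download = 'message.webm'; // hypothetical filename
      a.click();
      return blob;
    }

Instead of the download link, the Blob could equally be uploaded to a server for later playback.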
