getUserMedia

getUserMedia() in PWA on iOS 11.3.1

此生再无相见时 submitted on 2019-12-23 19:32:03
Question: I'm trying to build a PWA of my web app. My web app uses getUserMedia to let the user take a picture, and it works as expected: the browser asks permission to access the camera, and if the user accepts, the app continues to work. Now I have made a PWA, which works except for the camera. The user doesn't get a prompt to grant access to the camera, which is where I think the problem lies. Is there any way to trigger the camera access: let constraints = { video: { facingMode: "user" }, audio:
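A minimal sketch of the pattern in question, completing the truncated constraints above. Note that on iOS 11.x, getUserMedia was only exposed in Safari itself, not in a home-screen (standalone) PWA, so feature-detecting before calling is the safest move; the cameraSupported helper name is ours, not an API.

```javascript
// Constraints as in the question, with audio explicitly disabled (assumption).
const constraints = {
  video: { facingMode: "user" },
  audio: false
};

// Feature-detect the modern getUserMedia entry point before using it.
function cameraSupported(nav) {
  return !!(nav && nav.mediaDevices &&
            typeof nav.mediaDevices.getUserMedia === "function");
}

if (typeof navigator !== "undefined" && cameraSupported(navigator)) {
  navigator.mediaDevices.getUserMedia(constraints)
    .then(stream => { document.querySelector("video").srcObject = stream; })
    .catch(err => console.error("Camera access denied or unavailable:", err));
}
```

If cameraSupported returns false (as it did in standalone PWAs on iOS 11.3.1), there is no script-level way to force the prompt; the usual workaround was to open the page in Safari instead.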

How can I prevent breakup/choppiness/glitches when using an AudioWorklet to stream captured audio?

可紊 submitted on 2019-12-23 15:44:50
Question: We've been working on a JavaScript-based audio chat client that runs in the browser and sends audio samples to a server via a WebSocket. We previously tried using the Web Audio API's ScriptProcessorNode to obtain the sample values. This worked well on our desktops and laptops, but we experienced poor audio quality when transmitting from a handheld platform we must support. We've attributed this to the documented script processor performance issues (https://developer.mozilla.org/en-US/docs/Web
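One common cause of choppiness with AudioWorklets is posting a message to the main thread for every 128-frame render quantum. A sketch of batching quanta into larger chunks before posting, which reduces per-message overhead; the SampleBatcher class and the 4096 chunk size are our assumptions, not part of any API:

```javascript
// Accumulates small sample arrays into fixed-size chunks and hands each
// complete chunk to a callback (e.g. port.postMessage in a worklet).
class SampleBatcher {
  constructor(chunkSize, onChunk) {
    this.chunkSize = chunkSize;
    this.onChunk = onChunk;
    this.buffer = new Float32Array(chunkSize);
    this.offset = 0;
  }
  push(samples) {
    let i = 0;
    while (i < samples.length) {
      const n = Math.min(samples.length - i, this.chunkSize - this.offset);
      this.buffer.set(samples.subarray(i, i + n), this.offset);
      this.offset += n;
      i += n;
      if (this.offset === this.chunkSize) {
        this.onChunk(this.buffer.slice(0)); // copy, so the buffer can be reused
        this.offset = 0;
      }
    }
  }
}
// Inside an AudioWorkletProcessor, push() would be called from process():
//   process(inputs) { this.batcher.push(inputs[0][0]); return true; }
```

Transferring the chunk's underlying ArrayBuffer (postMessage with a transfer list) avoids a copy across the thread boundary and helps further on slow devices.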

Detect Firefox support for screen sharing

我的未来我决定 submitted on 2019-12-23 13:10:54
Question: Firefox, since version 52, supports screen sharing via: navigator.mediaDevices.getUserMedia({ video: { mediaSource: 'screen' }}) .then(stream => { ... }); Check out this test page to see it in action. I would like to know whether there is a way to detect whether a browser supports { mediaSource: 'screen' }. I would like to only offer the share-screen option to users whose browsers support it, so I'd like to be able to feature-detect this. Answer 1: a way to detect whether a
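A feature-detection sketch: modern browsers expose getDisplayMedia() for screen capture, and Firefox advertises its nonstandard mediaSource constraint through getSupportedConstraints(). Whether the legacy check covers every Firefox version is an assumption; the canShareScreen name is ours:

```javascript
// Returns true if the given navigator-like object appears to support
// screen capture, via the standard or the legacy Firefox path.
function canShareScreen(nav) {
  if (!nav || !nav.mediaDevices) return false;
  // Standard path (Chrome 72+, Firefox 66+, Safari 13+).
  if (typeof nav.mediaDevices.getDisplayMedia === "function") return true;
  // Legacy Firefox path: the mediaSource constraint is advertised here.
  const supported = typeof nav.mediaDevices.getSupportedConstraints === "function"
    ? nav.mediaDevices.getSupportedConstraints()
    : {};
  return !!supported.mediaSource;
}
```

Note that constraint support does not guarantee the user will grant permission, so the getUserMedia/getDisplayMedia promise rejection still needs handling.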

WebRTC continue video stream when webcam is reconnected

生来就可爱ヽ(ⅴ<●) submitted on 2019-12-23 12:38:20
Question: I've got a simple video stream working via getUserMedia, but I would like to handle the case when the webcam I'm streaming from becomes disconnected or unavailable. I've found the oninactive event on the stream object passed to the successCallback function. I would also like to restart the video stream when the exact same webcam/media device is plugged back in. Code example: navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia; navigator.getUserMedia
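One way to restart on the same device: remember the deviceId of the active track and watch the 'devicechange' event for that id to reappear. A sketch under that assumption; findReconnectedDevice is our helper, not an API:

```javascript
// Pure helper: find a returning video-input device by its saved id.
function findReconnectedDevice(devices, savedId) {
  return devices.find(d => d.kind === "videoinput" && d.deviceId === savedId) || null;
}

if (typeof navigator !== "undefined" && navigator.mediaDevices) {
  let savedId = null;
  navigator.mediaDevices.getUserMedia({ video: true }).then(stream => {
    // getSettings().deviceId identifies the camera actually in use.
    savedId = stream.getVideoTracks()[0].getSettings().deviceId;
  });
  navigator.mediaDevices.addEventListener("devicechange", async () => {
    const devices = await navigator.mediaDevices.enumerateDevices();
    if (savedId && findReconnectedDevice(devices, savedId)) {
      // Re-acquire exactly the same camera.
      navigator.mediaDevices.getUserMedia({ video: { deviceId: { exact: savedId } } });
    }
  });
}
```

A caveat: deviceId is only stable across sessions once the page has camera permission, so this works best after the initial grant.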

Chrome: onaudioprocess stops getting called after a while

£可爱£侵袭症+ submitted on 2019-12-23 08:49:12
Question: I'm using ScriptProcessorNode's onaudioprocess callback to process the microphone input. By connecting a MediaStreamSourceNode to the ScriptProcessorNode, I can get the raw audio data within the onaudioprocess callback function. However, after about 30 seconds (this varies, ranging from 10 to 35 seconds), the browser stops calling onaudioprocess. In the following code, the console.log output ('>>') always stops after about 30 seconds. var ctx = new AudioContext(); var BUFFER_LENGTH = 4096; console.log(
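A frequently reported cause of this symptom in Chrome: the ScriptProcessorNode and source node get garbage-collected when no JavaScript reference keeps them alive, which silently stops onaudioprocess. A sketch of the usual workaround, pinning the nodes on a long-lived object; wireMicrophone and audioGraph are our names, not APIs:

```javascript
// Long-lived container; references here prevent the nodes from being GC'd.
const audioGraph = {};

function wireMicrophone(ctx, stream, bufferLength, onAudio) {
  const source = ctx.createMediaStreamSource(stream);
  const processor = ctx.createScriptProcessor(bufferLength, 1, 1);
  processor.onaudioprocess = onAudio;
  source.connect(processor);
  processor.connect(ctx.destination); // must reach destination to keep firing
  audioGraph.source = source;         // <-- keep references alive
  audioGraph.processor = processor;
  return processor;
}
```

The same logic applies to the MediaStream itself: keep it referenced too. (ScriptProcessorNode is deprecated; AudioWorklet is the long-term replacement.)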

getUserMedia() video size in Firefox & Chrome differs

自古美人都是妖i submitted on 2019-12-23 07:27:11
Question: I'm using getUserMedia(), and when applying constraints (see below) they only work in Chrome, not in Firefox. The video in Firefox always appears stretched and ends up bigger than the one in Chrome. var vid_constraints = { mandatory: { maxHeight: 180, maxWidth: 320 } } var constraints = { audio: false, video: vid_constraints }; navigator.getUserMedia(constraints, successCallback, errorCallback); After some reading, it appears that mozGetUserMedia() doesn't support resolution constraints.
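The mandatory/optional syntax in the question is the old Chrome-only form, which Firefox never implemented; the standardized constraint syntax (width/height with min/max/ideal) works in both browsers today. A sketch converting the legacy shape, with toStandardConstraints as our helper name:

```javascript
// Convert Chrome-style { mandatory: { maxWidth, maxHeight } } constraints
// into the standard MediaTrackConstraints form.
function toStandardConstraints(legacy) {
  const m = (legacy && legacy.mandatory) || {};
  const video = {};
  if (m.maxWidth)  video.width  = { max: m.maxWidth };
  if (m.maxHeight) video.height = { max: m.maxHeight };
  return video;
}

// Usage with the question's values:
//   navigator.mediaDevices.getUserMedia({
//     audio: false,
//     video: toStandardConstraints({ mandatory: { maxHeight: 180, maxWidth: 320 } })
//   });
```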

JS Audio - audioBuffer getChannelData to frequency

烈酒焚心 submitted on 2019-12-23 05:23:21
Question: I am trying to achieve pitch detection and, along the way, learn some basic audio physics. I am really new to this and just trying to understand how the whole thing works. My question is: what exactly is the audioBuffer, and how is the data coming from getChannelData related to frequencies? How can I extract frequency data from the audioBuffer? Also, if someone could explain a bit about sample rates etc., that would be great. Thanks! Answer 1: An AudioBuffer simply
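The key point: getChannelData() returns time-domain PCM samples (amplitude over time), not frequencies. To get frequency content you run an FFT, e.g. via an AnalyserNode, and each FFT bin index maps to a frequency by f = index × sampleRate / fftSize. A sketch:

```javascript
// Maps an FFT bin index to its center frequency in Hz.
// Bins 0 .. fftSize/2 - 1 cover the range 0 .. sampleRate/2 (Nyquist).
function binToFrequency(binIndex, sampleRate, fftSize) {
  return binIndex * sampleRate / fftSize;
}

// With an AnalyserNode (browser-only, shown as a comment sketch):
//   const analyser = ctx.createAnalyser();
//   analyser.fftSize = 2048;
//   const data = new Float32Array(analyser.frequencyBinCount); // 1024 bins
//   analyser.getFloatFrequencyData(data);
//   // data[i] is the level in dB at binToFrequency(i, ctx.sampleRate, 2048)
```

For pitch detection specifically, a raw FFT peak is often too coarse; autocorrelation over the time-domain samples is a common alternative.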

MediaSource randomly stops video

耗尽温柔 submitted on 2019-12-22 06:29:36
Question: I am working on a project where I want getUserMedia -> MediaRecorder -> socket.io -> MediaSource appendBuffer. I got it to work; however, after a few seconds it randomly stops. I know about WebRTC, but this project targets an embedded version of Chrome that doesn't support WebRTC. Server: 'use strict'; const io = require('socket.io')(); io.on('connection', (socket) => { console.log('connection'); socket.on('stream', (data) => { socket.emit(
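A frequent cause of this pipeline dying after a few seconds: calling appendBuffer() while sourceBuffer.updating is still true throws InvalidStateError and puts the MediaSource in an unusable state. A sketch of a queue that only appends when the buffer is idle; the AppendQueue class is our construction, not part of the Media Source Extensions API:

```javascript
// Serializes appendBuffer() calls: chunks wait in a queue until the
// SourceBuffer finishes its previous update ('updateend' fires).
class AppendQueue {
  constructor(sourceBuffer) {
    this.sb = sourceBuffer;
    this.queue = [];
    this.sb.addEventListener("updateend", () => this.flush());
  }
  push(chunk) {
    this.queue.push(chunk);
    this.flush();
  }
  flush() {
    if (!this.sb.updating && this.queue.length > 0) {
      this.sb.appendBuffer(this.queue.shift());
    }
  }
}

// Usage sketch: socket.on('stream', data => appendQueue.push(data));
```

If playback still stalls, checking mediaSource.readyState and the video element's error property on each append usually reveals which side failed.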

Capturing an image in HTML5 at full resolution

纵然是瞬间 submitted on 2019-12-21 20:55:01
Question: It is possible to capture an image in JavaScript using the MediaStream API. But to do so, it is first necessary to instantiate a video object, then paint a frame onto a canvas to get an image. Unfortunately, many devices (e.g. phones) don't allow you to capture video at the full native resolution of the device. For instance, on my phone the maximum image resolution is on the order of 4000x3000, but the maximum video resolution is a mere 1920x1080. Obviously capturing an image
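The ImageCapture API (Chrome) addresses exactly this: takePhoto() can request a still image at the camera's full photo resolution, independent of the video resolution. A sketch, assuming the API is available; maxPhotoConstraints is our helper name:

```javascript
// Build a photo-settings object requesting the camera's maximum photo size,
// given the capabilities object returned by getPhotoCapabilities().
function maxPhotoConstraints(caps) {
  return { imageWidth: caps.imageWidth.max, imageHeight: caps.imageHeight.max };
}

// Browser-only sketch: returns a Blob at full photo resolution, or null
// when ImageCapture is unavailable (caller falls back to canvas capture).
async function capturePhoto(stream) {
  if (typeof ImageCapture === "undefined") return null;
  const track = stream.getVideoTracks()[0];
  const capture = new ImageCapture(track);
  const caps = await capture.getPhotoCapabilities();
  return capture.takePhoto(maxPhotoConstraints(caps));
}
```

ImageCapture is not available in all browsers (notably not in Firefox or Safari at the time of writing), so the canvas fallback from the question remains necessary there.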

Get media details(resolution and frame rate) from MediaStream object

大兔子大兔子 submitted on 2019-12-21 03:48:43
Question: I am capturing the user's camera and want to take the picture at the best resolution possible, so my code is something like the snippet below. I want to read the resolution details from the incoming stream so I can set them as the video height and width, which I'll use to take a snapshot; I want the snapshot to be of the best quality offered by the stream. Is this possible (to read resolution details from the stream variable)? EDIT: I am transmitting the video using WebRTC, so I would also like to find
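Yes: each MediaStreamTrack exposes getSettings(), which reports the values the browser actually chose, including width, height, and frameRate. A sketch; the describeVideoTrack formatter is our helper, not an API:

```javascript
// Formats the relevant fields of a MediaTrackSettings object.
function describeVideoTrack(settings) {
  return settings.width + "x" + settings.height + " @ " + settings.frameRate + "fps";
}

if (typeof navigator !== "undefined" && navigator.mediaDevices) {
  navigator.mediaDevices.getUserMedia({ video: true }).then(stream => {
    const settings = stream.getVideoTracks()[0].getSettings();
    console.log(describeVideoTrack(settings)); // actual negotiated values
  });
}
```

For the WebRTC side of the question, the receiving peer can read the same settings from the remote track, or poll RTCPeerConnection.getStats() for frameWidth/frameHeight/framesPerSecond on the inbound video stream.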