getUserMedia

How to enable the camera and microphone in a packaged application for Chrome OS or a Chrome extension?

放荡痞女 submitted on 2019-12-05 20:05:41
I'm testing a scenario where I open the Hangouts web page in a separate window, but the application doesn't have access to the microphone and camera: the buttons are red and a message says "Hangouts can't use the selected microphone/camera". I have included the "audioCapture" and "videoCapture" permissions. What has to be done to make it work? Edit: after allowing media, the app has access to the camera and microphone (I can see that in the Hangouts settings), but picture and voice are not transmitted over Hangouts to the other participants. Is there something I have to set for streaming media? I already have this
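For a Chrome packaged app, the two capture permissions mentioned in the question are declared in manifest.json. A minimal sketch, where the app name and background script are assumptions, not from the question:

```json
{
  "name": "Hangouts Launcher (sketch)",
  "version": "0.1",
  "manifest_version": 2,
  "app": {
    "background": { "scripts": ["background.js"] }
  },
  "permissions": ["audioCapture", "videoCapture"]
}
```

With these permissions declared, getUserMedia() inside the app's own window should resolve without a per-use permission prompt; they do not, however, grant anything to a third-party page the app merely opens.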

Is it now possible to use the getUserMedia API to read a video stream from a web camera and send it directly to a server for further broadcasting? [closed]

五迷三道 submitted on 2019-12-05 09:45:47
I need to build a web application that uses WebRTC to get the web camera's video stream and the mic's audio stream and immediately send them to the server for further broadcasting to multiple clients. The app must do this in real time, in full duplex; it would be a kind of live video chat, some sort of educational app. So the question is: is it possible now? What technologies should I use? Should I use WebRTC with WebSocket and Node.js on the backend? Or can I use PHP instead of Node? Can I use Socket.io for that? Is there any other way to achieve this? Maybe Flash? The
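The capture-and-relay half of this can be sketched with getUserMedia plus MediaRecorder feeding a WebSocket. The endpoint URL and bitrate below are assumptions, and a production broadcaster would more likely use a WebRTC media server than raw WebSocket chunks:

```javascript
// Build MediaRecorder options; kept as a pure function so the
// bitrate policy can be tested outside a browser.
function recorderOptions(kbps) {
  return {
    mimeType: 'video/webm;codecs=vp8,opus',
    videoBitsPerSecond: kbps * 1000,
  };
}

// Browser-only: capture cam + mic and push encoded chunks to the
// server over a WebSocket (wss://example.com/ingest is hypothetical).
async function streamToServer(url) {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  const ws = new WebSocket(url);
  const rec = new MediaRecorder(stream, recorderOptions(500));
  rec.ondataavailable = (e) => {
    if (e.data.size > 0 && ws.readyState === WebSocket.OPEN) ws.send(e.data);
  };
  rec.start(1000); // emit a chunk roughly every second
  return rec;
}
```

The server side (Node, PHP, Socket.io) then only has to fan the chunks out; the hard real-time work stays in the browser's encoder.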

getUserMedia is not working in Chrome version 48.0.2560.0 but works in 46.0

烈酒焚心 submitted on 2019-12-05 09:45:31
My confusion is that I used getUserMedia() in my page, and it no longer works when served from our server on machines that have Chrome version 48.0.2560.0; in those browsers it gives the following warning: "getUserMedia() no longer works on insecure origins. To use this feature, you should consider switching your application to a secure origin, such as HTTPS. See https://goo.gl/rStTGz for more details." With Chrome 46.0 the same code, served by the same server, works fine. How do we proceed?
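Chrome 47 and later restrict getUserMedia() to secure origins (HTTPS, localhost, file://), which explains the version difference. A sketch of an origin check you could run before calling the API, approximating Chrome's rule:

```javascript
// Approximate Chrome's "potentially trustworthy origin" rule:
// https: and file: are allowed, as is localhost; plain http: is not.
function isSecureOrigin(protocol, hostname) {
  if (protocol === 'https:' || protocol === 'file:') return true;
  return hostname === 'localhost' || hostname === '127.0.0.1';
}

// Browser-only usage: warn loudly instead of letting the call fail silently.
function guardGetUserMedia() {
  if (!isSecureOrigin(location.protocol, location.hostname)) {
    console.warn('getUserMedia requires HTTPS in Chrome 47+; serve this page over TLS.');
    return false;
  }
  return true;
}
```

The practical fix is to serve the page over HTTPS; for development, localhost remains exempt.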

Polyfill file input with accept capture (using getUserMedia to capture?)

放肆的年华 submitted on 2019-12-05 08:19:51
I want to enable image (and audio and video) uploads in a survey framework. For that, a file input is nearly sufficient for my purposes. On some mobile browsers <input type="file" accept="image/*;capture=camera"> is a really simple way of letting users choose between uploading an existing image and taking a new one, and the UI for viewing and choosing among the pictures is provided too. Desktop browsers did not go this route; instead, some pretty nice stuff seems to be possible using getUserMedia(). I did not find any working examples that upload the collected user media to the server (e.g. I found
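A desktop polyfill along these lines would snapshot the getUserMedia <video> into a canvas and POST the resulting blob. A sketch, where the upload URL and the 1280px width cap are assumptions:

```javascript
// Scale a frame to fit maxW while preserving aspect ratio (pure, testable).
function fitWidth(w, h, maxW) {
  if (w <= maxW) return { w, h };
  const s = maxW / w;
  return { w: maxW, h: Math.round(h * s) };
}

// Browser-only: grab the current <video> frame and POST it as a file
// (the /upload endpoint is hypothetical).
function uploadFrame(video, url) {
  const { w, h } = fitWidth(video.videoWidth, video.videoHeight, 1280);
  const canvas = document.createElement('canvas');
  canvas.width = w;
  canvas.height = h;
  canvas.getContext('2d').drawImage(video, 0, 0, w, h);
  canvas.toBlob((blob) => {
    const form = new FormData();
    form.append('photo', blob, 'capture.png');
    fetch(url, { method: 'POST', body: form });
  }, 'image/png');
}
```

On the server side this arrives as an ordinary multipart file upload, the same shape the mobile `<input type="file">` path produces.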

Firefox: drawImage(video) fails with NS_ERROR_NOT_AVAILABLE: Component is not available

淺唱寂寞╮ submitted on 2019-12-05 04:06:31
Trying to call drawImage with a video whose source is a webcam feed seems to fail in Firefox with NS_ERROR_NOT_AVAILABLE: Component is not available. I have tried waiting for every event the video tag fires: play, playing, canplay, loadeddata, loadedmetadata, and so on, and nothing works. This seems to be because these events fire before the stream is properly loaded into the <video> element. JSFiddle with error (you can view the error in the console). A side effect is that the width and height of the video are also incorrect. This is a bug in Firefox. The easiest fix is to simply
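One common workaround is to stop trusting the media events and instead retry on animation frames until a frame is actually decodable. A sketch of that approach (not necessarily the fix the truncated answer goes on to describe):

```javascript
// True once the <video> reports real decoded frame dimensions (pure, testable).
function videoReady(v) {
  return v.videoWidth > 0 && v.videoHeight > 0;
}

// Browser-only: retry drawImage each animation frame until it succeeds.
// Firefox fires play/canplay before the webcam stream is usable, so
// event-driven code races; polling sidesteps that.
function drawWhenReady(video, ctx) {
  if (videoReady(video)) {
    try {
      ctx.drawImage(video, 0, 0);
      return;
    } catch (e) {
      // NS_ERROR_NOT_AVAILABLE: frame not decodable yet, keep retrying.
    }
  }
  requestAnimationFrame(() => drawWhenReady(video, ctx));
}
```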

getUserMedia alternative for iOS

假如想象 submitted on 2019-12-04 19:07:21
Does anyone know if there are any (near-)future plans to enable getUserMedia for Safari on iOS? Secondarily, does anyone know of any workarounds to access the camera from a standard mobile website on an iPhone? I saw a post that referenced: <input type="file" accept="image/*" capture="camera">. Can anyone confirm that this really works? And would I use it in lieu of getUserMedia, or would I do a browser/device detect first to determine whether to go the getUserMedia route or the capture="camera" route? Well, I'm not sure what you want, but I saw this JavaScript code that works; I mean, it takes the
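The detect-then-fall-back approach the asker describes can be sketched as a feature test rather than a user-agent sniff; the container/element names below are assumptions:

```javascript
// Decide which capture path to take (pure, testable against a fake navigator).
// iOS Safari historically lacked getUserMedia, so it falls to 'file-input'.
function cameraStrategy(nav) {
  const gum = nav.mediaDevices && nav.mediaDevices.getUserMedia;
  return gum ? 'getUserMedia' : 'file-input';
}

// Browser-only fallback: a capture-enabled file input, which on iOS
// offers the camera alongside the photo library.
function mountFallbackInput(container) {
  const input = document.createElement('input');
  input.type = 'file';
  input.accept = 'image/*';
  input.capture = 'camera'; // hint to open the camera directly
  container.appendChild(input);
  return input;
}
```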

Audio recording with HTML5

拜拜、爱过 submitted on 2019-12-04 15:48:32
I'm trying to implement audio recording on a website. Basically, the user should be able to press a button and speak something into the microphone; the recorded audio should then be sent to the server for further processing. I realise you can do this with Flash, but for now I'm trying to avoid that. I found several resources on the internet about it (i.e. link ), but as it seems, this functionality is not widely supported yet. I experienced differences between the browsers used and between the operating systems used. For instance, the Chrome browser doesn't seem to access any microphone on
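The Flash-free route in browsers of that era was getUserMedia plus a Web Audio ScriptProcessorNode that accumulates raw PCM for upload. A sketch, with the 4096-sample buffer and mono capture as assumptions:

```javascript
// Flatten per-callback Float32 chunks into one sample array (pure, testable).
function mergeChunks(chunks, total) {
  const out = new Float32Array(total);
  let offset = 0;
  for (const c of chunks) {
    out.set(c, offset);
    offset += c.length;
  }
  return out;
}

// Browser-only: tap the microphone stream and collect raw PCM chunks,
// which can later be merged and POSTed to the server.
function recordMic(audioCtx, stream, chunks) {
  const source = audioCtx.createMediaStreamSource(stream);
  const proc = audioCtx.createScriptProcessor(4096, 1, 1);
  proc.onaudioprocess = (e) => {
    chunks.push(new Float32Array(e.inputBuffer.getChannelData(0)));
  };
  source.connect(proc);
  proc.connect(audioCtx.destination); // keeps the node alive in old engines
  return proc;
}
```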

Can I get a live camera image inside my App in Firefox OS?

删除回忆录丶 submitted on 2019-12-04 15:41:02
So I am having a look at Firefox OS right now. One thing I would like to try is to manipulate the device camera's live feed using canvas et al. From what I can see in the blog posts (like this one) and the code in the boilerplate app, this is always done using a MozActivity, meaning that the user leaves the application, takes a picture, and passes it back to the application, where I could post-process it. But for live manipulation I would need a live camera feed inside my app, just like you would get using getUserMedia when accessing a computer's webcam. getUserMedia
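At the time, Gecko shipped getUserMedia behind the moz prefix, so a prefix-resolving shim is the usual first step; whether a given Firefox OS version exposes it to an unprivileged app is a separate question this sketch cannot settle:

```javascript
// Resolve whichever prefixed getUserMedia the engine exposes
// (Gecko: mozGetUserMedia; old Blink/WebKit: webkitGetUserMedia).
// Pure, so it can be tested against a fake navigator.
function resolveGetUserMedia(nav) {
  return nav.getUserMedia || nav.mozGetUserMedia || nav.webkitGetUserMedia || null;
}

// Browser-only: attach the camera feed to a <video> for canvas processing.
// Note: very old Gecko wanted video.mozSrcObject instead of srcObject.
function attachCamera(video, onError) {
  const gum = resolveGetUserMedia(navigator);
  if (!gum) {
    onError(new Error('getUserMedia not available'));
    return;
  }
  gum.call(navigator, { video: true },
    (stream) => { video.srcObject = stream; video.play(); },
    onError);
}
```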

Capturing an image in HTML5 at full resolution

孤人 submitted on 2019-12-04 14:37:36
It is possible to capture an image in JavaScript using the MediaStream API, but to do so it is first necessary to instantiate a video object and then paint a frame into a canvas to get an image. Unfortunately, many devices (e.g. phones) don't allow you to capture video at the full native resolution of the device. For instance, on my phone the maximum image resolution is on the order of 4000x3000, but the maximum video resolution is a mere 1920x1080. Obviously, capturing an image at barely 1/6th of the available resolution is unacceptable. So how can I access the full
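Where supported, the ImageCapture API can shoot at the sensor's photo resolution instead of the video-stream resolution, bypassing the canvas route entirely. A sketch, with browser support treated as an assumption rather than a given:

```javascript
// Pick the largest advertised photo size from ImageCapture's
// PhotoCapabilities (pure, testable against a stub object).
function maxPhotoSize(caps) {
  return { width: caps.imageWidth.max, height: caps.imageHeight.max };
}

// Browser-only: take a photo at the maximum resolution the camera
// track advertises; resolves to a Blob of the encoded image.
async function takeFullResPhoto(stream) {
  const track = stream.getVideoTracks()[0];
  const capture = new ImageCapture(track);
  const caps = await capture.getPhotoCapabilities();
  const { width, height } = maxPhotoSize(caps);
  return capture.takePhoto({ imageWidth: width, imageHeight: height });
}
```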

How to record audio from an Audio element using JavaScript

梦想的初衷 submitted on 2019-12-04 14:13:06
I am making an audio recorder using HTML5 and JavaScript and do not want to include any third-party API. I got through the first step by creating an audio retriever and player using the <audio> tag and the navigator.webkitGetUserMedia function, which gets audio from my microphone and plays it through the <audio> element, but I am not able to get the audio data into an array at this point, and I don't know which function to use. Simply create an audio node; below is tweaked code from MattDiamond's RecorderJS: function RecordAudio(stream, cfg){ var config = cfg || {}; var bufferLen = config.bufferLen ||
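A completed sketch of the truncated RecordAudio above, together with the float-to-16-bit conversion RecorderJS performs before building a WAV. The 4096-sample buffer and single-channel capture are assumptions filling in for the cut-off excerpt:

```javascript
// Convert Float32 samples in [-1, 1] to 16-bit PCM, clamping out-of-range
// values, as RecorderJS does before writing the WAV payload (pure, testable).
function floatTo16BitPCM(samples) {
  const out = new Int16Array(samples.length);
  for (let i = 0; i < samples.length; i++) {
    const s = Math.max(-1, Math.min(1, samples[i]));
    out[i] = s < 0 ? s * 0x8000 : s * 0x7FFF;
  }
  return out;
}

// Browser-only: accumulate mono PCM chunks from the getUserMedia stream.
function RecordAudio(stream, cfg) {
  var config = cfg || {};
  var bufferLen = config.bufferLen || 4096;
  var context = new AudioContext();
  var source = context.createMediaStreamSource(stream);
  var node = context.createScriptProcessor(bufferLen, 1, 1);
  var recorded = [];
  node.onaudioprocess = function (e) {
    recorded.push(new Float32Array(e.inputBuffer.getChannelData(0)));
  };
  source.connect(node);
  node.connect(context.destination);
  this.stop = function () {
    node.disconnect();
    source.disconnect();
    return recorded; // array of Float32Array chunks, ready for conversion
  };
}
```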