Pure JS stream from webcam to server

Submitted by 半世苍凉 on 2020-06-11 05:26:49

Question


Is it possible to capture a stream from the webcam (in the front end) and send it to a server via HLS or RTMP with pure JS (no Flash)?
If there is another protocol that lets me send the stream as a continuous stream (unlike HLS), it would be preferred.


Answer 1:


I found a solution. There is (as yet) no way to "convert" the stream received from navigator.getUserMedia() to RTMP in the front end, but we can use the MediaRecorder API.
On the client side:

// navigator.mediaDevices.getUserMedia returns a Promise (the legacy navigator.getUserMedia does not)
const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true })
const recorder = new MediaRecorder(stream)
recorder.ondataavailable = (e) => { socket.emit('binaryData', e.data) }
recorder.start(1000) // emit a recorded chunk every 1000 ms
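
Here, socket is assumed to be a Socket.IO client connection; the original answer does not show how it is created. A minimal sketch, with a placeholder server URL:

import { io } from 'socket.io-client'
// Hypothetical Socket.IO client setup assumed by the snippet above;
// replace the URL with your own server.
const socket = io('https://your-server.example.com')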

On the backend:

const { spawn } = require('child_process')

// ffmpegCommands is the ffmpeg argument array (see the sketch below)
const ffmpegProcess = spawn('ffmpeg', ffmpegCommands)
socket.on('binaryData', (data) => {
  ffmpegProcess.stdin.write(data)
})

FFmpeg will convert the VP8/WebM stream to HLS, RTMP, RTSP, or whatever output you need.
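
As a rough sketch (the argument list and the ingest URL are assumptions, not from the original answer), ffmpegCommands for an RTMP output could look something like this:

const ffmpegCommands = [
  '-i', 'pipe:0',             // read the recorded WebM/VP8 chunks from stdin
  '-c:v', 'libx264',          // transcode VP8 to H.264, which RTMP servers expect
  '-preset', 'veryfast',
  '-tune', 'zerolatency',     // keep encoder latency low
  '-c:a', 'aac',              // transcode the audio track to AAC
  '-f', 'flv',                // RTMP uses the FLV container
  'rtmp://live.example.com/app/streamKey'  // placeholder ingest URL
]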

This way we can get a video stream with an average latency of about 3 seconds.




Answer 2:


Yes, you can do that.

You can directly access the camera by using an API in the WebRTC specification called getUserMedia(). getUserMedia() will prompt the user for access to their connected microphones and cameras.

If successful, the API returns a stream containing the data from the camera or the microphone, and we can then attach it to a <video> element, attach it to a WebRTC stream, or save it using the MediaRecorder API.

To get data from the camera we just set video: true in the constraints object that is passed to the getUserMedia() API.

<video id="player" controls height="400" width="400"></video>

<script>
  var player = document.getElementById("player");

  // Attach the live camera stream to the <video> element
  var handleSuccess = function(stream) {
    player.srcObject = stream;
  };

  navigator.mediaDevices
    .getUserMedia({ audio: true, video: true })
    .then(handleSuccess);
</script>
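
As a side note (a hypothetical variation, not part of the quoted answer), the constraints object can also request a specific resolution instead of plain video: true, and handle the rejection if the user denies access:

// Hypothetical constraints: ask for a 1280x720 camera stream with audio.
navigator.mediaDevices
  .getUserMedia({
    audio: true,
    video: { width: { ideal: 1280 }, height: { ideal: 720 } }
  })
  .then(handleSuccess)
  .catch(function(error) {
    console.error("getUserMedia failed:", error);
  });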

I quote the answer from this link.



Source: https://stackoverflow.com/questions/59322587/pure-js-stream-from-webcamera-to-server
