webrtc

WebRTC datachannel with manual signaling, example please?

冷眼眸甩不掉的悲伤 submitted on 2020-06-11 07:53:26
Question: I'm really struggling to find a complete WebRTC datachannel example that I can copy/paste and have it work. I would like a JavaScript example of a WebRTC datachannel with manual signaling, i.e. when the example loads, it provides the signaling data in one text box. I copy the data manually (highlight, copy) and paste it into the peer's window, which has a text box to accept that signaling data. I believe there needs to be an "answer" in the signaling data, so there need to be corresponding text…
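
No answer survives in this scrape, but here is a minimal sketch of the page the asker describes. It assumes a non-trickle flow (wait for ICE gathering to finish, then exchange one complete SDP blob by copy/paste); the element IDs localSdp and remoteSdp and the button wiring are invented for illustration.

```js
// Manual (copy/paste) signaling sketch. Assumes two textareas on the page:
// #localSdp (copy this to the other peer) and #remoteSdp (paste theirs here),
// plus buttons wired to createOffer() and acceptRemote(). All IDs are invented.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
});
let channel;

function setupChannel(ch) {
  channel = ch;
  channel.onopen = () => console.log('datachannel open');
  channel.onmessage = (e) => console.log('peer says:', e.data);
}

// The answering side receives the channel the offerer created.
pc.ondatachannel = (e) => setupChannel(e.channel);

// When ICE gathering finishes (candidate === null), localDescription holds the
// complete SDP including candidates, so one copy/paste suffices (no trickle ICE).
pc.onicecandidate = (e) => {
  if (e.candidate === null) {
    document.getElementById('localSdp').value = JSON.stringify(pc.localDescription);
  }
};

// Offerer: create the channel and an offer, then copy #localSdp across.
async function createOffer() {
  setupChannel(pc.createDataChannel('chat'));
  await pc.setLocalDescription(await pc.createOffer());
}

// Both sides: paste the other peer's SDP and call this. On the answering side,
// #localSdp then fills with the answer, to be copied back to the offerer.
async function acceptRemote() {
  const desc = JSON.parse(document.getElementById('remoteSdp').value);
  await pc.setRemoteDescription(desc);
  if (desc.type === 'offer') {
    await pc.setLocalDescription(await pc.createAnswer());
  }
}
```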

How to enable H264 on Android webRTC

只愿长相守 submitted on 2020-06-11 06:05:47
Question: How do you enable H264 on Android WebRTC? When calling createOffer on a PeerConnection, there was no H264 description in the SDP.

Answer 1: Google's current WebRTC implementation only supports hardware H.264 decoding and encoding on Android, and with select chipsets only. So if a particular device doesn't have hardware H.264 support, or has an unsupported chipset, you'll only get VP8/VP9.

Answer 2: H.264 works with WebRTC in Chrome on Android M57.

Source: https://stackoverflow.com/questions/36766716/how-to-enable-h264-on-android
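
On the browser side, whether H.264 can show up in an offer at all can be checked before creating a PeerConnection. A small sketch (RTCRtpSender.getCapabilities is a standard API, though not available in every browser/WebView version):

```js
// List the video codecs this browser can send; if H264 is absent here,
// it will not appear in any SDP offer either.
const caps = RTCRtpSender.getCapabilities('video');
const hasH264 = caps.codecs.some((c) => c.mimeType.toLowerCase() === 'video/h264');
console.log('H.264 send support:', hasH264, caps.codecs.map((c) => c.mimeType));
```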

How to get the audio and video from a WebRTC stream using ffmpeg on server

被刻印的时光 ゝ submitted on 2020-06-10 04:32:06
Question: I am trying to get the audio and video from a WebRTC stream and handle it (transcode or dump) with ffmpeg on an Ubuntu server. I naively expected it to simply interpret the SDP offered by WebRTC, but I was mistaken. I suspect ffmpeg is not capable of signaling back the answer SDP, and it must be done manually. Here is an offer SDP:

v=0
o=Mozilla-SIPUA-33.1 3662 0 IN IP4 0.0.0.0
s=SIP Call
t=0 0
a=ice-ufrag:5e0a74d1
a=ice-pwd:7446c0eb445117d0018ca2afc5f3ca54
a=fingerprint:sha-256 76:1B:19:CE…
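
No answer is captured here. One relevant point the asker is circling: ffmpeg can read plain RTP described by an SDP file, but it does not speak ICE or DTLS-SRTP, so it cannot complete a WebRTC handshake on its own; something (e.g. a media server or gateway) must first terminate WebRTC and re-emit unencrypted RTP. A hedged sketch of consuming such RTP (the file name stream.sdp is made up):

```
# Read plain (non-SRTP) RTP described by a local SDP file and dump it to Matroska.
# -protocol_whitelist is required by recent ffmpeg builds for sdp/rtp/udp input.
ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp -c copy dump.mkv
```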

Change playout delay in WebRTC stream

元气小坏坏 submitted on 2020-06-10 03:27:30
Question: I'm trying to cast a live MediaStream (eventually from the camera) from peerA to peerB, and I want peerB to receive the live stream in real time and then replay it with an added delay. Unfortunately it isn't possible to simply pause the stream and resume with play(), since that jumps forward to the live moment. So I have figured out that I can use MediaRecorder + SourceBuffer to rewatch the live stream: record the stream, append the buffers to MSE (SourceBuffer), and play it 5 seconds later. This…
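
No answer survives in the scrape, but the MediaRecorder + MSE approach the asker describes can be sketched roughly as below. The mime type and the 5-second figure are assumptions, `stream` stands for the received MediaStream, and a `<video id="delayed">` element is assumed to exist. (Chrome also exposes a non-standard `receiver.playoutDelayHint` on RTCRtpReceiver that can achieve a similar effect without re-buffering.)

```js
// Sketch: record the incoming live stream and feed it to an MSE SourceBuffer,
// starting playback ~5 s behind live.
const mime = 'video/webm; codecs="vp8,opus"';
const video = document.getElementById('delayed');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  const sb = mediaSource.addSourceBuffer(mime);
  const queue = [];
  // A SourceBuffer accepts one append at a time, so hold pending chunks in a queue.
  sb.addEventListener('updateend', () => {
    if (queue.length && !sb.updating) sb.appendBuffer(queue.shift());
  });

  const recorder = new MediaRecorder(stream, { mimeType: mime });
  recorder.ondataavailable = async (e) => {
    const buf = await e.data.arrayBuffer();
    if (sb.updating || queue.length) queue.push(buf);
    else sb.appendBuffer(buf);
  };
  recorder.start(1000); // emit a chunk every second

  setTimeout(() => video.play(), 5000); // begin playout ~5 s behind live
});
```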

Convert PCM wave data to numpy arrays and vice versa

浪尽此生 submitted on 2020-06-09 18:32:12
Question: The situation: I am using VAD (Voice Activity Detection) from WebRTC, via WebRTC-VAD, a Python adapter. The example implementation from the GitHub repo uses Python's wave module to read PCM data from files. Note that according to the comments, the module only works with mono audio and a sampling rate of either 8000, 16000 or 32000 Hz. What I want to do: read audio data from arbitrary audio files (MP3 and WAV files) with different sampling rates, and convert them into the PCM representation that…
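
The question targets Python/numpy, but the core conversion is the same everywhere: 16-bit signed PCM samples map to floats by dividing by 32768, and back by scaling, rounding and clamping. For consistency with the other examples on this page, a sketch in JavaScript with typed arrays (the source of the raw buffer is assumed):

```js
// Interpret a raw PCM buffer as 16-bit signed little-endian samples and
// normalize to floats in [-1, 1); reverse by scaling, rounding and clamping.
function pcm16ToFloat(arrayBuffer) {
  const ints = new Int16Array(arrayBuffer);
  return Float32Array.from(ints, (s) => s / 32768);
}

function floatToPcm16(floats) {
  return Int16Array.from(floats, (f) =>
    Math.max(-32768, Math.min(32767, Math.round(f * 32768))));
}
```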

How do you get around NATs using WebRTC without a TURN server?

两盒软妹~` submitted on 2020-05-28 04:48:39
Question: I'm trying to make a peer-to-peer JavaScript game that can be played on mobile browsers. I have been able to successfully set up a p2p connection between two phones within my local WiFi network, but I am unable to connect two phones over mobile networks, or one on WiFi and one on a mobile network. I tried turning off my Windows firewall and could not connect my PC to my phone on a mobile network. I tried having both peers set up their own data channels and set negotiated. I've read that 80% to 90%…
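
No answer is preserved here. The usual first diagnostic step is to configure a public STUN server and log which ICE candidate types actually gather: symmetric NATs (common on mobile carriers) still produce srflx candidates, but those often cannot pair without a TURN relay. A sketch (the Google STUN address is a commonly cited public server, not an endorsement):

```js
// Log each gathered ICE candidate's type: 'host', 'srflx' (STUN) or 'relay' (TURN).
// Without 'relay' candidates, peers behind symmetric NATs usually cannot connect.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
});
pc.onicecandidate = (e) => {
  if (e.candidate) console.log(e.candidate.type, e.candidate.candidate);
};
pc.createDataChannel('probe'); // ensure there is something to negotiate
pc.createOffer().then((offer) => pc.setLocalDescription(offer));
```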

How to enable h264 in peerconnection?

情到浓时终转凉″ submitted on 2020-05-27 03:58:49
Question: Many media reports say Firefox supports H264 in WebRTC, but I can't find any information. How do you enable H264 in WebRTC? This is my mediaConstraints:

var mediaConstraints = {
  video: { mandatory: { maxWidth: 640, maxHeight: 480 } },
  audio: true
};

Answer 1: It would seem that H264 is not supported by default yet. You will have to add the codec line manually into the SDP before setting it as local and sending the offer. It does look like they are hard at work to get it out soon. You can see this with the work…
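
The answer above describes hand-editing the SDP; current browsers expose a standard API for the same goal, RTCRtpTransceiver.setCodecPreferences, which postdates the question. A sketch of preferring H.264 without touching the SDP text:

```js
// Reorder the negotiable video codecs so H.264 is preferred, before createOffer.
// setCodecPreferences takes receiver capabilities, per the WebRTC spec.
const pc = new RTCPeerConnection();
const transceiver = pc.addTransceiver('video');
const codecs = RTCRtpReceiver.getCapabilities('video').codecs;
const h264First = [
  ...codecs.filter((c) => c.mimeType.toLowerCase() === 'video/h264'),
  ...codecs.filter((c) => c.mimeType.toLowerCase() !== 'video/h264'),
];
transceiver.setCodecPreferences(h264First);
```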

Flutter - webRTC Video Call signalling doesn't work

半城伤御伤魂 submitted on 2020-05-17 08:46:41
Question: I am able to implement voice and video calls using the agora.io library, which is available at https://www.agora.io/ and https://github.com/AgoraIO/Flutter-SDK. However, the process for starting a call is that both users have to join a particular channel name, defined manually or automatically by the user, which is not practical. Is there any way to create a separate signalling system (maybe using a Node.js socket, Firebase, or OneSignal notifications)? What's the simultaneous/parallel way to be used…
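
No answer is captured in the scrape. One common pattern is a tiny WebSocket relay that forwards call invitations (and, for plain WebRTC, SDP and candidates) between clients in the same room; the Flutter app would simply connect to it. A minimal Node.js sketch using the ws package, with deliberately naive room handling and invented message fields:

```js
// Minimal signaling relay: every message a client sends after joining a room
// is forwarded, unchanged, to all other clients in that room.
const { WebSocketServer } = require('ws');

const wss = new WebSocketServer({ port: 8080 });
const rooms = new Map(); // roomId -> Set of sockets

wss.on('connection', (ws) => {
  let roomId = null;
  ws.on('message', (data) => {
    const msg = JSON.parse(data);
    if (msg.type === 'join') {
      roomId = msg.room;
      if (!rooms.has(roomId)) rooms.set(roomId, new Set());
      rooms.get(roomId).add(ws);
      return;
    }
    // Relay invites/offers/answers/candidates to the other peers in the room.
    for (const peer of rooms.get(roomId) ?? []) {
      if (peer !== ws && peer.readyState === 1 /* OPEN */) peer.send(data.toString());
    }
  });
  ws.on('close', () => rooms.get(roomId)?.delete(ws));
});
```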