webrtc

How can you do WebRTC over a local network with no internet connection?

Submitted by £可爱£侵袭症+ on 2020-06-24 12:00:13
Question: I want to have two different computers open a static HTML page and be able to communicate with each other via WebRTC over a local area network. There is no internet connection to the outside world in this scenario. One of the PCs would be able to enter the IP address of the other PC manually and connect to it using that hardcoded IP. Is an ICE server necessary? If so, does the server itself need internet access to the outside world?

Answer 1: You do not need ICE servers in this case. In general, …
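
Below is a minimal sketch (not from the truncated answer above) of how a LAN-only connection can be set up in the browser: the RTCPeerConnection is created with an empty iceServers list, so only host candidates from the local network are gathered, and the SDP is assumed to be exchanged out of band (manual copy, a LAN-only WebSocket, etc.).

    // Minimal sketch: LAN-only peer, no STUN/TURN. The SDP produced here has to
    // be delivered to the other PC by some local means (manual copy, LAN socket).
    const pc = new RTCPeerConnection({ iceServers: [] });
    const channel = pc.createDataChannel('chat');
    channel.onopen = () => channel.send('hello over the LAN');

    async function createLocalOffer(): Promise<string> {
      await pc.setLocalDescription(await pc.createOffer());
      // Wait for ICE gathering to finish so the offer already contains the
      // host (local-network) candidates and no trickle signaling is needed.
      await new Promise<void>((resolve) => {
        if (pc.iceGatheringState === 'complete') { resolve(); return; }
        pc.addEventListener('icegatheringstatechange', () => {
          if (pc.iceGatheringState === 'complete') resolve();
        });
      });
      return pc.localDescription!.sdp;  // hand this string to the other PC
    }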

How to customize the WebRTC video source?

Submitted by 断了今生、忘了曾经 on 2020-06-23 06:02:06
Question: Does someone know how to change the WebRTC (https://cocoapods.org/pods/libjingle_peerconnection) video source? I am working on a screen-sharing app. At the moment, I retrieve the rendered frames in real time in a CVPixelBuffer. Does someone know how I could add my frames as the video source? Is it possible to set another video source instead of the camera device source? If yes, which format does the video have to be in and how do I do it? Thanks.

Answer 1: var connectionFactory : RTCPeerConnectionFactory = …

Android org.webrtc.VideoRenderer.I420Frame arrays to PreviewCallback.onPreviewFrame byte[]

Submitted by 穿精又带淫゛_ on 2020-06-22 22:42:11
Question: I keep hoping some code will appear on the internet, but I'm getting nowhere ;) WebRTC's incoming I420Frame object seems to have 3 arrays of yuvPlanes. A typical Android camera app gets PreviewCallback.onPreviewFrame byte[] as a single array of bytes. Can someone help me convert the I420Frame's yuvPlanes to a single byte[] array like PreviewCallback.onPreviewFrame byte[] YCbCr_420_SP (NV21)? For reference, VideoStreamsView.java has code to render to OpenGL, but I just want it like the camera …
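
For reference, a sketch of the byte re-packing itself, written here in TypeScript on plain Uint8Arrays since the layout question is language-agnostic. It assumes the Y, U and V planes have already been copied out of the frame's buffers and have no row padding (stride equal to width, respectively width/2). NV21 is the full Y plane followed by interleaved V and U samples.

    // Hypothetical helper: I420 (planar Y, U, V) -> NV21 (Y plane + interleaved VU).
    function i420ToNv21(y: Uint8Array, u: Uint8Array, v: Uint8Array,
                        width: number, height: number): Uint8Array {
      const ySize = width * height;
      const out = new Uint8Array(ySize * 3 / 2);
      out.set(y, 0);                         // NV21 starts with the full Y plane
      const chromaCount = (width / 2) * (height / 2);
      for (let i = 0; i < chromaCount; i++) {
        out[ySize + 2 * i] = v[i];           // V comes first in NV21 (VU order)
        out[ySize + 2 * i + 1] = u[i];
      }
      return out;
    }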

Ionic app crashes after adding a uses-permission to the config.xml file

Submitted by 纵然是瞬间 on 2020-06-17 09:13:25
Question: After adding the MODIFY_AUDIO_SETTINGS permission to config.xml, my app crashes after running.

    <edit-config file="AndroidManifest.xml" mode="merge" target="/manifest/uses-permission" xmlns:android="http://schemas.android.com/apk/res/android">
        <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS"/>
    </edit-config>

Then I run the app: ionic cordova run android --device

I also tried the Ionic Android permissions plugin:

    import { AndroidPermissions } from '@ionic-native/android-permissions/ngx';
    this …
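
The excerpt is cut off, but a sketch of what the runtime-permission attempt typically looks like with that plugin follows (hypothetical service and method names). Note that MODIFY_AUDIO_SETTINGS is a normal-level Android permission, so the manifest entry alone should normally be enough.

    // Sketch only: hypothetical Ionic/Angular service using the plugin imported above.
    import { Injectable } from '@angular/core';
    import { AndroidPermissions } from '@ionic-native/android-permissions/ngx';

    @Injectable({ providedIn: 'root' })
    export class AudioPermissionService {
      constructor(private androidPermissions: AndroidPermissions) {}

      // MODIFY_AUDIO_SETTINGS is a "normal" permission, so this runtime request is
      // usually a no-op; shown here only because the question attempts it.
      async ensureAudioSettingsPermission(): Promise<void> {
        const perm = 'android.permission.MODIFY_AUDIO_SETTINGS';
        const status = await this.androidPermissions.checkPermission(perm);
        if (!status.hasPermission) {
          await this.androidPermissions.requestPermission(perm);
        }
      }
    }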

Is it possible to capture from an element with cross-origin data?

Submitted by 不想你离开。 on 2020-06-16 09:18:41
Question: I have this simple script that I found in the WebRTC docs. I tried to run it, but it seems I'm missing something.

    const leftVideo = document.getElementById('leftVideo');
    const rightVideo = document.getElementById('rightVideo');
    leftVideo.addEventListener('canplay', () => {
      const stream = leftVideo.captureStream();
      rightVideo.srcObject = stream;
    });

When I inspect it, I get this error on the stream capture: Uncaught DOMException: Failed to execute 'captureStream' on 'HTMLMediaElement': Cannot capture …
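
The truncated error means the element has been "tainted" by cross-origin media. A minimal sketch of the usual fix, assuming the media host actually sends CORS headers (the URL below is hypothetical): mark the element with crossorigin="anonymous" before the source loads, after which captureStream() is allowed.

    // Sketch: avoid tainting the element so captureStream() does not throw.
    const leftVideo = document.getElementById('leftVideo') as HTMLVideoElement;
    const rightVideo = document.getElementById('rightVideo') as HTMLVideoElement;

    leftVideo.crossOrigin = 'anonymous';                 // must be set before src loads
    leftVideo.src = 'https://example.com/video.webm';    // hypothetical CORS-enabled host

    leftVideo.addEventListener('canplay', () => {
      // captureStream() is not yet in the standard TypeScript DOM typings.
      const stream: MediaStream = (leftVideo as any).captureStream();
      rightVideo.srcObject = stream;
    });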

Multi-party WebRTC without SFU

Submitted by 白昼怎懂夜的黑 on 2020-06-13 08:01:17
Question: Based on this article, when implementing a WebRTC solution without a server (I assume this means without an SFU), the bottleneck is that only 4-6 participants can be supported. Is there a solution that can work around this? For example, I just want to use Firebase as the only backend, mainly for signaling, and no SFU. What is the general implementation strategy to achieve at least 25-50 participants in WebRTC? Update: This GitHub project makes a different claim. It states "A full mesh is great for up to ~100 …
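
The answer is not shown, but the usual serverless strategy is a full mesh: every participant opens one RTCPeerConnection per remote peer, and the signaling backend (Firebase here) only relays offers, answers and ICE candidates. A rough sketch, with the signaling transport left abstract:

    // Full-mesh sketch: one RTCPeerConnection per remote participant. Upload
    // bandwidth and encoding cost grow with the number of peers, which is the
    // 4-6 participant bottleneck the article describes.
    const peers = new Map<string, RTCPeerConnection>();

    function connectToPeer(peerId: string, localStream: MediaStream,
                           sendSignal: (to: string, msg: unknown) => void): RTCPeerConnection {
      const pc = new RTCPeerConnection({ iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] });
      localStream.getTracks().forEach((track) => pc.addTrack(track, localStream));
      pc.onicecandidate = (e) => { if (e.candidate) sendSignal(peerId, { candidate: e.candidate }); };
      pc.ontrack = (e) => { /* attach e.streams[0] to a video element for this peer */ };
      peers.set(peerId, pc);
      return pc;
    }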

Use WebRTC/GetUserMedia stream as input for FFMPEG

Submitted by ♀尐吖头ヾ on 2020-06-12 06:15:28
Question: I'm recording my screen with getUserMedia and getting my video & audio stream. I'm then using WebRTC to send/receive this stream on another device. Is there any way I can then use this incoming WebRTC stream as an input for ffmpeg by converting it somehow? Everything I'm working with is in JavaScript. Thanks in advance.

Answer 1: ffmpeg doesn't have WebRTC support (yet), but you have a couple of other options that I know of. Pion WebRTC (here is an example of saving video to disk), and Amazon Kinesis has an …
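
One JavaScript-side workaround (not one of the options named in the answer): record the incoming MediaStream with MediaRecorder and ship the WebM chunks to a server, which can pipe them into ffmpeg's stdin (for example: ffmpeg -i pipe:0 out.mp4). A sketch, with a hypothetical WebSocket relay endpoint:

    // Sketch: turn a live MediaStream into a byte stream ffmpeg can consume.
    function relayStreamToServer(remoteStream: MediaStream, wsUrl: string): MediaRecorder {
      const ws = new WebSocket(wsUrl);   // hypothetical relay that feeds ffmpeg stdin
      const recorder = new MediaRecorder(remoteStream, {
        mimeType: 'video/webm;codecs=vp8,opus',
      });
      recorder.ondataavailable = (e) => {
        if (e.data.size > 0 && ws.readyState === WebSocket.OPEN) ws.send(e.data);
      };
      recorder.start(1000);              // emit one WebM chunk per second
      return recorder;
    }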