WebRTC iOS Audio Chat

*Submitted by 爱你&永不变心 on 2020-03-22 08:09:11*

Question


I am creating a voice only (no video) chat application. I have created my own node.js/socket.io based server for signaling.

For WebRTC, I am using the following pod: https://cocoapods.org/pods/WebRTC

I have successfully created the peer connection, added the local stream, set the local/remote SDP, and exchanged ICE candidates. The "didAddStream" delegate method is also called with a stream containing audio tracks, but I am stuck there. I don't know what I should do with the audio track. What should the next step be? How do I send/receive audio on both sides?
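For context, the audio-only setup described above might look like the following sketch, assuming the GoogleWebRTC pod's Swift API (identifiers such as `"audio0"` and `"stream0"` are arbitrary placeholders):

```swift
import WebRTC

// Hypothetical sketch of the audio-only setup the question describes.
let factory = RTCPeerConnectionFactory()

let config = RTCConfiguration()
config.iceServers = [RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"])]

let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                      optionalConstraints: nil)

// delegate would be your class implementing RTCPeerConnectionDelegate,
// which forwards SDP/ICE to the socket.io signaling server.
let peerConnection = factory.peerConnection(with: config,
                                            constraints: constraints,
                                            delegate: nil)

// Voice-only: create an audio track and no video track.
let audioSource = factory.audioSource(with: constraints)
let audioTrack = factory.audioTrack(with: audioSource, trackId: "audio0")

let localStream = factory.mediaStream(withStreamId: "stream0")
localStream.addAudioTrack(audioTrack)
peerConnection.add(localStream)
```

Once the offer/answer exchange completes over signaling, the remote side's `didAddStream` fires with the corresponding remote stream.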

Also, if I integrate CallKit, what changes do I need to make?


Answer 1:


I got stuck on this one too. You have to retain the RTCMediaStream object in order for the audio to play. You don't need to do anything with the RTCAudioTrack; it will play automatically. I simply assign it to a property so it gets retained. See my example here: https://github.com/redfearnk/WebRTCVideoChat/blob/master/WebRTCVideoChat/WebRTCClient.swift#L143
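The retain-the-stream idea above can be sketched roughly like this, assuming the GoogleWebRTC pod's Swift delegate signature (the class name is hypothetical, and the other required `RTCPeerConnectionDelegate` methods are omitted for brevity):

```swift
import WebRTC

final class AudioChatClient: NSObject {
    // Strong reference: if the remote RTCMediaStream is deallocated,
    // the remote audio stops playing.
    private var remoteStream: RTCMediaStream?
}

extension AudioChatClient: RTCPeerConnectionDelegate {
    func peerConnection(_ peerConnection: RTCPeerConnection,
                        didAdd stream: RTCMediaStream) {
        // Retaining the stream is all that is needed; the RTCAudioTrack
        // routes to the device's audio output automatically.
        remoteStream = stream
        stream.audioTracks.first?.isEnabled = true
    }

    // ... remaining RTCPeerConnectionDelegate methods omitted ...
}
```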



Source: https://stackoverflow.com/questions/46224358/webrtc-ios-audio-chat
