Question
For a while now I have been trying to figure out how to stream real-time audio from Data delivered by a URLSessionDataTask in iOS.
I declared a custom class to manage the player actions, and it looks like this:
import UIKit
import AVFoundation

class AudioDataPlayer: NSObject {

    //MARK:- Variables

    //MARK: Constants
    enum Status {
        case playing
        case notPlaying
    }

    let audioPlayerQueue = DispatchQueue(label: "audioPlayerQueue", qos: DispatchQoS.userInteractive)

    //MARK: Vars
    private(set) var currentStatus: Status = .notPlaying
    private var audioEngine: AVAudioEngine = AVAudioEngine()
    private var streamingAudioPlayerNode: AVAudioPlayerNode = AVAudioPlayerNode()
    private(set) var streamingAudioFormat: AVAudioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 48000, channels: 2, interleaved: false)!

    //MARK:- Constructor
    override init() {
        super.init()
    }

    //MARK:- Private methods

    //MARK:- Public methods
    func processData(_ data: Data) throws {
        if currentStatus == .notPlaying {
            do {
                try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: [.allowAirPlay])
                try AVAudioSession.sharedInstance().setActive(true)
                if #available(iOS 11.0, *) {
                    try audioEngine.enableManualRenderingMode(.realtime, format: streamingAudioFormat, maximumFrameCount: 3072)
                }
                audioEngine.attach(streamingAudioPlayerNode)
                audioEngine.connect(streamingAudioPlayerNode, to: audioEngine.mainMixerNode, format: streamingAudioFormat)
                currentStatus = .playing
            }
            catch {
                print("\(logClassName) ERROR -> \(error.localizedDescription)")
            }
        }

        audioPlayerQueue.async {
            if let audioPCMBuffer = data.makePCMBuffer(format: self.streamingAudioFormat) {
                self.streamingAudioPlayerNode.scheduleBuffer(audioPCMBuffer, completionHandler: {
                    //TODO
                })
                if !self.audioEngine.isRunning {
                    try! self.audioEngine.start()
                    self.streamingAudioPlayerNode.play()
                }
            }
            else {
                print("\(self.logClassName) TEST -> Ignoring data to play ...")
            }
        }
    }

    func stop() {
        audioEngine.stop()
        audioEngine.detach(streamingAudioPlayerNode)
        currentStatus = .notPlaying
    }
}
The function that manages the incoming data is processData(_ data: Data), and it is called like this from another class:
let processingQueue = DispatchQueue(label: "processingQueue", qos: DispatchQoS.userInteractive)
var audioDataPlayer: AudioDataPlayer = AudioDataPlayer()

func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
    processingQueue.async {
        try! self.audioDataPlayer.processData(data)
    }
}
I put this code together from forums and the Apple documentation website. However, maybe I still don't quite understand how it works, because there is no sound coming from the device...
The audio data is 48 kHz, 16-bit, 2-channel.
Any ideas?
Answer 1:
If your audio data is 16-bit (integer, one would assume), you need to initialize the AVAudioFormat with pcmFormatInt16 instead of pcmFormatFloat32.
And being non-interleaved seems a bit odd for this format, so you may have to set interleaved to true.
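A minimal sketch of what that fix could look like. The makePCMBuffer(format:) helper mirrors the extension named in the question, whose implementation was not shown, so this version is an assumption about how interleaved 16-bit data would be copied into a buffer; the format parameters (48 kHz, stereo) come from the question:

import AVFoundation

// Format matching the raw stream: 16-bit integer samples, interleaved stereo at 48 kHz.
let streamingAudioFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                         sampleRate: 48000,
                                         channels: 2,
                                         interleaved: true)!

extension Data {
    // Hypothetical reconstruction of the question's makePCMBuffer(format:) helper.
    func makePCMBuffer(format: AVAudioFormat) -> AVAudioPCMBuffer? {
        // For interleaved Int16 stereo this is 4 bytes per frame (2 channels x 2 bytes each).
        let bytesPerFrame = Int(format.streamDescription.pointee.mBytesPerFrame)
        guard bytesPerFrame > 0, count % bytesPerFrame == 0 else { return nil }
        let frameCount = AVAudioFrameCount(count / bytesPerFrame)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return nil }
        buffer.frameLength = frameCount
        withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
            // In an interleaved format, all samples live in the single channel-0 buffer.
            if let src = raw.baseAddress, let dst = buffer.int16ChannelData?[0] {
                memcpy(dst, src, Int(frameCount) * bytesPerFrame)
            }
        }
        return buffer
    }
}

With the format matching the incoming bytes, each received Data chunk divides evenly into whole frames, so scheduleBuffer receives buffers that actually describe the audio being streamed.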
Source: https://stackoverflow.com/questions/54670067/avaudioengine-no-sound