Generating a tone in iOS with 16-bit PCM, AVAudioEngine.connect() throws AUSetFormat: error -10868

The problem was that when falling out of the play() function, the player was getting cleaned up and never completed (or barely started) playing. Here's one fairly clumsy solution to that: sleep for as long as the sample before returning from play().

If anyone posts a better answer that avoids this by keeping the player from being cleaned up, I'll accept it.

import AVFoundation

class AudioManager: NSObject, AVAudioPlayerDelegate {

    let audioPlayerNode = AVAudioPlayerNode()

    var waveAudioPlayer: AVAudioPlayer?

    var playing = false

    lazy var audioEngine: AVAudioEngine = {
        let engine = AVAudioEngine()

        // Must happen only once.
        engine.attachNode(self.audioPlayerNode)

        return engine
    }()

    func playWaveFromBundle(filename: String, durationInSeconds: NSTimeInterval) -> Void {
        var error: NSError?
        let sound = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource(filename, ofType: "wav")!)

        // AVAudioPlayer reports failures via the error out-parameter, so check it
        // after the initializer runs.
        self.waveAudioPlayer = AVAudioPlayer(contentsOfURL: sound, error: &error)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        self.waveAudioPlayer!.delegate = self

        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: &error)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        log.verbose("Playing \(sound)")

        self.waveAudioPlayer!.prepareToPlay()

        playing = true

        if !self.waveAudioPlayer!.play() {
            log.error("Failed to play")
        }

        // If we don't block here, the player stops as soon as this function returns. While we'd prefer to wait for audioPlayerDidFinishPlaying() to be called here, it's never called if we block here. Instead, pass in the duration of the wave file and simply sleep for that long.
        /*
        while playing {
            NSThread.sleepForTimeInterval(0.1) // seconds
        }
        */

        NSThread.sleepForTimeInterval(durationInSeconds)

        log.verbose("Done")
    }

    func play(frequency: Int, durationInMillis: Int, completionBlock: dispatch_block_t!) -> Void {
        var session = AVAudioSession.sharedInstance()
        var error: NSError?

        if !session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error) {
            log.error("Error: \(error)")
            return
        }

        var mixer = audioEngine.mainMixerNode
        var sampleRateHz: Float = Float(mixer.outputFormatForBus(0).sampleRate)
        var numberOfSamples = AVAudioFrameCount((Float(durationInMillis) / 1000 * sampleRateHz))

        var format = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: Double(sampleRateHz), channels: AVAudioChannelCount(1), interleaved: false)

        var buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: numberOfSamples)
        buffer.frameLength = numberOfSamples

        // Generate sine wave
        for var i = 0; i < Int(buffer.frameLength); i++ {
            var val = sinf(Float(frequency) * Float(i) * 2 * Float(M_PI) / sampleRateHz)

            // log.debug("val: \(val)")

            buffer.floatChannelData.memory[i] = val * 0.5
        }

        AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, error: &error)

        if error != nil {
            log.error("Error: \(error)")
            return
        }

        // Audio engine
        audioEngine.connect(audioPlayerNode, to: mixer, format: format)

        log.debug("Sample rate: \(sampleRateHz), samples: \(numberOfSamples), format: \(format)")

        if !audioEngine.startAndReturnError(&error) {
            log.error("Error: \(error)")
            return
        }

        // TODO: Check we're not in the background. Attempting to play audio while in the background throws:
        //   *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error 561015905'

        // Play player and schedule buffer
        audioPlayerNode.play()
        audioPlayerNode.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: completionBlock)

        // If we don't block here, the player stops as soon as this function returns.
        NSThread.sleepForTimeInterval(Double(durationInMillis) / 1000.0) // convert milliseconds to seconds
    }

    // MARK: AVAudioPlayerDelegate

    func audioPlayerDidFinishPlaying(player: AVAudioPlayer!, successfully flag: Bool) {
        log.verbose("Success: \(flag)")

        playing = false
    }

    func audioPlayerDecodeErrorDidOccur(player: AVAudioPlayer!, error: NSError!) {
        log.verbose("Error: \(error)")

        playing = false
    }

    // MARK: NSObject overrides

    deinit {
        log.verbose("deinit")
    }

}

For context, this AudioManager is a lazily loaded property on my AppDelegate:

lazy var audioManager: AudioManager = {
    return AudioManager()
}()

Try setting your session category to AVAudioSessionCategoryPlayback or AVAudioSessionCategoryPlayAndRecord. I'm using record and playback, and calling it before the recording seems to work out fine. I'm guessing it has to go before you start connecting nodes.

var session = AVAudioSession.sharedInstance()
session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error)
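
A slightly fuller sketch of that ordering, using the question's names (audioEngine, audioPlayerNode, format, log) and the same iOS 8-era error-out-parameter API used elsewhere on this page:

var error: NSError?
let session = AVAudioSession.sharedInstance()

// Set the category first...
if !session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: &error) {
    log.error("Error: \(error)")
}

// ...and only then connect nodes.
audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: format)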

Regarding the issue of not getting sound, even when using PCMFormatFloat32:

I've wrestled with the same issue for a few days now and finally found the (or at least one) problem: you need to manually set the frameLength of the PCM buffer:

pcmBuffer.frameLength = AVAudioFrameCount(sound.count/2)

The division by two accounts for the two bytes per frame (16 bits encoded in two bytes).

Besides that, another change I made, though I don't yet know whether it matters, was making the AVAudioEngine and the AVAudioPlayerNode members of the class, so that they aren't destroyed before playback ends.
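
To make that concrete, here is a minimal sketch of filling a Float32 buffer from raw 16-bit samples. The specifics are assumptions on my part: sound is taken to be a [UInt8] of mono, little-endian, 16-bit PCM at the target sample rate, and the Int16-to-Float scaling is mine. Connecting the player node with a 16-bit integer format is typically what produces error -10868 (kAudioUnitErr_FormatNotSupported), so the connection format stays Float32 and the samples are converted instead:

let format = AVAudioFormat(commonFormat: .PCMFormatFloat32, sampleRate: 44100.0, channels: 1, interleaved: false)
let frameCount = AVAudioFrameCount(sound.count / 2) // two bytes per 16-bit mono frame
let pcmBuffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: frameCount)

// Without this, the buffer reports zero valid frames and plays silence.
pcmBuffer.frameLength = frameCount

for i in 0..<Int(frameCount) {
    // Reassemble each little-endian Int16, then scale it into [-1.0, 1.0].
    let raw = UInt16(sound[2 * i]) | (UInt16(sound[2 * i + 1]) << 8)
    pcmBuffer.floatChannelData.memory[i] = Float(Int16(bitPattern: raw)) / Float(Int16.max)
}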

I was encountering the same behaviour as you, meaning I was also helping myself with NSThread.sleepForTimeInterval(). I've now figured out a solution that works for me: the AVAudioEngine object needs to be initialised outside the play() function, at the class level, so the engine can keep working and play the sound even after the function returns (which happens immediately). As soon as I moved the line that initialises the AVAudioEngine, the sound could be heard without the sleeping "helper". Hope it helps you.
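
A minimal sketch of that arrangement (the class name ToneManager is hypothetical, and the buffer is assumed to be built elsewhere, e.g. as in the question's play()):

import AVFoundation

class ToneManager {
    // Class-level properties, so the engine and node outlive any single call to play().
    let audioEngine = AVAudioEngine()
    let audioPlayerNode = AVAudioPlayerNode()

    init() {
        audioEngine.attachNode(audioPlayerNode)
    }

    func play(buffer: AVAudioPCMBuffer) {
        audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: buffer.format)

        var error: NSError?
        if !audioEngine.startAndReturnError(&error) {
            println("Error: \(error)")
            return
        }

        // No sleeping needed: the engine is not deallocated when this function
        // returns, so the scheduled buffer plays out on its own.
        audioPlayerNode.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: nil)
        audioPlayerNode.play()
    }
}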

To get the right number of samples (numberOfSamples): mixer.outputFormatForBus(0).sampleRate gives back 44100.0. Also note that in the second example, durationInMillis must be divided by 1000, not multiplied by 1000, when converting to seconds for the final sleep.
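
For example, with hypothetical values of 500 ms at the default 44.1 kHz output rate:

let sampleRateHz: Float = 44100.0  // what mixer.outputFormatForBus(0).sampleRate reports
let durationInMillis = 500
let numberOfSamples = AVAudioFrameCount(Float(durationInMillis) / 1000 * sampleRateHz) // 22050 frames
NSThread.sleepForTimeInterval(Double(durationInMillis) / 1000.0) // 0.5 seconds: divide, don't multiply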

To me, calling play() first and then scheduleBuffer() on the player node seems illogical. I would do the reverse, as shown below.
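
Using the same names as the question's play(), the reversed order looks like this:

// Schedule the buffer first, then start the node.
audioPlayerNode.scheduleBuffer(buffer, atTime: nil, options: nil, completionHandler: completionBlock)
audioPlayerNode.play()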
