AVAudioEngine playing multi channel audio

front-end · unresolved · 2 answers · 536 views
忘掉有多难 asked 2020-12-15 14:40

Simple question: how do I play multi-channel audio files (more than 2 channels) with AVAudioEngine so that all channels are audible on the default 2-channel output (headphones/speakers)?

2 Answers
  • 2020-12-15 15:17

    So here's what I managed so far. It's far from perfect but it somewhat works.

    To get all channels you need to use AVAudioPCMBuffer and store two channels from the file in each. You also need a separate AVAudioPlayerNode for each channel pair; then just connect each player to an AVAudioMixerNode and we're done. Some simple code for 6-channel audio:

    AVAudioFile *file = [[AVAudioFile alloc] initForReading:[[NSBundle mainBundle] URLForResource:@"nums6ch" withExtension:@"wav"] error:nil];
    AVAudioFormat *outputFormat = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatFloat32 sampleRate:file.processingFormat.sampleRate channels:2 interleaved:false];
    // Read the whole deinterleaved 6-channel file into one buffer.
    AVAudioPCMBuffer *wholeBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
    [file readIntoBuffer:wholeBuffer error:nil];
    AVAudioPCMBuffer *buffer1 = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)file.length];
    AVAudioPCMBuffer *buffer2 = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)file.length];
    AVAudioPCMBuffer *buffer3 = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)file.length];
    // Copy each pair of source channels into its own stereo buffer.
    memcpy(buffer1.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize);
    memcpy(buffer1.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[1].mDataByteSize);
    buffer1.frameLength = wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize / sizeof(Float32);
    memcpy(buffer2.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[2].mData, wholeBuffer.audioBufferList->mBuffers[2].mDataByteSize);
    memcpy(buffer2.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[3].mData, wholeBuffer.audioBufferList->mBuffers[3].mDataByteSize);
    buffer2.frameLength = wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize / sizeof(Float32);
    memcpy(buffer3.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[4].mData, wholeBuffer.audioBufferList->mBuffers[4].mDataByteSize);
    memcpy(buffer3.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[5].mData, wholeBuffer.audioBufferList->mBuffers[5].mDataByteSize);
    buffer3.frameLength = wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize / sizeof(Float32);
    
    AVAudioEngine *engine = [[AVAudioEngine alloc] init];
    AVAudioPlayerNode *player1 = [[AVAudioPlayerNode alloc] init];
    AVAudioPlayerNode *player2 = [[AVAudioPlayerNode alloc] init];
    AVAudioPlayerNode *player3 = [[AVAudioPlayerNode alloc] init];
    AVAudioMixerNode *mixer = [[AVAudioMixerNode alloc] init];
    [engine attachNode:player1];
    [engine attachNode:player2];
    [engine attachNode:player3];
    [engine attachNode:mixer];
    [engine connect:player1 to:mixer format:outputFormat];
    [engine connect:player2 to:mixer format:outputFormat];
    [engine connect:player3 to:mixer format:outputFormat];
    [engine connect:mixer to:engine.outputNode format:outputFormat];
    [engine startAndReturnError:nil];
    
    [player1 scheduleBuffer:buffer1 completionHandler:nil];
    [player2 scheduleBuffer:buffer2 completionHandler:nil];
    [player3 scheduleBuffer:buffer3 completionHandler:nil];
    [player1 play];
    [player2 play];
    [player3 play];
    

    Now this solution is far from perfect, since there will be a delay between the pairs of channels because play is called on each player at a different time. I also still can't play 8-channel audio from my test files (see the link in the OP). The AVAudioFile's processingFormat reports 0 for the channel count, and even if I create my own format with the correct number of channels and layout, I get an error on the buffer read. Note that I can play this file perfectly fine using AUGraph.
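    One way to reduce that start-time skew is to schedule every player to begin at the same future time with play(at:) instead of calling play() back to back. A hedged Swift sketch; startInSync and the 0.1 s lead time are illustrative choices of mine, not from the original answer (and per the EDIT below, Apple confirmed a sync bug, so this may still not be sample-accurate):

    ```swift
    import AVFoundation

    // Start several attached, already-scheduled player nodes at one
    // shared time derived from the first player's own clock.
    func startInSync(_ players: [AVAudioPlayerNode], lead: TimeInterval = 0.1) {
        guard let renderTime = players.first?.lastRenderTime,
              let now = players.first?.playerTime(forNodeTime: renderTime) else {
            // Engine not rendering yet: fall back to best-effort immediate start.
            players.forEach { $0.play() }
            return
        }
        // Pick a start point slightly in the future so every node can make it.
        let startSample = now.sampleTime + AVAudioFramePosition(lead * now.sampleRate)
        let startTime = AVAudioTime(sampleTime: startSample, atRate: now.sampleRate)
        players.forEach { $0.play(at: startTime) }
    }
    ```

    With the code above you would replace the three play calls with startInSync([player1, player2, player3]).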

    So I will wait before accepting this answer; if you have a better solution, please share.

    EDIT

    So it appears that both the node-synchronization problem and the inability to play this particular 8-channel audio are bugs (confirmed by Apple developer support).

    So a little advice for people meddling with audio on iOS: while AVAudioEngine is fine for simple stuff, you should definitely go for AUGraph for more complicated stuff, even stuff that's supposed to work with AVAudioEngine. And if you don't know how to replicate certain things from AVAudioEngine in AUGraph (like myself), well, tough luck.

  • 2020-12-15 15:26

    I had a similar issue in Swift. The error was 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: !nodeimpl->SslEngineImpl()'.

    The task was to play two audio files, one after another. If I hit stop after playing the first audio file and then played the second audio file, the system crashed.

    I found that a function I created, which was called for every playback, contained audioEngine.attachNode(audioPlayerNode), so the node was being attached to the audioEngine again on each call. I moved this attachment into viewDidLoad() so that it happens only once, and the crash went away.
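    The fix described above can be sketched as follows. This is a minimal Swift illustration using the current API names attach(_:) and scheduleFile(_:at:); PlayerViewController and play(file:) are hypothetical names, not from the answer:

    ```swift
    import AVFoundation
    import UIKit

    class PlayerViewController: UIViewController {
        let audioEngine = AVAudioEngine()
        let audioPlayerNode = AVAudioPlayerNode()

        override func viewDidLoad() {
            super.viewDidLoad()
            // One-time graph setup: attach and connect the player exactly once.
            audioEngine.attach(audioPlayerNode)
            audioEngine.connect(audioPlayerNode,
                                to: audioEngine.mainMixerNode,
                                format: nil)
            try? audioEngine.start()
        }

        // Called once per file: only stops, schedules, and plays.
        // It never re-attaches the node, which is what caused the crash.
        func play(file: AVAudioFile) {
            audioPlayerNode.stop()
            audioPlayerNode.scheduleFile(file, at: nil)
            audioPlayerNode.play()
        }
    }
    ```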
