AVPlayer playback of single channel audio stereo->mono

生来就可爱ヽ(ⅴ<●) submitted on 2019-11-29 07:54:20

I finally found an answer to this question - at least for deployment on iOS 6 and later. You can add an MTAudioProcessingTap to your existing AVPlayerItem and copy the selected channel's samples to the other channel in your process callback function. Here is a great tutorial explaining the basics: http://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/

This is my code so far, mostly copied from the link above.

During AVPlayer setup I assign callback functions for audio processing:

MTAudioProcessingTapCallbacks callbacks;
callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
callbacks.clientInfo = (__bridge void *)(self); // bridge cast required under ARC
callbacks.init = init;
callbacks.prepare = prepare;
callbacks.process = process;
callbacks.unprepare = unprepare;
callbacks.finalize = finalize;

MTAudioProcessingTapRef tap;
// The create function makes a copy of our callbacks struct
OSStatus err = MTAudioProcessingTapCreate(kCFAllocatorDefault, &callbacks,
                                          kMTAudioProcessingTapCreationFlag_PostEffects, &tap);
if (err || !tap) {
    NSLog(@"Unable to create the Audio Processing Tap");
    return;
}

// Create input parameters for the audio track you want to process
// (audioTrack is the AVAssetTrack of the player item's asset)
// and assign the tap to them
AVMutableAudioMixInputParameters *audioInputParam =
    [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack];
audioInputParam.audioTapProcessor = tap;

// Create a new AVAudioMix and assign it to our AVPlayerItem
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = @[audioInputParam];
playerItem.audioMix = audioMix;

Here are the audio processing functions (process is the only one that does real work here):

#pragma mark Audio Processing

void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    NSLog(@"Initialising the Audio Tap Processor");
    *tapStorageOut = clientInfo;
}

void finalize(MTAudioProcessingTapRef tap) {
    NSLog(@"Finalizing the Audio Tap Processor");
}

void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat) {
    NSLog(@"Preparing the Audio Tap Processor");
}

void unprepare(MTAudioProcessingTapRef tap) {
    NSLog(@"Unpreparing the Audio Tap Processor");
}

void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
         MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
         CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut) {
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut);
    if (err) NSLog(@"Error from GetSourceAudio: %d", (int)err);

    SIVSViewController *self = (__bridge SIVSViewController *)MTAudioProcessingTapGetStorage(tap);

    // Duplicate the selected channel's samples into the other channel.
    // Note: gating this on the truthiness of self.selectedChannel (as in
    // my first attempt) makes the channel == 0 branch unreachable, so
    // branch on the value directly instead.
    NSInteger channel = self.selectedChannel;

    if (channel == 0) {
        bufferListInOut->mBuffers[1].mData = bufferListInOut->mBuffers[0].mData;
    } else {
        bufferListInOut->mBuffers[0].mData = bufferListInOut->mBuffers[1].mData;
    }
}