iPhone: AudioBufferList init and release

Here is how I do it:

#include <AudioToolbox/AudioToolbox.h>
#include <cstddef>
#include <cstdlib>

AudioBufferList *
AllocateABL(UInt32 channelsPerFrame, UInt32 bytesPerFrame, bool interleaved, UInt32 capacityFrames)
{
    AudioBufferList *bufferList = NULL;

    // Interleaved data lives in a single buffer containing every channel;
    // non-interleaved (planar) data uses one single-channel buffer per channel.
    UInt32 numBuffers = interleaved ? 1 : channelsPerFrame;
    UInt32 channelsPerBuffer = interleaved ? channelsPerFrame : 1;

    // AudioBufferList is a variable-length struct, so allocate the header plus
    // one AudioBuffer per buffer.
    bufferList = static_cast<AudioBufferList *>(calloc(1, offsetof(AudioBufferList, mBuffers) + (sizeof(AudioBuffer) * numBuffers)));

    bufferList->mNumberBuffers = numBuffers;

    // Allocate zeroed sample storage for each buffer.
    for(UInt32 bufferIndex = 0; bufferIndex < bufferList->mNumberBuffers; ++bufferIndex) {
        bufferList->mBuffers[bufferIndex].mData = static_cast<void *>(calloc(capacityFrames, bytesPerFrame));
        bufferList->mBuffers[bufferIndex].mDataByteSize = capacityFrames * bytesPerFrame;
        bufferList->mBuffers[bufferIndex].mNumberChannels = channelsPerBuffer;
    }

    return bufferList;
}
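
Since the question is about both init and release, the matching teardown walks the same structure in reverse. A minimal sketch (the name DeallocateABL is just illustrative, not part of the original code):

void
DeallocateABL(AudioBufferList *bufferList)
{
    if(NULL == bufferList)
        return;

    // Free each buffer's sample storage first...
    for(UInt32 bufferIndex = 0; bufferIndex < bufferList->mNumberBuffers; ++bufferIndex)
        free(bufferList->mBuffers[bufferIndex].mData);

    // ...then the AudioBufferList header allocated by AllocateABL.
    free(bufferList);
}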

First of all, I think you actually want 3 AudioBufferLists, not one AudioBufferList with 3 AudioBuffer members. An AudioBuffer represents a single channel of data, so if you have 3 stereo audio files you should put them in 3 AudioBufferLists, with each list holding 2 AudioBuffers: one buffer for the left channel and one for the right. Your code would then process each list (and its respective channel data) separately, and you could store the lists in an NSArray or something like that.

Technically, there's no reason you can't have a single buffer list with 3 interleaved audio channels (meaning that both the left & right channel are stored in a single buffer of data), but this goes against the conventional use of the API and will be a bit confusing.
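
If you did go that route, it would look something like this rough sketch (numFrames is a placeholder frame count, not from the original answer):

int numFrames = 123456;
AudioBufferList interleavedList;
interleavedList.mNumberBuffers = 1;                        // one buffer holds all channels
interleavedList.mBuffers[0].mNumberChannels = 2;           // interleaved stereo: LRLRLR...
interleavedList.mBuffers[0].mDataByteSize = numFrames * 2 * sizeof(Float32);
interleavedList.mBuffers[0].mData = malloc(numFrames * 2 * sizeof(Float32));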

Anyway, this part of the Core Audio API is more C-ish than Objective-C-ish, so you use malloc/free instead of alloc/release. The code would look something like this:

#include <AudioToolbox/AudioToolbox.h>
#include <stdlib.h>

#define kNumChannels 2
int numSamples = 123456; // Number of sample frames in each buffer

// AudioBufferList already contains one AudioBuffer, so only the extra
// channels need additional space.
AudioBufferList *bufferList = (AudioBufferList*)malloc(sizeof(AudioBufferList) + (kNumChannels - 1) * sizeof(AudioBuffer));
bufferList->mNumberBuffers = kNumChannels; // 2 for stereo, 1 for mono
for(int i = 0; i < kNumChannels; i++) {
  bufferList->mBuffers[i].mNumberChannels = 1;
  bufferList->mBuffers[i].mDataByteSize = numSamples * sizeof(Float32);
  bufferList->mBuffers[i].mData = (Float32*)malloc(sizeof(Float32) * numSamples);
}

// Do stuff...

// Release in the reverse order: each channel's sample data, then the list itself.
for(int i = 0; i < kNumChannels; i++) {
  free(bufferList->mBuffers[i].mData);
}
free(bufferList);

The above code assumes that you are reading in the data as floating point. If you aren't doing any special processing on the files, it's more efficient to read them in as SInt16 (raw PCM data), since floating-point math is slow on early iPhone hardware.
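
The allocation loop above would then look the same with only the sample type swapped, something like this (same placeholder values as before):

for(int i = 0; i < kNumChannels; i++) {
  bufferList->mBuffers[i].mNumberChannels = 1;
  bufferList->mBuffers[i].mDataByteSize = numSamples * sizeof(SInt16);
  bufferList->mBuffers[i].mData = (SInt16*)malloc(sizeof(SInt16) * numSamples);
}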

Also, if you aren't using the list outside of a single method, then it makes more sense to allocate it on the stack instead of the heap by declaring it as a regular variable rather than a pointer. You still need to malloc() the actual mData member of each AudioBuffer, but at least you don't need to worry about free()'ing the AudioBufferList itself.
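
A minimal sketch of that pattern (the function name ProcessFile is hypothetical, and note that AudioBufferList as declared only has room for one AudioBuffer, so this covers mono or interleaved data):

void ProcessFile(void)
{
  int numSamples = 123456;           // placeholder frame count
  AudioBufferList bufferList;        // lives on the stack; no free() needed for it
  bufferList.mNumberBuffers = 1;     // a single buffer: mono or interleaved data
  bufferList.mBuffers[0].mNumberChannels = 1;
  bufferList.mBuffers[0].mDataByteSize = numSamples * sizeof(Float32);
  bufferList.mBuffers[0].mData = malloc(numSamples * sizeof(Float32));

  // Do stuff...

  free(bufferList.mBuffers[0].mData); // only the sample data was heap-allocated
}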
