Question
I'm trying to set up a basic example in a Swift playground (code below) but have also tried it with Objective-C with the same result.
import AVFoundation
let fileURL = ... // have tried a WAV file and an AIFF file
let myAudioFile = AVAudioFile(forReading: fileURL, error: nil)
let myAudioFormat = myAudioFile.fileFormat
let myAudioFrameCount = UInt32(myAudioFile.length)
var myAudioBuffer = AVAudioPCMBuffer(PCMFormat: myAudioFormat, frameCapacity: myAudioFrameCount)
// have also tried a smaller frameCapacity
It works fine up to this point, but the following read call then terminates with error code -50:
var myError: NSError?
myAudioFile.readIntoBuffer(myAudioBuffer, error:&myError)
I have also tried labeling the argument as buffer: myAudioBuffer, but that gives an "extraneous argument" error.
Ultimately, I want to get out raw PCM data from the buffer as a Swift array of floats.
Answer 1:
The AVAudioPCMBuffer's PCMFormat has to be set to the AVAudioFile's .processingFormat, not its .fileFormat. I thought these were the same, but that's not the case!
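For reference, here is a minimal sketch of the fix in current Swift syntax (not part of the original answer): the file path is a placeholder, and the AVAudioFile initializer and read call now throw instead of taking an NSError pointer as in the question's code. It also shows pulling the first channel out of floatChannelData as a Swift array of floats, which is what the question ultimately wanted.

import AVFoundation

// Placeholder path; point this at a real audio file.
let fileURL = URL(fileURLWithPath: "/path/to/audio.wav")

do {
    let audioFile = try AVAudioFile(forReading: fileURL)

    // Allocate the buffer with processingFormat, not fileFormat.
    let format = audioFile.processingFormat
    let frameCount = AVAudioFrameCount(audioFile.length)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else {
        fatalError("could not allocate buffer")
    }

    try audioFile.read(into: buffer)

    // Raw PCM samples of the first channel as a Swift array of floats.
    if let channelData = buffer.floatChannelData {
        let samples = Array(UnsafeBufferPointer(start: channelData[0],
                                                count: Int(buffer.frameLength)))
        print("read \(samples.count) samples")
    }
} catch {
    print("error reading audio file: \(error)")
}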
Source: https://stackoverflow.com/questions/24088470/getting-avaudiopcmbuffer-working-avaudiofile-mm-error-code-50