macOS/Swift Capture Audio with AVCaptureSession

Submitted by 江枫思渺然 on 2021-02-07 07:05:22

Question


I am currently trying to implement a simple audio recording tool on my Mac. Since I need the raw audio buffers in memory, I cannot use AVAudioRecorder, which only writes the recording to a file.

My approach is to create an AVCaptureSession with an input (the microphone) and an output (AVCaptureAudioDataOutput), and then start the session. Everything appears to work, but the delegate callback of the output is never called.

I made sure to add the microphone/camera permission entries (just in case) in the project settings.
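
For reference, microphone access can also be verified at runtime before starting the session. A minimal sketch (ensureMicrophoneAccess is just an illustrative helper name; AVCaptureDevice.authorizationStatus(for:) and requestAccess(for:) require macOS 10.14 or later):

import AVFoundation

// Sketch: check, and if necessary request, microphone access before capturing.
func ensureMicrophoneAccess(_ completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .audio) {
    case .authorized:
        completion(true)
    case .notDetermined:
        // Prompts the user; the completion handler runs on an arbitrary queue.
        AVCaptureDevice.requestAccess(for: .audio, completionHandler: completion)
    default:
        // .denied or .restricted
        completion(false)
    }
}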

Maybe someone can help me with this!

Here's my code:

import Foundation
import AVFoundation

class AudioCaptureSession: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {

    // Desired format for the captured audio (AAC, mono, 44.1 kHz).
    let settings = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVNumberOfChannelsKey: 1,
        AVSampleRateKey: 44100]
    let captureSession = AVCaptureSession()

    override init() {
        super.init()

        // Serial queue on which the data output will call the delegate.
        let queue = DispatchQueue(label: "AudioSessionQueue", attributes: [])
        // Default audio capture device, i.e. the built-in microphone.
        let captureDevice = AVCaptureDevice.default(for: AVMediaType.audio)
        var audioInput: AVCaptureDeviceInput? = nil
        var audioOutput: AVCaptureAudioDataOutput? = nil

        do {
            // Wrap the default microphone in a capture input.
            try captureDevice?.lockForConfiguration()
            audioInput = try AVCaptureDeviceInput(device: captureDevice!)
            captureDevice?.unlockForConfiguration()
            // Create the data output and have it call this class on the background queue.
            audioOutput = AVCaptureAudioDataOutput()
            audioOutput?.setSampleBufferDelegate(self, queue: queue)
            audioOutput?.audioSettings = settings
        } catch {
            print("Capture devices could not be set")
            print(error.localizedDescription)
        }

        if audioInput != nil && audioOutput != nil {
            // Wire up the input and output, then start the session.
            captureSession.beginConfiguration()
            if (captureSession.canAddInput(audioInput!)) {
                captureSession.addInput(audioInput!)
            } else {
                print("cannot add input")
            }
            if (captureSession.canAddOutput(audioOutput!)) {
                captureSession.addOutput(audioOutput!)
            } else {
                print("cannot add output")
            }
            captureSession.commitConfiguration()

            print("Starting capture session")
            captureSession.startRunning()
        }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {

        print("Audio data recieved")
    }
}
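
Eventually I want to copy the raw bytes out of each CMSampleBuffer in the callback, along the lines of the sketch below (audioData(from:) is just an illustrative helper; with the AAC settings above the buffers should contain encoded packets rather than PCM):

import CoreMedia
import Foundation

// Sketch: copy the audio bytes of a sample buffer into a Data value.
func audioData(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else {
        return nil
    }
    let length = CMBlockBufferGetDataLength(blockBuffer)
    var data = Data(count: length)
    let status = data.withUnsafeMutableBytes { buffer -> OSStatus in
        guard let base = buffer.baseAddress else {
            return OSStatus(kCMBlockBufferBadPointerParameterErr)
        }
        return CMBlockBufferCopyDataBytes(blockBuffer,
                                          atOffset: 0,
                                          dataLength: length,
                                          destination: base)
    }
    return status == kCMBlockBufferNoErr ? data : nil
}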

Answer 1:


The delegate is called for me. You don't show how you use this class, but maybe your AudioCaptureSession instance is going out of scope and being deallocated, taking the running session with it.
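
For example, keeping the instance in a long-lived property is enough. A minimal sketch, assuming the standard macOS app template (the AppDelegate is just one possible owner):

import Cocoa

@main
class AppDelegate: NSObject, NSApplicationDelegate {

    // Strong reference keeps the capture session (and its delegate) alive,
    // so the sample buffer callbacks keep firing.
    private var audioCaptureSession: AudioCaptureSession?

    func applicationDidFinishLaunching(_ aNotification: Notification) {
        audioCaptureSession = AudioCaptureSession()
    }
}

Any object that outlives the recording works; the key point is that something must retain AudioCaptureSession after init returns, otherwise ARC frees it and the session stops.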



Source: https://stackoverflow.com/questions/49598691/macos-swift-capture-audio-with-avcapturesession
