Question
I want to record video and grab frames at the same time with my code.
I am using AVCaptureVideoDataOutput to grab frames and AVCaptureMovieFileOutput for video recording. Each works on its own, but when both run at the same time I get error code -12780.
I searched for this problem but found no answer. Has anyone had the same experience, or can anyone explain it? It has really been bothering me for a while.
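For reference, a minimal Swift sketch of the conflicting setup (the names here are illustrative, not from my actual code):

import AVFoundation

let session = AVCaptureSession()
let frameOutput = AVCaptureVideoDataOutput()   // for grabbing frames
let movieOutput = AVCaptureMovieFileOutput()   // for recording video

// ... add the camera input, then attach both outputs:
if session.canAddOutput(frameOutput) { session.addOutput(frameOutput) }
if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }
session.startRunning()
// each output works when attached alone; with both attached,
// recording fails with -12780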
Thanks.
Answer 1:
I can't answer the specific question put, but I've been successfully recording video and grabbing frames at the same time using:
- AVCaptureSession and AVCaptureVideoDataOutput to route frames into my own code
- AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor to write frames out to an H.264 encoded movie file
That's without investigating audio. I end up getting CMSampleBuffers from the capture session and then pushing them into the pixel buffer adaptor. 
EDIT: so my code looks more or less like the following, with the bits you're having no problems with skimmed over, and ignoring issues of scope:
/* to ensure I'm given incoming CMSampleBuffers */
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPreset640x480; // your preferred preset
AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[captureSession addInput:[AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil]];
/* output in the 32BGRA pixel format, with me as the delegate and a suitable
   dispatch queue affixed */
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                   forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setSampleBufferDelegate:self queue:dispatch_queue_create("video frames", NULL)];
[captureSession addOutput:output];
/* to prepare for output; I'll output 640x480 in H.264, via an asset writer */
NSDictionary *outputSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithInt:640], AVVideoWidthKey,
            [NSNumber numberWithInt:480], AVVideoHeightKey,
            AVVideoCodecH264, AVVideoCodecKey,
            nil];
AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput 
                                   assetWriterInputWithMediaType:AVMediaTypeVideo
                                                  outputSettings:outputSettings];
/* I'm going to push pixel buffers to it, so I'll need an
   AVAssetWriterInputPixelBufferAdaptor, set to expect the same 32BGRA input
   as I've asked the AVCaptureVideoDataOutput to supply */
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
           [[AVAssetWriterInputPixelBufferAdaptor alloc] 
                initWithAssetWriterInput:assetWriterInput 
                sourcePixelBufferAttributes:
                     [NSDictionary dictionaryWithObjectsAndKeys:
                          [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], 
                           kCVPixelBufferPixelFormatTypeKey,
                     nil]];
/* that's going to go somewhere, I imagine you've got the URL for that sorted,
   so create a suitable asset writer; we'll put our H.264 within the normal
   MPEG4 container */
NSError *error = nil;
AVAssetWriter *assetWriter = [[AVAssetWriter alloc]
                                initWithURL:URLFromSomwhere
                                   fileType:AVFileTypeMPEG4
                                      error:&error];
/* you need to check error conditions here; this example is too lazy to */
[assetWriter addInput:assetWriterInput];
/* we need to warn the input to expect real time data incoming, so that it tries
   to avoid being unavailable at inopportune moments */
assetWriterInput.expectsMediaDataInRealTime = YES;
... eventually ...
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
[captureSession startRunning];
... elsewhere ...
- (void)        captureOutput:(AVCaptureOutput *)captureOutput 
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
           fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // a very dense way to keep track of the time at which this frame
    // occurs relative to the output stream, but it's just an example!
    static int64_t frameNumber = 0;
    if(assetWriterInput.readyForMoreMediaData)
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:CMTimeMake(frameNumber, 25)];
    frameNumber++;
}
... and, to stop, ensuring the output file is finished properly ...
[captureSession stopRunning];
[assetWriter finishWriting];
Answer 2:
This is a Swift version of Tommy's answer.
// Set up the Capture Session
// Add the Inputs
// Add the Outputs

let outputSettings: [String: Any] = [
    AVVideoWidthKey: 640,
    AVVideoHeightKey: 480,
    AVVideoCodecKey: AVVideoCodecType.h264
]
let assetWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)
let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: assetWriterInput,
    sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)])
let assetWriter = try AVAssetWriter(url: URLFromSomwhere, fileType: .mp4) // handle the thrown error properly
assetWriter.add(assetWriterInput)
assetWriterInput.expectsMediaDataInRealTime = true
assetWriter.startWriting()
assetWriter.startSession(atSourceTime: .zero)
captureSession.startRunning()

// a very dense way to keep track of the time at which this frame
// occurs relative to the output stream, but it's just an example!
// (a property, not a local, so it survives between delegate callbacks)
var frameNumber: Int64 = 0

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    if assetWriterInput.isReadyForMoreMediaData {
        pixelBufferAdaptor.append(imageBuffer, withPresentationTime: CMTimeMake(value: frameNumber, timescale: 25))
    }
    frameNumber += 1
}

captureSession.stopRunning()
assetWriter.finishWriting { /* the output file is now complete */ }
I don't guarantee 100% accuracy though, because I'm new to Swift.
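As an aside: instead of stamping frames with a fixed 25 fps counter as both versions above do, you can reuse each sample buffer's own presentation timestamp. A sketch, assuming the writer session is started at the first buffer's time rather than zero:

// inside captureOutput(_:didOutput:from:), in place of the frame counter
let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
if assetWriterInput.isReadyForMoreMediaData {
    pixelBufferAdaptor.append(imageBuffer, withPresentationTime: presentationTime)
}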
Source: https://stackoverflow.com/questions/4944083/can-use-avcapturevideodataoutput-and-avcapturemoviefileoutput-at-the-same-time