AVCapture capturing and getting framebuffer at 60 fps in iOS 7

無奈伤痛  2020-12-23 02:43

I'm developing an app which requires capturing the framebuffer at as many fps as possible. I've already figured out how to force the iPhone to capture at 60 fps but



        
3 Answers
  •  盖世英雄少女心
    2020-12-23 03:10

    I am getting samples at 60 fps on the iPhone 5 and 120 fps on the iPhone 5s, both when doing real-time motion detection in captureOutput and when saving the frames to a video using AVAssetWriter.

    You have to set the AVCaptureDevice to a format that supports 60 fps:

    AVsession = [[AVCaptureSession alloc] init];
    
    NSError *error = nil;
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *capInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (capInput) [AVsession addInput:capInput];
    
    // Find a device format whose maximum frame rate is at least 60 fps
    for(AVCaptureDeviceFormat *vFormat in [videoDevice formats] )
    {
        CMFormatDescriptionRef description= vFormat.formatDescription;
        float maxrate=((AVFrameRateRange*)[vFormat.videoSupportedFrameRateRanges objectAtIndex:0]).maxFrameRate;
    
        if(maxrate>59 && CMFormatDescriptionGetMediaSubType(description)==kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
        {
            if ( YES == [videoDevice lockForConfiguration:NULL] ) 
            {
               videoDevice.activeFormat = vFormat;
               // CMTimeMake(10, 600) = 1/60 s, i.e. 60 fps
               [videoDevice setActiveVideoMinFrameDuration:CMTimeMake(10,600)];
               [videoDevice setActiveVideoMaxFrameDuration:CMTimeMake(10,600)];
               [videoDevice unlockForConfiguration];
               NSLog(@"formats  %@ %@ %@",vFormat.mediaType,vFormat.formatDescription,vFormat.videoSupportedFrameRateRanges);
            }
         }
    }
    
    prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: AVsession];
    prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer: prevLayer];
    
    AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
    dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
    [videoOut setSampleBufferDelegate:self queue:videoQueue];
    
    videoOut.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
    videoOut.alwaysDiscardsLateVideoFrames=YES;
    
    if (videoOut)
    {
        [AVsession addOutput:videoOut];
        videoConnection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
    }

    // Start the session so frames begin arriving in the delegate callback
    [AVsession startRunning];
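
    With the sample buffer delegate set above, each captured frame is delivered to captureOutput:didOutputSampleBuffer:fromConnection:. A minimal sketch of that callback (the processing body is just a placeholder for your own per-frame work):

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Pull out the BGRA pixel buffer for this frame
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
        // ... run motion detection / inspect the pixels here ...
        CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    }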
    

    Two other comments if you want to write to a file using AVAssetWriter. First, don't use the pixel buffer adaptor; just append the samples with

    [videoWriterInput appendSampleBuffer:sampleBuffer]
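
    In context, that append happens inside the same captureOutput: callback, after the writer session has been started. A rough sketch, assuming videoWriterInput is your AVAssetWriterInput and isRecording is a flag you manage (both names are illustrative):

    if (isRecording && videoWriterInput.readyForMoreMediaData)
    {
        // sampleBuffer is the one handed to captureOutput:didOutputSampleBuffer:fromConnection:
        [videoWriterInput appendSampleBuffer:sampleBuffer];
    }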
    

    Secondly, when setting up the asset writer, use

    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings 
                                     sourceFormatHint:formatDescription];
    

    The sourceFormatHint makes a difference in writing speed.
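
    For completeness, here is one way the writer setup could look; outputURL and the width/height in videoSettings are placeholders, and formatDescription would typically be grabbed from an earlier sample buffer with CMSampleBufferGetFormatDescription:

    NSError *error = nil;
    AVAssetWriter *videoWriter = [AVAssetWriter assetWriterWithURL:outputURL
                                                          fileType:AVFileTypeQuickTimeMovie
                                                             error:&error];

    NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                     AVVideoWidthKey  : @1280,
                                     AVVideoHeightKey : @720 };

    // sourceFormatHint tells the writer up front what format the incoming buffers have
    AVAssetWriterInput *videoWriterInput =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:videoSettings
                                         sourceFormatHint:formatDescription];
    videoWriterInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:videoWriterInput];

    Remember to call startWriting and startSessionAtSourceTime: (with the first frame's timestamp) before appending any sample buffers.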
