Proper way to optimize my AVCaptureSession?


Question


I got my AVCaptureSession to work, and it duplicates the Camera.app UI almost perfectly; however, after a few seconds the application crashes and I just cannot find what I'm doing wrong. I really hope someone knows how to optimize this!

I AM using ARC; and again, the whole session runs fine but crashes after a little while. The sample-buffer delegate method gets called what seems like EVERY second. If there's a way to call that method only when the user presses the "take picture" button, how can I do that while still maintaining the "live" preview layer?

Thanks in advance!

Setting up the session

NSError *error = nil;
session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetMedium;
device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
[session addInput:input];

output = [[AVCaptureVideoDataOutput alloc] init];
[session addOutput:output];

dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
[output setSampleBufferDelegate:self queue:queue];
// ARC does not manage dispatch objects on iOS < 6, so this release is needed.
dispatch_release(queue);

output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
// NOTE: `version` is not defined in this snippet; it is assumed to hold the
// iOS version, e.g. [[[UIDevice currentDevice] systemVersion] floatValue].
if (version >= 4.0 && version < 5.0) {
    output.minFrameDuration = CMTimeMake(1, 15);
}
output.alwaysDiscardsLateVideoFrames = YES;

previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:previewLayer];
[self.view addSubview:camera_overlay];
[session startRunning];

The AVCaptureVideoDataOutput sample-buffer delegate method that is being called:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
    return capture_image;
}

Method that gets the UIImage from sample buffer

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace);
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);

    return image;

}

Answer 1:


Take a look at the AVCam Demo app from Apple for a complete example.

The method

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
fromConnection:(AVCaptureConnection *)connection {

is called every time a camera frame is ready; in your case it should be called 15 times a second, since you capped the frame rate with output.minFrameDuration = CMTimeMake(1, 15);

From the code you provided, the only reason I can think of is that the UIImage *capture_image you create for every frame is never released.
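If the build-up really does come from those per-frame objects, one common mitigation is to drain an @autoreleasepool on every frame, since the callback runs on a private dispatch queue. A minimal sketch, assuming the per-frame objects are autoreleased (this is not from the original answer):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // The delegate runs on a private dispatch queue, so autoreleased
    // objects can pile up between pool drains; flush them per frame.
    @autoreleasepool {
        UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
        // ... use capture_image here ...
    }
}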

You can use Xcode Instruments to profile your application and see why that happens: Instruments Guide

The Leaks tool is your first stop in your case. There are many tutorials on the web for it; here is one: Tracking iPhone Memory Leaks, which was written by SO user OwenGross and, if I'm not mistaken, taken from here.

Answer 2:


This post looks pretty old, but in case anybody sees this:

Whom are you returning the picture to in this delegate method?

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *capture_image = [self imageFromSampleBuffer:sampleBuffer];
    return capture_image;
}


You can use a button that raises a flag; in the delegate method, check whether the flag is raised and only then create the image. The image should be stored in an instance variable, otherwise it will get lost in any case. A minimal sketch of that approach follows (the flag, ivar, and button-action names are hypothetical, not from the original post):
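// Hypothetical ivars, e.g. declared in a class extension:
//   BOOL _shouldCapture;        // raised by the "take picture" button
//   UIImage *_capturedImage;    // keeps the image alive after the callback

- (IBAction)takePicturePressed:(id)sender {
    _shouldCapture = YES;   // the next frame callback will grab an image
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (!_shouldCapture) return;   // skip the ~15 preview frames per second
    _shouldCapture = NO;

    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // The delegate runs on the capture queue; hop to the main thread
    // before touching state that the UI reads.
    dispatch_async(dispatch_get_main_queue(), ^{
        _capturedImage = image;
        // ... update the UI with _capturedImage ...
    });
}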

There is also a dedicated API for capturing a still image: captureStillImageAsynchronouslyFromConnection:completionHandler: on AVCaptureStillImageOutput.
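A minimal sketch of that route, assuming an AVCaptureStillImageOutput is added to the same session (everything other than the framework names is illustrative):

// One-time setup: add a still-image output to the session.
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
stillOutput.outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
[session addOutput:stillOutput];

// When the user taps "take picture":
AVCaptureConnection *connection =
    [stillOutput connectionWithMediaType:AVMediaTypeVideo];
[stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                          completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (!imageDataSampleBuffer) return;
    NSData *jpegData = [AVCaptureStillImageOutput
        jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:jpegData];
    // ... hand `image` to the UI on the main thread ...
}];

This way the expensive per-frame conversion never runs at all; the video data output is only needed if you also want to process live frames.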

Source: https://stackoverflow.com/questions/8237471/proper-way-to-optimize-my-avcapturesession
