How can I capture an image from iOS camera without user interaction?

Front-end · unresolved · 1 answer · 367 views
Asked by 上瘾入骨i on 2020-12-17 07:10

I need to capture a still image from the front-facing camera and store it in the Documents directory. I found bits and pieces of code in other posts, but wanted a complete working example in one place.

1 Answer
  • 2020-12-17 07:39

    Define a UIImage property, keep a strong reference to the capture session (a local session would be deallocated before it can deliver a frame), and make sure your class implements the AVCaptureVideoDataOutputSampleBufferDelegate protocol:

    #import <AVFoundation/AVFoundation.h>
    
    @interface ViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
    
    @property (nonatomic, strong) AVCaptureSession *session;
    @property (nonatomic, strong) UIImage *theImage;
    
    @end
    

    In viewDidLoad or somewhere appropriate, add this:

        [self captureImage];
    

    Implement the following methods:

    - (void)captureImage
    {
        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetHigh;
        // Keep the session alive beyond this method.
        self.session = session;
    
        AVCaptureDevice *device = [self frontCamera];
        if (!device) {
            NSLog(@"No front-facing camera available.");
            return;
        }
    
        NSError *error = nil;
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (!input) {
            NSLog(@"Could not create camera input: %@", error);
            return;
        }
        [session addInput:input];
    
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.videoSettings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
        [session addOutput:output];
    
        dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
        [output setSampleBufferDelegate:self queue:queue];
    
        // Do not call stopRunning right after startRunning: the session must
        // keep running until the delegate has received a frame. It is stopped
        // in the delegate callback below.
        [session startRunning];
    }
    
    - (AVCaptureDevice *)frontCamera {
        // Note: devicesWithMediaType: is deprecated as of iOS 10; see
        // AVCaptureDeviceDiscoverySession for the modern replacement.
        NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        for (AVCaptureDevice *device in devices) {
            if ([device position] == AVCaptureDevicePositionFront) {
                return device;
            }
        }
        return nil;
    }
    
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
    
        // One frame is enough; stop the session so this callback is not
        // invoked (and the file rewritten) for every subsequent frame.
        [self.session stopRunning];
    
        CGImageRef cgImage = [self imageFromSampleBuffer:sampleBuffer];
        self.theImage = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
    
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsPath = [paths objectAtIndex:0];
        NSString *filePath = [documentsPath stringByAppendingPathComponent:@"image.png"];
        NSData *data = UIImagePNGRepresentation(self.theImage);
        [data writeToFile:filePath atomically:YES];
    }
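
    One caveat before any of this will produce frames: the app needs camera permission. Since iOS 10 the app must declare an NSCameraUsageDescription string in Info.plist (it crashes on first camera access without it), and access should be requested before starting the session. A minimal sketch using the standard AVFoundation API:

    // Request camera access, then start capturing once granted.
    // Requires NSCameraUsageDescription in Info.plist on iOS 10+.
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                             completionHandler:^(BOOL granted) {
        if (granted) {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self captureImage];
            });
        } else {
            NSLog(@"Camera access denied.");
        }
    }];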
    
    - (CGImageRef)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
    {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        // The buffer is 32BGRA (set in captureImage), which is non-planar,
        // so use CVPixelBufferGetBaseAddress rather than the planar variant.
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // The caller owns the returned CGImageRef and must CGImageRelease it.
        CGImageRef newImage = CGBitmapContextCreateImage(newContext);
        CGContextRelease(newContext);
    
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    
        return newImage;
    }
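
    As a side note: if you target iOS 10 or later, the deprecated devicesWithMediaType: lookup in frontCamera can be replaced with AVCaptureDeviceDiscoverySession. A sketch of the same behavior with the modern API:

    - (AVCaptureDevice *)frontCamera {
        // Discover the built-in wide-angle camera facing the user.
        AVCaptureDeviceDiscoverySession *discovery =
            [AVCaptureDeviceDiscoverySession
                discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                      mediaType:AVMediaTypeVideo
                                       position:AVCaptureDevicePositionFront];
        return discovery.devices.firstObject; // nil if no front camera
    }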
    