Raw image data from camera like “645 PRO”

Backend · Unresolved · 4 answers · 1452 views
庸人自扰
庸人自扰 2020-11-30 05:09

A while ago I asked this question and got a good answer:

I've been searching this forum up and down but I couldn't find what I really…

4 Answers
  •  囚心锁ツ
    2020-11-30 05:40

    Wow, that blog post was something special. A whole lot of words to just state that they get the sample buffer bytes that Apple hands you back from a still image. There's nothing particularly innovative about their approach, and I know a number of camera applications that do this.

    You can get at the raw bytes returned from a photo taken with an AVCaptureStillImageOutput using code like the following:

    [photoOutput captureStillImageAsynchronouslyFromConnection:[[photoOutput connections] objectAtIndex:0] completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (error != nil) {
            NSLog(@"Error capturing still image: %@", error);
            return;
        }
        CVImageBufferRef cameraFrame = CMSampleBufferGetImageBuffer(imageSampleBuffer);
        // Lock the pixel buffer before reading its memory
        CVPixelBufferLockBaseAddress(cameraFrame, 0);
        // The base address is a void *, so cast it to the byte type you want
        GLubyte *rawImageBytes = (GLubyte *)CVPixelBufferGetBaseAddress(cameraFrame);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(cameraFrame);
        NSData *dataForRawBytes = [NSData dataWithBytes:rawImageBytes length:bytesPerRow * CVPixelBufferGetHeight(cameraFrame)];
        // Do whatever with your bytes
    
        CVPixelBufferUnlockBaseAddress(cameraFrame, 0);
    }];
    

    This will give you an NSData instance containing the uncompressed BGRA bytes returned from the camera. You can save these to disk or do whatever you want with them. If you really need to process the bytes themselves, I'd avoid the overhead of the NSData creation and just work with the byte array from the pixel buffer.
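    One caveat about sizing that buffer with bytesPerRow * height: Core Video may pad each row beyond width * 4 bytes for alignment, so the NSData above can contain stride padding between rows. Below is a minimal C sketch of stripping that padding into a tight BGRA buffer; the buffer layout is simulated with fixed constants, and compact_bgra is an illustrative helper, not an AVFoundation call:

    ```c
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Simulated camera frame geometry (illustrative values, not from the API):
       3 pixels * 4 bytes (BGRA) = 12 meaningful bytes per row,
       padded to a 16-byte stride as bytesPerRow often is. */
    #define WIDTH  3
    #define HEIGHT 2
    #define BYTES_PER_ROW 16

    /* Copy only the meaningful width * 4 bytes of each row,
       skipping the per-row stride padding. */
    static void compact_bgra(const uint8_t *src, size_t bytesPerRow,
                             size_t width, size_t height, uint8_t *dst) {
        for (size_t row = 0; row < height; row++) {
            memcpy(dst + row * width * 4, src + row * bytesPerRow, width * 4);
        }
    }

    int main(void) {
        uint8_t padded[HEIGHT * BYTES_PER_ROW];
        memset(padded, 0xAA, sizeof(padded));   /* mark padding bytes */
        /* Fill the meaningful pixels with recognizable values */
        for (size_t row = 0; row < HEIGHT; row++)
            for (size_t i = 0; i < WIDTH * 4; i++)
                padded[row * BYTES_PER_ROW + i] = (uint8_t)(row * 100 + i);

        uint8_t tight[HEIGHT * WIDTH * 4];
        compact_bgra(padded, BYTES_PER_ROW, WIDTH, HEIGHT, tight);

        /* First byte of each compacted row: 0 from row 0, 100 from row 1 */
        printf("%u %u\n", tight[0], tight[WIDTH * 4]);   /* prints "0 100" */
        return 0;
    }
    ```

    If you only ever hand the bytes straight back to another API that accepts a bytesPerRow parameter, you can skip this copy and pass the stride through instead.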
