convert CMSampleBufferRef to UIImage

Happy的楠姐 2020-12-18 11:47

I'm capturing video with AVCaptureSession, but I would like to convert the captured frames to a UIImage.

I found some code on the Internet:

- (UIImage *         


        
2 Answers
  • 2020-12-18 12:47

    You can try this code:

    -(UIImage *) screenshotOfVideoStream:(CMSampleBufferRef)samImageBuff
    {
        // Get the pixel buffer that backs this sample buffer
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(samImageBuff);

        // Wrap it in a CIImage and render the full frame to a CGImage
        CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
        CIContext *temporaryContext = [CIContext contextWithOptions:nil];
        CGImageRef videoImage = [temporaryContext
                                 createCGImage:ciImage
                                 fromRect:CGRectMake(0, 0,
                                                     CVPixelBufferGetWidth(imageBuffer),
                                                     CVPixelBufferGetHeight(imageBuffer))];

        UIImage *image = [[UIImage alloc] initWithCGImage:videoImage];
        CGImageRelease(videoImage);
        return image;
    }
    

    It works for me.
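
    For context, here is a minimal sketch of how such a helper might be called from the AVCaptureVideoDataOutputSampleBufferDelegate callback; the previewImageView property is an assumed UIImageView outlet, not part of the original answer:

    // Called on the capture queue for every frame; hop to the main
    // queue before touching UIKit. previewImageView is hypothetical.
    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        UIImage *frame = [self screenshotOfVideoStream:sampleBuffer];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.previewImageView.image = frame;
        });
    }

    One caveat: [CIContext contextWithOptions:nil] is expensive to create, so caching the context in an ivar rather than building one per frame should noticeably improve frame rates.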

  • 2020-12-18 12:50

    Your best bet will be to set the capture video data output's videoSettings to a dictionary that specifies the pixel format you want, which you'll need to set to some variation on RGB that CGBitmapContext can handle.

    The documentation has a list of all of the pixel formats that Core Video can process. Only a tiny subset of those are supported by CGBitmapContext. The format that the code you found on the internet is expecting is kCVPixelFormatType_32BGRA, but that might have been written for Macs—on iOS devices, kCVPixelFormatType_32ARGB (big-endian) might be faster. Try them both, on the device, and compare frame rates.
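
    For illustration, a minimal sketch of that configuration, assuming an already configured AVCaptureSession named session:

    // Ask AVFoundation to deliver frames in BGRA, a format
    // that CGBitmapContext can draw directly
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoOutput.videoSettings = @{
        (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)
    };

    // Deliver sample buffers to self on a dedicated serial queue
    dispatch_queue_t queue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
    [videoOutput setSampleBufferDelegate:self queue:queue];
    if ([session canAddOutput:videoOutput]) {
        [session addOutput:videoOutput];
    }

    To compare the two formats as suggested above, swap kCVPixelFormatType_32BGRA for kCVPixelFormatType_32ARGB and measure the frame rate on the device.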
