How to convert a CVImageBufferRef to UIImage

别跟我提以往 asked on 2020-11-28 21:52

I am trying to capture video from a camera. I have gotten the captureOutput:didOutputSampleBuffer: callback to trigger, and it gives me a sample buffer that I th…

4 Answers
  •  死守一世寂寞
    2020-11-28 22:25

    The way you are passing on the baseAddress presumes that the image data is laid out in the form

    ACCC

    (where A is alpha and each C is a color component: R, G, or B).

    If you've set up your AVCaptureSession to capture video frames in the native format, you are most likely getting the video data back in planar YUV420 format (see: link text). To do what you're attempting here, the easiest approach is probably to request that the video frames be captured in kCVPixelFormatType_32RGBA. Apple recommends capturing in kCVPixelFormatType_32BGRA if you capture in a non-planar format at all; the reasoning is not stated, but I can reasonably assume it is due to performance considerations.

    Caveat: I've not done this myself, and I'm assuming that accessing the CVPixelBufferRef contents like this is a reasonable way to build the image. I can't vouch for it actually working, but I /can/ tell you that the way you are doing things right now will reliably fail, because of the pixel format you are (probably) capturing the video frames in.
