Convert a CMSampleBuffer into a UIImage


Question


Here's a function (code from Apple documentation) that converts a CMSampleBuffer into a UIImage

func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    var imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0)

    // Get the base address of the pixel buffer
    var baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)

    // Get the number of bytes per row for the pixel buffer
    var bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    // Get the pixel buffer width and height
    var width = CVPixelBufferGetWidth(imageBuffer)
    var height = CVPixelBufferGetHeight(imageBuffer)

    // Create a device-dependent RGB color space
    var colorSpace = CGColorSpaceCreateDeviceRGB()

    // Create a bitmap graphics context with the sample buffer data
    let bitmapInfo = CGBitmapInfo(CGImageAlphaInfo.NoneSkipLast.rawValue)
    var context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, bitmapInfo)
    // Create a Quartz image from the pixel data in the bitmap graphics context
    var quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    // Create an image object from the Quartz image
    var image = UIImage(CGImage: quartzImage)!

    return image
}

When I try to visualize the UIImage using a UIImageView, I get nothing.
Any ideas?


Answer 1:


This is a solution for Swift 3.0: CMSampleBuffer is extended with a computed property that gives you an optional UIImage.

import AVFoundation
import UIKit

extension CMSampleBuffer {
    var uiImage: UIImage? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(self) else { return nil }

        CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
        // Unlock the base address even if context or image creation fails below
        defer { CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0)) }
        let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
        let width = CVPixelBufferGetWidth(imageBuffer)
        let height = CVPixelBufferGetHeight(imageBuffer)
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.noneSkipFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)
        guard let context = CGContext(data: baseAddress,
                                      width: width,
                                      height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: bytesPerRow,
                                      space: colorSpace,
                                      bitmapInfo: bitmapInfo.rawValue) else { return nil }
        guard let cgImage = context.makeImage() else { return nil }

        return UIImage(cgImage: cgImage)
    }
}
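
For completeness, here is a minimal usage sketch of that extension from a capture delegate. The CameraViewController class, the imageView outlet, and the queue handling are assumptions for illustration (and the delegate method uses the current Swift signature rather than the Swift 3.0 one), not part of the original answer.

import AVFoundation
import UIKit

class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    @IBOutlet weak var imageView: UIImageView!

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // uiImage is the computed property added by the extension above
        guard let image = sampleBuffer.uiImage else { return }
        // The delegate runs on a background queue, so hand off to the main thread for UIKit
        DispatchQueue.main.async {
            self.imageView.image = image
        }
    }
}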



Answer 2:


I just finished the exact same function in my current project, and here's how I got it to work (with a lot of googling and some trial and error):

let bitmapInfo = CGBitmapInfo(CGImageAlphaInfo.NoneSkipFirst.rawValue | CGBitmapInfo.ByteOrder32Little.rawValue)

Also, make sure you update the UIImageView on the main thread (you are probably on the camera session's queue when you receive the CMSampleBuffer), because UIKit must only be used from the main thread. Otherwise, the image can take a very long time to appear.
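
As a minimal sketch of that last point (modern GCD syntax; the imageView and image names are assumptions):

// Inside the sample buffer delegate callback, once the UIImage has been built:
DispatchQueue.main.async {
    // UIKit may only be touched from the main thread
    imageView.image = image
}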




Answer 3:


@Zigglzworth You need to set kCVPixelFormatType_32BGRA on the AVCaptureVideoDataOutput:

let videoOutput = AVCaptureVideoDataOutput()
videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
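
For context, here is a sketch of where that snippet fits when configuring a capture session; the session, queue label, and the assumption that self is the sample buffer delegate are illustrative additions, not from the original answer.

let session = AVCaptureSession()

let videoOutput = AVCaptureVideoDataOutput()
// Ask for BGRA frames so the device-RGB bitmap context in the conversion code produces a valid image
videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frame.queue"))

if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}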


Source: https://stackoverflow.com/questions/27962944/convert-a-cmsamplebuffer-into-a-uiimage
