How to get the Y component from a CMSampleBuffer produced by an AVCaptureSession?

隐瞒了意图╮ 2020-12-13 11:18

Hey there, I am trying to access the raw data from the iPhone camera using AVCaptureSession. I followed the guide provided by Apple (link here).

The raw data from the sample buffer…
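For context, the answer below assumes the capture output delivers biplanar YCbCr frames. A minimal sketch of that configuration (names like `session` and `videoQueue` are illustrative, not from the original post):

```swift
import AVFoundation

// Sketch: ask the video data output for a biplanar YCbCr pixel format,
// so each sample buffer carries a separate full-resolution luma (Y) plane.
let videoOutput = AVCaptureVideoDataOutput()
videoOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String:
        kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
]
// `self` must conform to AVCaptureVideoDataOutputSampleBufferDelegate;
// `videoQueue` is a serial DispatchQueue created for frame callbacks.
videoOutput.setSampleBufferDelegate(self, queue: videoQueue)
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}
```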

4 Answers
  •  庸人自扰
    2020-12-13 12:15

    This is simply the culmination of everyone else's hard work, above and on other threads, converted to Swift 3 for anyone who finds it useful.

    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags.readOnly)
    
            let pixelFormatType = CVPixelBufferGetPixelFormatType(pixelBuffer)
            if pixelFormatType == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
               || pixelFormatType == kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange {
    
                let bufferHeight = CVPixelBufferGetHeight(pixelBuffer)
                let bufferWidth = CVPixelBufferGetWidth(pixelBuffer)
    
                // Plane 0 of a biplanar 420 buffer is the full-resolution luma (Y) plane.
                let lumaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
                let size = bufferHeight * lumaBytesPerRow
    
                if let lumaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0) {
                    let releaseDataCallback: CGDataProviderReleaseDataCallback = { (info: UnsafeMutableRawPointer?, data: UnsafeRawPointer, size: Int) -> () in
                        // https://developer.apple.com/reference/coregraphics/cgdataproviderreleasedatacallback
                        // N.B. 'CGDataProviderRelease' is unavailable: Core Foundation objects are automatically memory managed
                        return
                    }
    
                    if let dataProvider = CGDataProvider(dataInfo: nil, data: lumaBaseAddress, size: size, releaseData: releaseDataCallback) {
                        let colorSpace = CGColorSpaceCreateDeviceGray()
                        // 8 bits per pixel, one grayscale component, no alpha.
                        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.none.rawValue)
    
                        if let cgImage = CGImage(width: bufferWidth, height: bufferHeight, bitsPerComponent: 8, bitsPerPixel: 8, bytesPerRow: lumaBytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo, provider: dataProvider, decode: nil, shouldInterpolate: false, intent: CGColorRenderingIntent.defaultIntent) {
                            let greyscaleImage = UIImage(cgImage: cgImage)
                            // do what you want with the greyscale image.
                        }
                    }
                }
            }
    
            CVPixelBufferUnlockBaseAddress(pixelBuffer, CVPixelBufferLockFlags.readOnly)
        }
    }
    
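If you need the raw Y values themselves rather than a grayscale image, keep in mind that each row of the luma plane may be padded, so you must index with the plane's bytes-per-row, not the pixel width. A hedged sketch, assuming the pixel buffer is still locked and the names (`lumaBaseAddress`, `lumaBytesPerRow`, etc.) from the handler above are in scope:

```swift
// Sketch: read individual luma values from plane 0.
// Rows can be padded, so step by lumaBytesPerRow, not bufferWidth.
let luma = lumaBaseAddress.assumingMemoryBound(to: UInt8.self)
for y in 0..<bufferHeight {
    for x in 0..<bufferWidth {
        let yValue = luma[y * lumaBytesPerRow + x]  // 0...255 luminance
        // process yValue...
    }
}
```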
