kCVPixelFormatType_420YpCbCr8BiPlanarFullRange frame to UIImage conversion

猫巷女王i  2020-12-01 05:39

I have an app that captures live video in the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange format in order to process the Y channel. According to Apple's documentation:

    Bi-Planar Component Y'CbCr 8-bit 4:2:0, full-range (luma=[0,255] chroma=[1,255]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct.

How can I convert a frame in this format to a UIImage?

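For context, requesting this pixel format from an AVCaptureVideoDataOutput typically looks like the sketch below (a minimal outline; the captureSession variable and queue name are illustrative, not from the original question):

    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    // Ask for bi-planar full-range 4:2:0: plane 0 is Y, plane 1 is interleaved CbCr.
    videoOutput.videoSettings = @{
        (id)kCVPixelBufferPixelFormatTypeKey :
            @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)
    };
    [videoOutput setSampleBufferDelegate:self
                                   queue:dispatch_queue_create("video.frames", DISPATCH_QUEUE_SERIAL)];
    if ([captureSession canAddOutput:videoOutput]) {
        [captureSession addOutput:videoOutput];
    }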
3 Answers
  •  醉梦人生
    2020-12-01 06:33

    Most implementations I found (including the previous answer here) won't work if you change the videoOrientation on the AVCaptureConnection; for a reason I don't fully understand, the CVPlanarPixelBufferInfo_YCbCrBiPlanar struct will be empty in that case. So I wrote one that does work (most of the code was based on this answer). My implementation also adds an empty alpha channel to the RGB buffer and creates the CGBitmapContext with the kCGImageAlphaNoneSkipLast flag (there is no alpha data, but iOS seems to require 4 bytes per pixel). Here it is:

    #import <UIKit/UIKit.h>
    #import <CoreMedia/CoreMedia.h>
    #import <CoreVideo/CoreVideo.h>
    #import <math.h>
    
    // Clamp a value to the valid 0-255 byte range.
    #define clamp(a) ((a) > 255 ? 255 : ((a) < 0 ? 0 : (a)))
    
    - (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
    
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        // Plane 0 holds the Y (luma) samples; plane 1 holds interleaved CbCr pairs.
        uint8_t *yBuffer = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
        size_t yPitch = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);
        uint8_t *cbCrBuffer = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 1);
        size_t cbCrPitch = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 1);
    
        int bytesPerPixel = 4;
        uint8_t *rgbBuffer = malloc(width * height * bytesPerPixel);
    
        for (int y = 0; y < height; y++) {
            uint8_t *rgbBufferLine = &rgbBuffer[y * width * bytesPerPixel];
            uint8_t *yBufferLine = &yBuffer[y * yPitch];
            // 4:2:0 chroma is subsampled vertically: one CbCr row covers two luma rows.
            uint8_t *cbCrBufferLine = &cbCrBuffer[(y >> 1) * cbCrPitch];
    
            for (int x = 0; x < width; x++) {
                // Renamed from `y` to avoid shadowing the row index above.
                int16_t luma = yBufferLine[x];
                // Each CbCr pair covers two horizontally adjacent luma samples:
                // Cb sits at the even index, Cr at the odd one.
                int16_t cb = cbCrBufferLine[x & ~1] - 128;
                int16_t cr = cbCrBufferLine[x | 1] - 128;
    
                uint8_t *rgbOutput = &rgbBufferLine[x * bytesPerPixel];
    
                // Approximate BT.601 full-range YCbCr -> RGB conversion.
                int16_t r = (int16_t)roundf(luma + cr *  1.4);
                int16_t g = (int16_t)roundf(luma + cb * -0.343 + cr * -0.711);
                int16_t b = (int16_t)roundf(luma + cb *  1.765);
    
                // With kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipLast the
                // in-memory byte order per pixel is X, B, G, R.
                rgbOutput[0] = 0xff;
                rgbOutput[1] = clamp(b);
                rgbOutput[2] = clamp(g);
                rgbOutput[3] = clamp(r);
            }
        }
    
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(rgbBuffer, width, height, 8,
                width * bytesPerPixel, colorSpace,
                kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipLast);
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        UIImage *image = [UIImage imageWithCGImage:quartzImage];
    
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CGImageRelease(quartzImage);
        free(rgbBuffer);
    
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    
        return image;
    }
    
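    For completeness, a typical call site is the standard AVCaptureVideoDataOutputSampleBufferDelegate callback, as in this sketch (self.imageView is a hypothetical property, not part of the answer's code):

    - (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
        // UIKit must be used on the main thread; the delegate queue is a background queue.
        dispatch_async(dispatch_get_main_queue(), ^{
            self.imageView.image = image;
        });
    }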
