I am currently attempting to draw an image in OpenGL using the YUV420 (bi-planar) format. I receive raw data and am attempting to parse it into a CVPixelBuffer, and then pass
Use CVPixelBufferCreate if you are going to use the CVPixelBufferRef with OpenGL. It creates an IOSurface for you, unlike the CVPixelBufferCreateWithBytes/CVPixelBufferCreateWithPlanarBytes alternatives. The downside is that you can't reuse your existing buffers; you'll have to copy the data from your existing buffers into the newly allocated ones.
// set pixel buffer attributes so we get an IOSurface
NSDictionary *pixelBufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSDictionary dictionary], (id)kCVPixelBufferIOSurfacePropertiesKey,
    nil];

// create the planar pixel buffer
CVPixelBufferRef pixelBuffer = nil;
CVPixelBufferCreate(kCFAllocatorDefault, bufferYUV.width, bufferYUV.height, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, (CFDictionaryRef)pixelBufferAttributes, &pixelBuffer);

// lock the pixel buffer
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// get image details
size_t width = CVPixelBufferGetWidth(pixelBuffer);
size_t height = CVPixelBufferGetHeight(pixelBuffer);

// get plane addresses
unsigned char *baseAddressY  = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
unsigned char *baseAddressUV = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);

//TODO: copy your data buffers to the newly allocated memory locations

// unlock the pixel buffer address
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

// initialize buffers if not already initialized (see the GLCameraRipple example)
if (!_buffersInitialized)
{
    [self initializeBuffersWithTextureWidth:width textureHeight:height];
}

// always clean up the previous textures
CVReturn err;
[self cleanUpTextures];

// Y-plane
glActiveTexture(GL_TEXTURE0);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_RED_EXT, width, height, GL_RED_EXT, GL_UNSIGNED_BYTE, 0, &_lumaTexture);
if (err)
{
    NSLog(@"Could not create Y texture from image. %d", err);
}

glBindTexture(CVOpenGLESTextureGetTarget(_lumaTexture), CVOpenGLESTextureGetName(_lumaTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

// UV-plane (half width and height; Cb and Cr are interleaved, hence GL_RG_EXT)
glActiveTexture(GL_TEXTURE1);
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _videoTextureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_RG_EXT, width / 2, height / 2, GL_RG_EXT, GL_UNSIGNED_BYTE, 1, &_chromaTexture);
if (err)
{
    NSLog(@"Could not create UV texture from image. %d", err);
}

glBindTexture(CVOpenGLESTextureGetTarget(_chromaTexture), CVOpenGLESTextureGetName(_chromaTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
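The TODO above (copying your source data into the newly allocated planes) has one common pitfall: CVPixelBuffer planes are often row-padded, so a single memcpy per plane is wrong and you should copy row by row using the destination's bytes-per-row. A minimal sketch in plain C, assuming a tightly packed NV12 source; the function names are hypothetical, and in real code the destination strides would come from CVPixelBufferGetBytesPerRowOfPlane:

```c
// Hypothetical sketch of the plane-copy step. Assumes the source is tightly
// packed NV12 (bi-planar YUV420); the destination strides stand in for the
// values CVPixelBufferGetBytesPerRowOfPlane would return.
#include <stddef.h>
#include <string.h>

// Copy one plane row by row so destination row padding is respected.
static void copy_plane(unsigned char *dst, size_t dstBytesPerRow,
                       const unsigned char *src, size_t srcBytesPerRow,
                       size_t widthBytes, size_t rows)
{
    for (size_t r = 0; r < rows; r++) {
        memcpy(dst + r * dstBytesPerRow, src + r * srcBytesPerRow, widthBytes);
    }
}

// For a width x height NV12 image:
//  - Y plane:  width bytes per row, height rows
//  - UV plane: width bytes per row (interleaved Cb/Cr), height/2 rows
static void copy_nv12(unsigned char *dstY, size_t dstYStride,
                      unsigned char *dstUV, size_t dstUVStride,
                      const unsigned char *srcY, const unsigned char *srcUV,
                      size_t width, size_t height)
{
    copy_plane(dstY, dstYStride, srcY, width, width, height);
    copy_plane(dstUV, dstUVStride, srcUV, width, width, height / 2);
}
```

In the answer's code, dstY/dstUV correspond to baseAddressY/baseAddressUV, and the copy must happen between the lock and unlock calls.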
The IOSurface property is NULL in the CVPixelBuffer you've created.

Created manually:
<CVPixelBuffer 0x1fd52790 width=1280 height=720 pixelFormat=420v iosurface=0x0 planes=2>

Created by CMSampleBufferGetImageBuffer:
<CVPixelBuffer 0x1fd521e0 width=1280 height=720 pixelFormat=420f iosurface=0x21621c54 planes=2>
To my knowledge there is no solution.

I haven't tried the following approach on YUV, but it works in the RGB case:
https://developer.apple.com/library/ios/qa/qa1781/_index.html

Add __bridge before CFDictionaryRef if ARC is enabled.