Question
I'm playing around with the AVScreenShack example from Apple's website (an Xcode project), which captures the desktop and displays the capture in a window in quasi real time.
I have modified the project a little bit and inserted this delegate method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
...
}
My question is: how do I convert the CMSampleBufferRef instance to a CGImageRef?
Thank you.
Answer 1:
Here is how you can create a UIImage from a CMSampleBufferRef. This worked for me when responding to captureStillImageAsynchronouslyFromConnection:completionHandler: on AVCaptureStillImageOutput.
// imageDataSampleBuffer is the CMSampleBufferRef handed to the completion handler
CMBlockBufferRef buff = CMSampleBufferGetDataBuffer(imageDataSampleBuffer);
size_t len = CMBlockBufferGetDataLength(buff);
char *data = NULL;
// Pass NULL for lengthAtOffset; we only need the total length and the data pointer
CMBlockBufferGetDataPointer(buff, 0, NULL, &len, &data);
NSData *d = [[NSData alloc] initWithBytes:data length:len];
UIImage *img = [[UIImage alloc] initWithData:d];
The data coming out of CMBlockBufferGetDataPointer appears to be JPEG data, which is why UIImage can decode it directly.
UPDATE: To fully answer your question, you can call CGImage on img from my code to actually get a CGImageRef.
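For completeness, here is one way those steps could be wrapped into a helper that returns a CGImageRef directly. This is only a sketch: the function name CGImageFromJPEGSampleBuffer is made up for illustration, and it assumes, like the snippet above, that the sample buffer holds JPEG data from a still-image output.

#import <UIKit/UIKit.h>
#import <CoreMedia/CoreMedia.h>

// Hypothetical helper; assumes the buffer contains JPEG data
// (e.g. from AVCaptureStillImageOutput), not an uncompressed video frame.
static CGImageRef CGImageFromJPEGSampleBuffer(CMSampleBufferRef sampleBuffer)
{
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    if (blockBuffer == NULL) {
        return NULL;
    }
    size_t length = CMBlockBufferGetDataLength(blockBuffer);
    char *bytes = NULL;
    CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &length, &bytes);
    NSData *jpegData = [[NSData alloc] initWithBytes:bytes length:length];
    UIImage *image = [[UIImage alloc] initWithData:jpegData];
    if (image == nil) {
        return NULL;
    }
    // Retain so the CGImageRef outlives the UIImage; the caller must CGImageRelease it.
    return CGImageRetain(image.CGImage);
}

Note that for uncompressed video frames, such as those delivered to captureOutput:didOutputSampleBuffer:fromConnection: in the question, the sample buffer usually carries a CVPixelBufferRef (available via CMSampleBufferGetImageBuffer) rather than JPEG data, so a bitmap-based conversion would be needed instead of decoding the bytes as an image file.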
Source: https://stackoverflow.com/questions/19352026/cmsamplebufferref-to-bitmap