CMSampleBufferRef to bitmap?

Submitted by 孤人 on 2019-12-11 09:46:34

Question


I'm playing around with the AVScreenShack example from Apple's website (Xcode project) which captures the desktop and displays the capture in a window in quasi real-time.

I have modified the project a little bit and added this delegate method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    ...
}

My question is: how do I convert the CMSampleBufferRef instance to a CGImageRef?

Thank you.


Answer 1:


Here is how you can create a UIImage from a CMSampleBufferRef. This worked for me when responding to captureStillImageAsynchronouslyFromConnection:completionHandler: on AVCaptureStillImageOutput.

// imageDataSampleBuffer is the CMSampleBufferRef delivered by the capture callback
CMBlockBufferRef buff = CMSampleBufferGetDataBuffer(imageDataSampleBuffer);
size_t len = CMBlockBufferGetDataLength(buff);

// Get a pointer to the raw (JPEG-encoded) bytes inside the block buffer
char *data = NULL;
CMBlockBufferGetDataPointer(buff, 0, NULL, &len, &data);

// Wrap the bytes in NSData and let UIImage decode the JPEG
NSData *d = [[NSData alloc] initWithBytes:data length:len];
UIImage *img = [[UIImage alloc] initWithData:d];

It looks like the data coming out of CMBlockBufferGetDataPointer is JPEG data.

UPDATE: To fully answer your question, you can read the CGImage property of img (img.CGImage) from my code above to actually get a CGImageRef.
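Note that this answer assumes the sample buffer carries encoded still-image (JPEG) data, as it does with AVCaptureStillImageOutput. For the screen-capture case in the question, frames delivered to captureOutput:didOutputSampleBuffer:fromConnection: are typically backed by an uncompressed CVPixelBuffer instead. Below is a minimal sketch of that path, assuming the video data output has been configured for kCVPixelFormatType_32BGRA; the helper name CreateCGImageFromSampleBuffer and that pixel-format choice are illustrative assumptions, not part of the AVScreenShack sample:

#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>
#import <CoreGraphics/CoreGraphics.h>

// Sketch: build a CGImageRef from a video-frame sample buffer whose pixel
// format is assumed to be kCVPixelFormatType_32BGRA.
static CGImageRef CreateCGImageFromSampleBuffer(CMSampleBufferRef sampleBuffer)
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) return NULL;

    // Lock the pixel buffer so its base address is valid while we read it
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    void *base        = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width      = CVPixelBufferGetWidth(pixelBuffer);
    size_t height     = CVPixelBufferGetHeight(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // BGRA maps to little-endian 32-bit with premultiplied alpha first
    CGContextRef context = CGBitmapContextCreate(base, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef image = CGBitmapContextCreateImage(context);

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    // Follows the Create rule: the caller must CGImageRelease() the result
    return image;
}

You would call this from the delegate method shown in the question and release the returned image with CGImageRelease() when you are done with it.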



Source: https://stackoverflow.com/questions/19352026/cmsamplebufferref-to-bitmap
