How to Convert CMSampleBuffer/UIImage into ffmpeg's AVPicture?

Submitted by 自作多情 on 2019-12-03 04:36:47

Question


I'm trying to encode iPhone's camera frames into a H.264 video using ffmpeg's libav* libraries. I found in this Apple's article how to convert CMSampleBuffer to UIImage, but how can I convert it to ffmpeg's AVPicture?

Thanks.


Answer 1:


Answering my own question:

CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// Access the raw pixel data
int width = (int)CVPixelBufferGetWidth(pixelBuffer);
int height = (int)CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddress(pixelBuffer);

// Do something with the raw pixels here
// ...

// Fill in the AVFrame while the base address is still locked:
// avpicture_fill() only points the frame at rawPixelBase, it does not copy the pixels.
AVFrame *pFrame = avcodec_alloc_frame();
avpicture_fill((AVPicture *)pFrame, rawPixelBase, PIX_FMT_RGB32, width, height);

// The pixel buffer may pad each row, so use its actual stride instead of width * 4.
pFrame->linesize[0] = (int)bytesPerRow;

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

Now pFrame points at the contents of the sample buffer, which uses the pixel format kCVPixelFormatType_32BGRA. (PIX_FMT_RGB32 resolves to BGRA on little-endian devices such as the iPhone, so the two formats match.)
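
As a rough follow-up sketch for the H.264 step (not verified, and not part of the code above): most libav H.264 encoders expect YUV420P input, so the BGRA frame typically has to be converted with libswscale before encoding. codecCtx below is a hypothetical, already-configured AVCodecContext.

#include <libswscale/swscale.h>

// Allocate a destination frame in the encoder's pixel format (assumed YUV420P here)
AVFrame *yuvFrame = avcodec_alloc_frame();
avpicture_alloc((AVPicture *)yuvFrame, PIX_FMT_YUV420P, width, height);

// Convert BGRA -> YUV420P; pFrame->linesize[0] carries the pixel buffer's real stride
struct SwsContext *swsCtx = sws_getContext(width, height, PIX_FMT_BGRA,
                                           width, height, PIX_FMT_YUV420P,
                                           SWS_FAST_BILINEAR, NULL, NULL, NULL);
sws_scale(swsCtx, (const uint8_t * const *)pFrame->data, pFrame->linesize,
          0, height, yuvFrame->data, yuvFrame->linesize);
sws_freeContext(swsCtx);

// yuvFrame can now be handed to the encoder,
// e.g. avcodec_encode_video(codecCtx, outbuf, outbuf_size, yuvFrame);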

This solved my issue. Thanks.



Source: https://stackoverflow.com/questions/4499160/how-to-convert-cmsamplebuffer-uiimage-into-ffmpegs-avpicture
