iOS4: how do I use a video file as an OpenGL texture?

Submitted by 我的梦境 on 2019-11-26 10:19:07

Question


I'm trying to display the contents of a video file (let's just say without the audio for now) onto a UV-mapped 3D object in OpenGL. I've done a fair bit in OpenGL but have no idea where to begin with video file handling, and most of the examples out there seem to be about getting video frames from cameras, which is not what I'm after.

At the moment I feel that if I can get individual frames of the video as CGImageRef I'd be set, so I'm wondering how to do this? Perhaps there are even better ways to do this? Where should I start, and what's the most straightforward file format for video playback on iOS? .mov?


Answer 1:


Apologies; typing on an iPhone so I'll be a little brief.

Create an AVURLAsset with the URL of your video - which can be a local file URL if you like. Anything QuickTime can do is fine, so MOV or M4V in H.264 is probably the best source.

Query the asset for tracks of type AVMediaTypeVideo. You should get just one unless your source video has multiple camera angles or something like that, so just taking objectAtIndex:0 should give you the AVAssetTrack you want.

Use that to create an AVAssetReaderTrackOutput. You'll probably want to specify kCVPixelFormatType_32BGRA as the pixel format.

Create an AVAssetReader using the asset; attach the asset reader track output as an output, then call startReading.

Henceforth you can call copyNextSampleBuffer on the track output to get new CMSampleBuffers, putting you in the same position as if you were taking input from the camera. So you can lock that to get at pixel contents and push those to OpenGL via Apple's BGRA extension.
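The steps above can be sketched as follows in Swift. The bundled file name "movie.m4v" is a hypothetical example, error handling is abbreviated with try!, and GL_BGRA_EXT comes from Apple's BGRA texture-format extension; treat this as an outline under those assumptions rather than drop-in code.

```swift
import AVFoundation
import OpenGLES

// Create the asset from a local file URL ("movie.m4v" is hypothetical).
let url = Bundle.main.url(forResource: "movie", withExtension: "m4v")!
let asset = AVURLAsset(url: url)

// Ordinary source material has exactly one video track.
let videoTrack = asset.tracks(withMediaType: .video)[0]

// Ask for BGRA pixel buffers so frames can go straight to GL.
let output = AVAssetReaderTrackOutput(
    track: videoTrack,
    outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                     kCVPixelFormatType_32BGRA])
let reader = try! AVAssetReader(asset: asset)
reader.add(output)
reader.startReading()

// Pull frames one at a time, exactly as you would from a camera
// capture output, locking each pixel buffer and uploading it via
// the BGRA extension.
while reader.status == .reading,
      let sample = output.copyNextSampleBuffer(),
      let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GL_RGBA,
                 GLsizei(CVPixelBufferGetWidth(pixelBuffer)),
                 GLsizei(CVPixelBufferGetHeight(pixelBuffer)),
                 0, GLenum(GL_BGRA_EXT), GLenum(GL_UNSIGNED_BYTE),
                 CVPixelBufferGetBaseAddress(pixelBuffer))
    CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)
    // ... draw the UV-mapped object with the updated texture ...
}
```

Note that copyNextSampleBuffer delivers frames as fast as you ask for them; for playback at the correct speed you would pace the loop using each buffer's presentation timestamp.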




Answer 2:


You're probably going to have to use a player layer and flatten its contents into a bitmap context. See the documentation for AVPlayerLayer. The performance might be very poor though.
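A sketch of that fallback, assuming a hypothetical bundled "movie.m4v": play the movie in an AVPlayerLayer and periodically flatten the layer into a bitmap context, then hand the resulting image's pixels to OpenGL as texture data.

```swift
import AVFoundation
import UIKit

// Set up playback into an offscreen AVPlayerLayer.
let url = Bundle.main.url(forResource: "movie", withExtension: "m4v")!
let player = AVPlayer(url: url)
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = CGRect(x: 0, y: 0, width: 640, height: 480)
player.play()

// On a timer or display link, snapshot the layer into a bitmap
// context; the resulting UIImage can be decomposed into pixel data
// for a GL texture upload.
UIGraphicsBeginImageContextWithOptions(playerLayer.frame.size, true, 1)
playerLayer.render(in: UIGraphicsGetCurrentContext()!)
let frameImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
```

As the answer warns, this round-trip through Core Graphics on every frame is expensive, which is why the AVAssetReader route is usually preferable.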



Source: https://stackoverflow.com/questions/5621627/ios4-how-do-i-use-video-file-as-an-opengl-texture
