I'm looking for the fastest way to decode a local MPEG-4 video's frames on the iPhone. I'm simply interested in the luminance values of the pixels in every 10th frame.
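To be concrete about what I need: if the decode path hands me frames as CoreVideo pixel buffers in a bi-planar Y'CbCr format (e.g. kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange), then all I'd have to do per sampled frame is walk the Y plane, roughly like this (an untested sketch; ProcessLuma is just a placeholder name):

#include <CoreVideo/CoreVideo.h>
#include <stdint.h>

static void ProcessLuma(CVPixelBufferRef pixelBuffer)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    const uint8_t *lumaBase = (const uint8_t*)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0); // plane 0 is Y'
    size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    size_t width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);

    for(size_t y = 0; y < height; ++ y) {
        const uint8_t *row = lumaBase + y * bytesPerRow; // rows may be padded, hence bytesPerRow
        for(size_t x = 0; x < width; ++ x) {
            uint8_t luminance = row[x]; // 8-bit Y' sample of pixel (x, y)
            // ... use luminance here ...
            (void)luminance;
        }
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}

No RGB conversion would be needed at all in that case; the question is really about the fastest way to get decoded frames in the first place.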
Assuming the bottleneck of your application is the code that converts the video frames to a displayable format (such as RGB), you might be interested in some code I shared that converts one .mp4 frame (encoded as YV12) to RGB using Qt and OpenGL. It uploads the frame to the GPU and runs a GLSL fragment shader that does the YV12-to-RGB conversion, so the result can be displayed in a QImage.
static const char *p_s_fragment_shader =
"#extension GL_ARB_texture_rectangle : enable\n"
"uniform sampler2DRect tex;"
"uniform float ImgHeight, chromaHeight_Half, chromaWidth;"
"void main()"
"{"
" vec2 t = gl_TexCoord[0].xy;" // get texcoord from fixed-function pipeline
" float CbY = ImgHeight + floor(t.y / 4.0);"
" float CrY = ImgHeight + chromaHeight_Half + floor(t.y / 4.0);"
" float CbCrX = floor(t.x / 2.0) + chromaWidth * floor(mod(t.y, 2.0));"
" float Cb = texture2DRect(tex, vec2(CbCrX, CbY)).x - .5;"
" float Cr = texture2DRect(tex, vec2(CbCrX, CrY)).x - .5;"
" float y = texture2DRect(tex, t).x;" // redundant texture read optimized away by texture cache
" float r = y + 1.28033 * Cr;"
" float g = y - .21482 * Cb - .38059 * Cr;"
" float b = y + 2.12798 * Cb;"
" gl_FragColor = vec4(r, g, b, 1.0);"
"}"