I am currently writing an application that reads frames from a camera, modifies them, and saves them into a video file. I'm planning to do it with ffmpeg. There's rarely any documentation on how to use it.
You can do what you require without using a library, since on Unix you can pipe raw RGBA data into an encoder. So you can do this:
In your program:
#include <stdio.h>

char myimage[640*480*4];
// read RGBA data into myimage, then write one raw frame to stdout
fwrite(myimage, 1, 640*480*4, stdout);
And in a script that runs your program:
./myprogram | \
mencoder /dev/stdin -demuxer rawvideo -rawvideo w=640:h=480:fps=30:format=rgba \
-ovc lavc -lavcopts vcodec=mpeg4:vbitrate=9000000 \
-oac copy -o output.avi
I believe you can also use ffmpeg this way, or x264. You can also start the encoder from within your program and write to a pipe, which makes the whole process about as simple as using a library; both variants are sketched below.
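An equivalent ffmpeg invocation might look like the following. This is an untested sketch: the options are real, but spellings such as -b:v vary between ffmpeg versions, so check ffmpeg -h on your build:

./myprogram | \
ffmpeg -f rawvideo -pix_fmt rgba -s 640x480 -r 30 -i - \
    -vcodec mpeg4 -b:v 9000k output.avi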
While not quite what you want, and not suitable for iPhone development, this approach has the advantage that the encoder runs as a separate process, so Unix can automatically schedule it on a second processor core.
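For the in-process variant, here is a minimal sketch using POSIX popen() (so, again, not iPhone-friendly). The ffmpeg command line is the same assumption as above, and the loop just writes blank frames where your camera data would go:

#include <stdio.h>

int main(void) {
    // start the encoder as a child process and get a pipe to its stdin
    FILE *enc = popen("ffmpeg -f rawvideo -pix_fmt rgba -s 640x480 -r 30 "
                      "-i - -vcodec mpeg4 -b:v 9000k -y output.avi", "w");
    if (!enc)
        return 1;

    static char frame[640*480*4];
    for (int i = 0; i < 300; i++) {   // e.g. 10 seconds at 30 fps
        // fill frame with RGBA pixel data here
        fwrite(frame, 1, sizeof frame, enc);
    }
    pclose(enc);                      // waits for the encoder to finish
    return 0;
}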