I am trying to develop an iPhone app that processes/filters and records video.
I have two sample apps that have aspects of what I need and am trying to combine them.
1. AVCamDemo from the WWDC10 sample code package (Apple Developer ID required), which deals with capturing/recording video.
2. Brad Larson's ColorTracking sample app, referenced here, which deals with live processing of video using OpenGL ES.
I get stuck when trying to combine the two.
What I have been trying to do is use AVCaptureVideoDataOutput and the AVCaptureVideoDataOutputSampleBufferDelegate protocol to process/filter the video frames through OpenGL ES (as in 2), and at the same time somehow use AVCaptureMovieFileOutput to record the processed video (as in 1).
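For reference, the capture side I have in mind looks roughly like this. It is only a sketch, not code from either sample: the preset, pixel format, queue name, and the `session` property are assumptions I picked, and the class is assumed to adopt AVCaptureVideoDataOutputSampleBufferDelegate.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: capture session that feeds raw BGRA frames to a delegate for OpenGL ES processing.
- (void)setupCaptureSession
{
    NSError *error = nil;
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPreset640x480;

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    [self.session addInput:cameraInput];

    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    dispatch_queue_t frameQueue = dispatch_queue_create("com.example.framequeue", NULL);
    [videoOutput setSampleBufferDelegate:self queue:frameQueue];
    [self.session addOutput:videoOutput];

    [self.session startRunning];
}

// AVCaptureVideoDataOutputSampleBufferDelegate: called once per captured frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // This is where the ColorTracking-style OpenGL ES filtering would happen:
    // upload pixelBuffer as a texture, run the shader, read the result back.
}
```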
Is this approach possible? If so, how would I need to set up the connections within the AVCaptureSession?
Or do I need to use the AVCaptureVideoDataOutputSampleBufferDelegate protocol to process/filter the video and then recombine the individual frames back into a movie myself, without using AVCaptureMovieFileOutput to save the movie file?
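By "recombine the frames into a movie" I mean something along the lines of AVAssetWriter with a pixel buffer adaptor, sketched below. The output URL, dimensions, codec settings, properties, and timestamps are all placeholders, not anything from the sample apps.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch: write the filtered frames yourself with AVAssetWriter instead of AVCaptureMovieFileOutput.
- (void)setupAssetWriterWithURL:(NSURL *)outputURL
{
    NSError *error = nil;
    self.assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];

    NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                     AVVideoWidthKey  : @640,
                                     AVVideoHeightKey : @480 };
    self.writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:videoSettings];
    self.writerInput.expectsMediaDataInRealTime = YES;

    self.pixelBufferAdaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:self.writerInput
                                                                         sourcePixelBufferAttributes:nil];
    [self.assetWriter addInput:self.writerInput];

    [self.assetWriter startWriting];
    [self.assetWriter startSessionAtSourceTime:kCMTimeZero];
}

// Called for every filtered frame, e.g. after reading the OpenGL ES result back into a CVPixelBufferRef.
- (void)appendFilteredPixelBuffer:(CVPixelBufferRef)filteredPixelBuffer atTime:(CMTime)frameTime
{
    if (self.writerInput.isReadyForMoreMediaData) {
        [self.pixelBufferAdaptor appendPixelBuffer:filteredPixelBuffer withPresentationTime:frameTime];
    }
}

// When recording stops:
// [self.writerInput markAsFinished];
// [self.assetWriter finishWriting];
```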
Any suggestions for the best approach to accomplish this are much appreciated!
http://developer.apple.com/library/ios/#qa/qa1702/_index.html
You can try this link on capturing video frames using AV Foundation; it is very helpful. I hope it resolves your problem.
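The core of that Q&A (QA1702) is converting each CMSampleBuffer the data-output delegate receives into a UIImage. A sketch from memory, assuming BGRA output; it is not copied verbatim from Apple's document:

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Sketch of the QA1702 technique: turn a BGRA sample buffer into a UIImage.
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void   *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t  bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t  width       = CVPixelBufferGetWidth(imageBuffer);
    size_t  height      = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}
```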
Source: https://stackoverflow.com/questions/5228406/capturing-video-while-processing-it-through-a-shader-on-iphone