Using CIFilter with AVFoundation (iOS)


Question


I am trying to apply filters to a video composition created with AVFoundation on iOS (filters could be, eg, blur, pixelate, sepia, etc). I need to both apply the effects in real-time and be able to render the composite video out to disk, but I'm happy to start with just one or the other.

Unfortunately, I can't seem to figure this one out. Here's what I can do:

  • I can add an animation layer to the UIView that's playing the movie, but it's not clear whether I can process the incoming video frames that way.
  • I can add an array of CIFilters to the AVPlayerLayer, but it turns out these are ignored on iOS (they only work on Mac OS X).
  • I can add an AVVideoCompositionCoreAnimationTool to the AVVideoComposition, but I'm not sure it would accomplish video processing (rather than animation), and it crashes with a message about not being designed for real-time playback anyway. I believe it's the solution for rendering animation when rendering to disk (a sketch of that export wiring follows this list).
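
For reference, the animation tool is typically wired up like this for disk export. This is a minimal sketch, not code from the question; renderSize, parentLayer, and videoLayer are placeholder names:

// Hypothetical export-time setup; renderSize, parentLayer, videoLayer are placeholder names.
CGSize renderSize = CGSizeMake(1280, 720);
CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, renderSize.width, renderSize.height);
videoLayer.frame = parentLayer.frame;
[parentLayer addSublayer:videoLayer]; // the video frames are rendered into videoLayer
// ...add overlay/animation sublayers to parentLayer here...

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = renderSize;
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                            inLayer:parentLayer];

Note that this composites Core Animation layers over the frames; it never hands you the frames themselves, which is why it covers animation but not filtering.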

Other apps do this (I think), so I assume I'm missing something obvious.

note: I've looked into GPUImage and I'd love to use it, but it just doesn't work well with movies, especially movies with audio. See for example:

  • GPUImage filters in runtime on AVMutableComposition
  • https://github.com/BradLarson/GPUImage/issues/1339

Answer 1:


You could implement a custom compositor by adopting the AVVideoCompositing protocol; AVFoundation hands your compositor an AVAsynchronousVideoCompositionRequest for each output frame.

// Inside -startVideoCompositionRequest: of your custom compositor
CVPixelBufferRef pixelBuffer = [request sourceFrameByTrackID:trackID];
CIImage *theImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIImage *motionBlurredImage = [[CIFilter filterWithName:@"CIMotionBlur"
                                          keysAndValues:kCIInputImageKey, theImage, nil]
                                  valueForKey:kCIOutputImageKey];
CIContext *someCIContext = [CIContext contextWithEAGLContext:eaglContext];
[someCIContext render:motionBlurredImage toCVPixelBuffer:outputBuffer];

Then render the pixel buffer using OpenGL as described in Apple's documentation. This lets you implement any number of transitions or filters. You can then set the AVAssetExportSession.videoComposition and export the composited video to disk.
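
To flesh that out, the compositor is an object adopting AVVideoCompositing that you assign to the composition's customVideoCompositorClass. A minimal skeleton might look like this; FilterCompositor is a hypothetical name, and the Core Image work from the snippet above goes inside startVideoCompositionRequest::

#import <AVFoundation/AVFoundation.h>

// Hypothetical minimal compositor; FilterCompositor is a placeholder name.
@interface FilterCompositor : NSObject <AVVideoCompositing>
@end

@implementation FilterCompositor

// Pixel format AVFoundation should deliver source frames in.
- (NSDictionary *)sourcePixelBufferAttributes {
    return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

// Pixel format of the frames we hand back.
- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext {
    // Cache newRenderContext here if you need its size or pixel buffer pool later.
}

- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
    CMPersistentTrackID trackID = [request.sourceTrackIDs[0] intValue];
    CVPixelBufferRef sourceBuffer = [request sourceFrameByTrackID:trackID];
    CVPixelBufferRef outputBuffer = [request.renderContext newPixelBuffer];
    // ...filter sourceBuffer into outputBuffer with Core Image, as in the snippet above...
    [request finishWithComposedVideoFrame:outputBuffer];
    CVPixelBufferRelease(outputBuffer);
}

@end

Then point the composition at it, e.g. videoComposition.customVideoCompositorClass = [FilterCompositor class]; the same video composition works for both AVPlayerItem playback and AVAssetExportSession export.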




Answer 2:


You can read an AVComposition (it's an AVAsset subclass) with AVAssetReader. Get the pixel buffers, pass them to a CIFilter (set up so that it renders on the GPU, with no color management, etc.), and render the result to the screen or to an output buffer, depending on your needs. I don't think a blur can be achieved in real time unless you go directly through the GPU.
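
A rough sketch of that reading loop, assuming composition is your AVComposition and eaglContext is an existing EAGLContext; CISepiaTone is just an example filter:

// Hypothetical reading loop; composition and eaglContext are assumed to exist.
NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:composition error:&error];

NSDictionary *settings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVAssetTrack *videoTrack = [[composition tracksWithMediaType:AVMediaTypeVideo] firstObject];
AVAssetReaderTrackOutput *output = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                                                               outputSettings:settings];
[reader addOutput:output];
[reader startReading];

// Create the context without color management so rendering stays on the GPU.
CIContext *context = [CIContext contextWithEAGLContext:eaglContext
                                               options:@{ kCIContextWorkingColorSpace : [NSNull null] }];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];

CMSampleBufferRef sampleBuffer = NULL;
while ((sampleBuffer = [output copyNextSampleBuffer])) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    [filter setValue:[CIImage imageWithCVPixelBuffer:pixelBuffer] forKey:kCIInputImageKey];
    CIImage *filtered = filter.outputImage;
    // Draw `filtered` to the screen here, or render it into an output pixel buffer with `context`.
    CFRelease(sampleBuffer);
}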

You can read about applying CIFilters to video in the "Applying a Filter to Video" section of the Core Image Programming Guide:

https://developer.apple.com/library/ios/documentation/graphicsimaging/conceptual/CoreImaging/ci_tasks/ci_tasks.html#//apple_ref/doc/uid/TP30001185-CH3-BAJDAHAD



Source: https://stackoverflow.com/questions/20627145/using-cifilter-with-avfoundation-ios
