Question
I'm working with the AVFoundation framework using Swift. Can someone point me to a tutorial, links, or a code snippet for applying video filters similar to what Instagram does?
I'm working on an iOS video-creator type of app that records video so that I can later apply a filter to it.
Thanks in advance.
Answer 1:
You should check out Apple's sample project RosyWriter. It's a good example of what you want to achieve.
https://developer.apple.com/library/prerelease/ios/samplecode/RosyWriter/Introduction/Intro.html#//apple_ref/doc/uid/DTS40011110
Additionally, you could check out the GLImageProcessing sample project.
https://developer.apple.com/library/ios/samplecode/GLImageProcessing/Introduction/Intro.html
Hope that helps!
Answer 2:
Since iOS 9.0 you can use AVVideoComposition to apply a Core Image filter to a video frame by frame.
import AVFoundation
import CoreImage

// `asset` is the AVAsset to filter, e.g. loaded from the recorded movie file.
let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
    // Clamp to avoid blurring transparent pixels at the image edges
    let source = request.sourceImage.clampedToExtent()
    filter.setValue(source, forKey: kCIInputImageKey)
    // Vary filter parameters based on video timing
    let seconds = CMTimeGetSeconds(request.compositionTime)
    filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)
    // Crop the blurred output to the bounds of the original image
    let output = filter.outputImage!.cropped(to: request.sourceImage.extent)
    // Provide the filter output to the composition
    request.finish(with: output, context: nil)
})
Now we can create an AVPlayerItem from the asset created earlier, attach the composition, and play it with an AVPlayer:
let playerItem = AVPlayerItem(asset: asset)
playerItem.videoComposition = composition
let player = AVPlayer(playerItem: playerItem)
player.play()
The Core Image filter is applied in real time, frame by frame. You can also export the filtered video using the AVAssetExportSession class.
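As a rough sketch (not part of the original answer), an export using the same `asset` and `composition` might look like the following; the destination `outputURL` is a hypothetical temporary file chosen here for illustration:

import AVFoundation

// Hypothetical destination for the exported movie.
let outputURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("filtered.mov")

if let exporter = AVAssetExportSession(asset: asset,
                                       presetName: AVAssetExportPresetHighestQuality) {
    exporter.outputURL = outputURL
    exporter.outputFileType = .mov
    // Attach the filtering composition so each exported frame runs through the CIFilter.
    exporter.videoComposition = composition
    exporter.exportAsynchronously {
        switch exporter.status {
        case .completed:
            print("Export finished: \(outputURL)")
        case .failed, .cancelled:
            print("Export failed: \(String(describing: exporter.error))")
        default:
            break
        }
    }
}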
Here is a great introduction from WWDC 2015: https://developer.apple.com/videos/play/wwdc2015/510/?time=1222
Source: https://stackoverflow.com/questions/33689050/how-to-create-and-add-video-filter-like-instagram-using-avfoundation-framework