core-image

How to apply a Vignette CIFilter to a live camera feed in iOS?

Submitted by 孤人 on 2019-11-26 08:34:45
Question: While trying to apply a simple vignette filter to the raw camera feed of an iPhone 6, with the help of Metal and Core Image, I see a lot of lag between the frames being processed and rendered in an MTKView. The approach I have followed (in MetalViewController.swift) is:

1. Get raw camera output using AVCaptureVideoDataOutputSampleBufferDelegate.
2. Convert CMSampleBuffer > CVPixelBuffer > CGImage.
3. Create an MTLTexture from this CGImage.

Steps 2 and 3 are inside the method named:
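The excerpt is cut off above. As a general illustration (not the poster's MetalViewController), a common way to avoid this kind of lag is to skip the CMSampleBuffer > CGImage > MTLTexture hops entirely: wrap the CVPixelBuffer in a CIImage and let a Metal-backed CIContext render the CIVignette output straight into the MTKView's drawable. A minimal sketch follows; the class name, property names, and filter parameters are assumptions, and frame synchronization is deliberately simplified.

Swift:

import AVFoundation
import CoreImage
import MetalKit
import UIKit

// Assumes the MTKView is configured with:
//   mtkView.device = renderer.device
//   mtkView.framebufferOnly = false      // so CIContext can write into the drawable texture
//   mtkView.enableSetNeedsDisplay = true
//   mtkView.isPaused = true
final class VignetteRenderer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate, MTKViewDelegate {
    let device = MTLCreateSystemDefaultDevice()!
    lazy var commandQueue = device.makeCommandQueue()!
    lazy var ciContext = CIContext(mtlDevice: device)
    let vignette = CIFilter(name: "CIVignette")!
    var currentImage: CIImage?          // simplified: no locking between capture and render queues
    weak var mtkView: MTKView?

    // Called on the capture queue for every camera frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        vignette.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
        vignette.setValue(1.0, forKey: kCIInputIntensityKey)
        vignette.setValue(2.0, forKey: kCIInputRadiusKey)
        currentImage = vignette.outputImage
        DispatchQueue.main.async { self.mtkView?.setNeedsDisplay() }
    }

    // MTKViewDelegate: render the latest filtered frame into the current drawable.
    func draw(in view: MTKView) {
        guard let image = currentImage,
              let drawable = view.currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }
        ciContext.render(image,
                         to: drawable.texture,
                         commandBuffer: commandBuffer,
                         bounds: CGRect(origin: .zero, size: view.drawableSize),
                         colorSpace: CGColorSpaceCreateDeviceRGB())
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}
}

Because the filter graph is executed lazily by the CIContext during render, no intermediate CGImage or extra texture copy is created per frame, which is usually where the per-frame stutter comes from in the CGImage-based pipeline.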

Are the Core Image filters in iOS 5.0 fast enough for realtime video processing?

Submitted by 我们两清 on 2019-11-26 05:29:08
Question: Now that Apple has ported the Core Image framework over to iOS 5.0, I'm wondering: is Core Image fast enough to apply live filters and effects to camera video? Also, what would be a good starting point to learn the Core Image framework for iOS 5.0?

Answer 1: Now that Core Image has been out on iOS for a while, we can talk about some hard performance numbers. I created a benchmark application as part of the testing for my GPUImage framework, and profiled the performance of raw CPU-based filters
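The answer's actual numbers are cut off above. As a rough illustration of the kind of per-frame measurement it describes, the sketch below times a single GPU-backed Core Image filter pass over one frame; the filter choice, intensity, and the use of a modern Metal-backed CIContext (rather than the iOS 5-era EAGL-backed one) are assumptions, not taken from the answer.

Swift:

import CoreImage
import CoreVideo
import Metal
import QuartzCore

// Returns the wall-clock time for one GPU-backed Core Image filter pass over a frame.
// For a real benchmark, create the CIContext once and reuse it across frames,
// since context creation is far more expensive than a single filter pass.
func timeSepiaPass(on pixelBuffer: CVPixelBuffer) -> CFTimeInterval {
    let ciContext = CIContext(mtlDevice: MTLCreateSystemDefaultDevice()!)
    let sepia = CIFilter(name: "CISepiaTone")!
    sepia.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
    sepia.setValue(0.8, forKey: kCIInputIntensityKey)

    let start = CACurrentMediaTime()
    // createCGImage forces the lazy filter graph to actually execute.
    _ = ciContext.createCGImage(sepia.outputImage!, from: sepia.outputImage!.extent)
    return CACurrentMediaTime() - start
}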

Creating a blurring overlay view

Submitted by 夙愿已清 on 2019-11-26 01:29:16
Question: In the Music app of the new iOS, we can see an album cover behind a view that blurs it. How can something like that be accomplished? I've read the documentation, but did not find anything there.

Answer 1: You can use UIVisualEffectView to achieve this effect. This is a native API that has been fine-tuned for performance and great battery life, plus it's easy to implement.

Swift:

// only apply the blur if the user hasn't disabled transparency effects
if !UIAccessibility.isReduceTransparencyEnabled
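The answer's code is truncated above. A minimal completion in the same spirit, assuming it runs inside a UIViewController (e.g. in viewDidLoad) with UIKit imported, and with the blur style chosen as an illustrative assumption:

Swift:

if !UIAccessibility.isReduceTransparencyEnabled {
    view.backgroundColor = .clear
    let blurEffect = UIBlurEffect(style: .dark)   // style is an assumption; pick what fits your design
    let blurEffectView = UIVisualEffectView(effect: blurEffect)
    blurEffectView.frame = view.bounds
    // Keep the overlay sized with the view on rotation or resize.
    blurEffectView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    view.addSubview(blurEffectView)
} else {
    // Fall back to a plain background when Reduce Transparency is enabled.
    view.backgroundColor = .black
}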