core-image

What's up with CITemperatureAndTint having vector inputs?

大城市里の小女人 · Submitted on 2019-11-29 02:17:01

OK, so the Core Image filter CITemperatureAndTint has two inputs, neutral and targetNeutral. My biggest issue is that both are two-component vectors, meaning each takes two numeric values. I would expect a single value, say from 2500 to 10000. What is the second component of each vector for?

The essential purpose of a temperature-and-tint adjustment is to correct the white balance of a captured image: to account for the ambient illumination of the scene and adjust colors so that the image looks more like it was shot in "white" light (roughly 6500 K). Temperature relates to the …
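For what it's worth, the two components are a (temperature, tint) pair: the first value is a color temperature in kelvin (default 6500) and the second a tint offset along the green–magenta axis (default 0). A minimal Python sketch of that interpretation, under the assumption that the filter maps the white point described by `neutral` onto the one described by `targetNeutral` (the function here is illustrative, not Apple's implementation):

```python
def temperature_tint_shift(neutral=(6500.0, 0.0), target_neutral=(6500.0, 0.0)):
    """Interpret CITemperatureAndTint's two CIVectors as (kelvin, tint) pairs.

    Returns the (temperature, tint) delta the filter effectively applies
    when correcting from `neutral` to `target_neutral` (sketch only).
    """
    temperature_delta = target_neutral[0] - neutral[0]  # kelvin shift
    tint_delta = target_neutral[1] - neutral[1]         # green (-) <-> magenta (+)
    return temperature_delta, tint_delta

# Correct a 6500 K-neutral image toward a 5000 K target, tint untouched:
print(temperature_tint_shift((6500.0, 0.0), (5000.0, 0.0)))  # (-1500.0, 0.0)
```

The second component exists because white balance is two-dimensional: temperature alone moves along the blue–amber axis, and tint handles the perpendicular green–magenta error that fluorescent lighting in particular introduces.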

Applying a CIFilter to a CALayer

↘锁芯ラ · Submitted on 2019-11-29 00:25:47

Core Image filters are now available in iOS 5, and I'm trying to apply one to a CALayer, the way you would on the Mac. Here's my code:

```objc
CALayer *myCircle = [CALayer layer];
myCircle.bounds = CGRectMake(0, 0, 30, 30);
myCircle.position = CGPointMake(100, 100);
myCircle.cornerRadius = 15;
myCircle.borderColor = [UIColor whiteColor].CGColor;
myCircle.borderWidth = 2;
myCircle.backgroundColor = [UIColor whiteColor].CGColor;

CIFilter *blurFilter = [CIFilter filterWithName:@"CIDiscBlur"];
[blurFilter setDefaults];
[blurFilter setValue:[NSNumber numberWithFloat:5.0f] forKey:@"inputRadius"];
[myCircle setFilters:…
```

iOS White point/white balance adjustment examples/suggestions

老子叫甜甜 · Submitted on 2019-11-28 23:29:12

I am trying to change the white point/white balance programmatically. This is what I want to accomplish:

- Choose a (random) pixel from the image
- Get the color of that pixel
- Transform the image so that all pixels of that color are mapped to white, and all other colors are shifted to match

I have accomplished the first two steps, but the third is not really working out. At first I thought that, per Apple's documentation, CIWhitePointAdjust should accomplish exactly …
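The third step is, at its core, a diagonal (von Kries-style) channel scaling: divide every channel by the picked pixel's value so that the picked color lands on pure white. A minimal Python sketch of that math, assuming RGB values normalized to 0–1 (this is the underlying idea, not the CIWhitePointAdjust implementation):

```python
def white_balance(pixels, picked):
    """Scale channels so that the `picked` color maps to (1.0, 1.0, 1.0).

    pixels: list of (r, g, b) tuples in 0..1; picked: the reference color.
    Per-channel gain is 1/picked, guarded against division by zero,
    and results are clamped back into 0..1.
    """
    gains = tuple(1.0 / max(c, 1e-6) for c in picked)
    return [tuple(min(1.0, c * g) for c, g in zip(px, gains)) for px in pixels]

# The picked pixel itself becomes white; a half-intensity pixel of the
# same hue lands at mid-gray:
balanced = white_balance([(0.5, 1.0, 0.8), (0.25, 0.5, 0.4)],
                         picked=(0.5, 1.0, 0.8))
```

One design caveat this sketch makes visible: any channel brighter than the picked color clips at 1.0, which is why picking a dark pixel as the "white" reference blows out the rest of the image.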

How can I map Photoshop's level adjustment to a Core Image filter?

泪湿孤枕 · Submitted on 2019-11-28 20:55:49

I'm mapping several Photoshop adjustments to CIFilter; the only one I'm having trouble with is this Levels adjustment. Which CIFilter (or combination of filters) would let me use the 16, 1.73, 239 and 39/245 values in the first example, or the 31, 1.25, 255 and 30/255 values in the second? I believe this is a kind of shadow/black-and-white level adjustment. Any help appreciated.

jakber: By adapting the formula from this page, http://http.developer.nvidia.com/GPUGems/gpugems_ch22.html , I believe you can do this using a combination of CIColorMatrix, CIGammaAdjust and another CIColorMatrix. Let's …
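The GPU Gems formula the answer adapts combines an input-range remap, a gamma (power) step, and an output-range remap, which matches the suggested ColorMatrix → GammaAdjust → ColorMatrix chain. A Python sketch using the question's first set of values (16, 1.73, 239, output 39–245), assuming everything is on a 0–255 scale:

```python
def levels(v, in_black, gamma, in_white, out_black, out_white):
    """Photoshop-style Levels on one channel value v (0..255).

    Step 1 (first CIColorMatrix):  remap [in_black, in_white] to [0, 1].
    Step 2 (CIGammaAdjust):        apply the midtone power curve.
    Step 3 (second CIColorMatrix): remap [0, 1] to [out_black, out_white].
    """
    x = (v - in_black) / (in_white - in_black)
    x = min(max(x, 0.0), 1.0)      # clamp, as the GPU pipeline would
    x = x ** (1.0 / gamma)         # gamma > 1 brightens midtones
    return x * (out_white - out_black) + out_black

print(levels(16, 16, 1.73, 239, 39, 245))   # 39.0  (input black -> output black)
print(levels(239, 16, 1.73, 239, 39, 245))  # 245.0 (input white -> output white)
```

Note the clamp between the first remap and the power step: without it, inputs below `in_black` would feed a negative base into the fractional exponent.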

Using GPUImage to Recreate iOS 7 Glass Effect

你离开我真会死。 · Submitted于 is translated: Submitted on 2019-11-28 17:06:45

I am trying to recreate the iOS 7 style glass effect in my app by applying image effects to a screenshot of an MKMapView. This UIImage category, provided by Apple, is what I am using as a baseline. The method desaturates the source image, applies a tint color, and blurs it heavily using these input values:

```objc
[image applyBlurWithRadius:10.0
                 tintColor:[UIColor colorWithRed:229/255.0f green:246/255.0f blue:255/255.0f alpha:0.33]
     saturationDeltaFactor:0.66
                 maskImage:nil];
```

This produces the effect I am looking for, but takes way too long: between 0.3 and 0.5 seconds to render on an iPhone 4. I would like to …
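Independent of which library does the rendering, the per-pixel math of the desaturate-and-tint part is simple. A Python sketch of one pixel, assuming Rec. 709 luma weights and a plain source-over tint blend (the exact constants inside Apple's category may differ):

```python
def desaturate_and_tint(rgb, sat_delta=0.66,
                        tint=(229/255, 246/255, 255/255), tint_alpha=0.33):
    """Desaturate toward luminance, then composite a translucent tint on top."""
    r, g, b = rgb
    lum = 0.2126 * r + 0.7152 * g + 0.0722 * b   # Rec. 709 luma (assumption)
    # sat_delta scales the color's distance from its own luminance:
    desat = tuple(lum + sat_delta * (c - lum) for c in rgb)
    # standard source-over blend of the tint color:
    return tuple(tint_alpha * t + (1.0 - tint_alpha) * c
                 for t, c in zip(tint, desat))
```

With `sat_delta=1.0` and `tint_alpha=0.0` the pixel passes through unchanged, which is a handy sanity check; the blur is the expensive stage, and this arithmetic is not where the 0.3–0.5 s is going.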

CIFilter output image nil

独自空忆成欢 · Submitted on 2019-11-28 12:11:15

I am using Core Image and applying a sepia-tone CIFilter to my image. I run the filter once in viewDidLoad and then immediately call another function that adds the filter again. For some reason, when I try to access the output image, the app crashes and says the output image is nil. Does anyone know why this is happening? Thanks.

```swift
import UIKit

class ViewController: UIViewController {
    @IBOutlet weak var myimage: UIImageView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        let image = CIImage(image: myimage.image)
        let filter …
```

Convert RGB image to 1 channel image (black/white)

岁酱吖の · Submitted on 2019-11-28 11:48:55

How can I convert an RGB image to a 1-channel (black/white) image using iOS 5? The input image is usually a photo of a book page. The goal is to reduce the size of a photocopy by converting it to a 1-channel image.

If I understand your question, you want to apply black-and-white thresholding to the image based on each pixel's luminance. For a fast way of doing this, you could use my open source GPUImage project (which supports back to iOS 4.x) and a couple of the image processing operations it provides. In particular, the GPUImageLuminanceThresholdFilter and GPUImageAdaptiveThresholdFilter might be what you're …
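The core of a luminance threshold is two lines of math: compute a weighted luminance, then compare it to a cutoff. A per-pixel Python sketch, assuming 0–1 channel values and Rec. 709-style weights (GPUImage's shaders use similar constants, but treat the exact weights as an assumption):

```python
def luminance_threshold(r, g, b, threshold=0.5):
    """Return 1.0 (white) or 0.0 (black) for an RGB pixel in 0..1."""
    luminance = 0.2125 * r + 0.7154 * g + 0.0721 * b
    return 1.0 if luminance >= threshold else 0.0

# Dark page text goes to black; the paper background goes to white:
assert luminance_threshold(0.1, 0.1, 0.1) == 0.0
assert luminance_threshold(0.9, 0.9, 0.9) == 1.0
```

The adaptive variant replaces the fixed `threshold` with the average luminance of a neighborhood around each pixel, which is what makes it robust against the uneven lighting typical of photographed book pages.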

CIGaussianBlur and CIAffineClamp on iOS 6

故事扮演 · Submitted on 2019-11-28 11:12:07

I am trying to blur an image using Core Image on iOS 6 without getting a noticeable black border. Apple's documentation states that a CIAffineClamp filter can achieve this, but I'm not able to get an output image from the filter. Here's what I tried, but unfortunately an empty image is produced when I access [clampFilter outputImage]. If I only perform the blur, an image is produced, but with the dark inset border.

```objc
CIImage *inputImage = [[CIImage alloc] initWithCGImage:self.CGImage];
CIContext *context = [CIContext contextWithOptions:nil];
CGAffineTransform transform = …
```
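The reason for the dark border is worth making concrete: a Gaussian blur samples past the image edge, and without clamping those samples are transparent black, which gets averaged into every border pixel. A small pure-Python 1-D box-blur sketch (an illustrative stand-in for the Gaussian, not Core Image's implementation) shows the difference:

```python
def box_blur_1d(row, radius, clamp_edges):
    """Blur a row of values; out-of-range samples are either edge-clamped
    (CIAffineClamp-style) or treated as transparent black."""
    n, width = len(row), 2 * radius + 1
    out = []
    for i in range(n):
        total = 0.0
        for j in range(i - radius, i + radius + 1):
            if clamp_edges:
                j = min(max(j, 0), n - 1)        # extend the edge pixel outward
            total += row[j] if 0 <= j < n else 0.0  # otherwise: sample black
        out.append(total / width)
    return out

white = [1.0] * 5
print(box_blur_1d(white, 1, clamp_edges=True))   # stays 1.0 everywhere
print(box_blur_1d(white, 1, clamp_edges=False))  # edge values darken toward 2/3
```

This is exactly why the clamp filter goes before the blur in the chain: it makes the infinite plane around the image repeat the edge pixels instead of being black.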

Using CoreImage to filter an image results in image rotation

放肆的年华 · Submitted on 2019-11-28 09:23:17

Thanks for reading. I'm trying to use Core Image in iOS 5 to alter the appearance of an image. The problem is that the existing image appears to lose its orientation information during the process and ends up rotated by 90 degrees. Here is the code:

```objc
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *img = [CIImage imageWithCGImage:imageView.image.CGImage];
CIFilter *adjustFilter = [CIFilter filterWithName:@"CIHighlightShadowAdjust"];
[adjustFilter setDefaults];
[adjustFilter setValue:img forKey:kCIInputImageKey];
[adjustFilter setValue:[NSNumber numberWithFloat:0.3] forKey:@…
```
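The underlying cause is that UIImage carries an imageOrientation flag while CGImage stores only raw pixels; going through `imageView.image.CGImage` drops the flag, so camera photos (which are often stored rotated) come out sideways. The usual fix is to carry the flag across with `[UIImage imageWithCGImage:cgimg scale:1.0 orientation:imageView.image.imageOrientation]`. The fix-up itself is just a grid rotation; a Python sketch on a row-major pixel grid (illustrative, not the Core Graphics call):

```python
def rotate_90_cw(grid):
    """Rotate a row-major 2D pixel grid 90 degrees clockwise."""
    # reverse the rows, then transpose: the old left column becomes the top row
    return [list(row) for row in zip(*grid[::-1])]

# A 2x3 grid becomes 3x2:
print(rotate_90_cw([[1, 2, 3],
                    [4, 5, 6]]))  # [[4, 1], [5, 2], [6, 3]]
```

Which of the four rotations (0, 90, 180, 270 degrees, plus mirrored variants) is needed depends on the specific imageOrientation value of the source image.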

SKEffectNode combined with CIFilter runs out of memory

爷,独闯天下 · Submitted on 2019-11-28 06:09:42

I tried to combine an SKEffectNode with a CIFilter and a child SKSpriteNode, and while it seems to work for a few moments, the result is that all device memory is consumed and my iPad Retina (A7 GPU) just reboots. I also sometimes see "Message from debugger: Terminated due to memory issue" printed to the debugger log. The full source is on GitHub at SKEffectNodeFiltered. I am creating the filter like so:

```objc
// Pixelate CoreImage filter
CIFilter *pixellateFilter = [CIFilter filterWithName:@…
```