core-image

How to determine whether an image has noise or geometric distortion?

Submitted by 送分小仙女 on 2019-12-04 09:42:59
I need to build an iPhone application that measures noise, geometric deformation, and other distortions in an image. How do I do this? I have done some image-processing work with OpenCV on the iPhone, but I don't know how to compute these parameters.

1) How do I measure the noise in an image?
2) What is geometric deformation, and how do I measure it in an image?
3) Are geometric deformation and distortion the same parameter in image-filtering terms? Are there other distortions I can measure to decide whether an image is of good quality?

Input: my image is a face captured from live video.
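On the first question: one widely used way to put a number on image noise is Immerkær's fast noise-variance estimator, which convolves the grayscale image with a Laplacian-difference mask (the mask cancels smooth image structure but passes noise) and averages the absolute response. A pure-Swift sketch on a 2-D grayscale array; the function name and the array representation are my own, not OpenCV API:

```swift
// Fast noise-sigma estimate (Immerkær, 1996): convolve the grayscale
// image with a Laplacian-difference mask and average the absolute response.
func estimateNoiseSigma(_ image: [[Double]]) -> Double {
    let h = image.count, w = image.first?.count ?? 0
    guard h >= 3, w >= 3 else { return 0 }
    // This mask annihilates constant and linear (smooth) regions,
    // so what remains is mostly noise.
    let mask: [[Double]] = [[ 1, -2,  1],
                            [-2,  4, -2],
                            [ 1, -2,  1]]
    var acc = 0.0
    for y in 1..<(h - 1) {
        for x in 1..<(w - 1) {
            var s = 0.0
            for j in -1...1 {
                for i in -1...1 {
                    s += mask[j + 1][i + 1] * image[y + j][x + i]
                }
            }
            acc += abs(s)
        }
    }
    // Scale factor from the estimator's derivation.
    return (Double.pi / 2).squareRoot() * acc / (6.0 * Double(w - 2) * Double(h - 2))
}
```

With OpenCV the same mask can be applied with `filter2D`; higher returned values indicate noisier frames. Geometric distortion, by contrast, is normally measured against a known reference (for example a checkerboard calibration target, as in camera calibration), so it generally cannot be estimated from a single face image alone.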

Memory usage keeps rising on older devices using Metal

Submitted by 徘徊边缘 on 2019-12-04 08:23:16
I use Metal and CADisplayLink to live-filter a CIImage and render it into an MTKView.

```swift
// Starting the display link
displayLink = CADisplayLink(target: self, selector: #selector(applyAnimatedFilter))
displayLink.preferredFramesPerSecond = 30
displayLink.add(to: .current, forMode: .default)

@objc func applyAnimatedFilter() {
    ...
    metalView.image = filter.applyFilter(image: ciImage)
}
```

According to the memory monitor in Xcode, memory usage is stable on an iPhone X and never goes above 100 MB, but on devices like the iPhone 6 or iPhone 6s the memory usage keeps growing until eventually the system kills the app. I […]
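A frequent cause of this pattern on older devices is that per-frame Core Image temporaries are autoreleased and only reclaimed when the run loop drains; draining an explicit autorelease pool every frame releases them immediately. A minimal, platform-independent sketch of that shape, where `renderOneFrame` is a stand-in for the real filter-and-render call:

```swift
import Foundation

// Stand-in for the per-frame filter + render work; in the real app this
// would build the filtered CIImage and assign it to the MTKView.
func renderOneFrame(_ frame: Int) -> Int {
    return frame * 2  // placeholder result
}

// Draining an explicit pool on every tick releases per-frame temporaries
// right away instead of letting them accumulate between run-loop drains.
func runLoop(frames: Int) -> [Int] {
    var results: [Int] = []
    for f in 0..<frames {
        autoreleasepool {
            results.append(renderOneFrame(f))
        }
    }
    return results
}
```

In the real app the body of `applyAnimatedFilter` would be wrapped in `autoreleasepool { ... }` the same way; it is also worth checking that no `CIContext` is being created per frame, since contexts are expensive and meant to be reused.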

iOS - Cannot process image using CIFilter

Submitted by 孤街醉人 on 2019-12-04 07:38:58
I am trying to process an image using Core Image. I created a UIImage category to do it, added the QuartzCore and CoreImage frameworks to the project, imported CoreImage/CoreImage.h, and used this code:

```objc
CIImage *inputImage = self.CIImage;
CIFilter *exposureAdjustmentFilter = [CIFilter filterWithName:@"CIExposureAdjust"];
[exposureAdjustmentFilter setDefaults];
[exposureAdjustmentFilter setValue:inputImage forKey:@"inputImage"];
[exposureAdjustmentFilter setValue:[NSNumber numberWithFloat:5.0f] forKey:@"inputEV"];
CIImage *outputImage = [exposureAdjustmentFilter valueForKey:@"outputImage"];
```

[…]
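For reference, CIExposureAdjust is documented to scale each color channel by 2^EV, so an EV of 5 multiplies values by 32 and will blow out most images; EV values near ±1 are more typical. The arithmetic can be sketched in pure Swift (the clamp to [0, 1] is my own addition for display-range values, not part of the filter):

```swift
import Foundation

// Apply an exposure adjustment of `ev` stops to an RGB triple,
// mirroring the CIExposureAdjust formula: output = input * 2^EV.
func adjustExposure(_ rgb: (Double, Double, Double), ev: Double) -> (Double, Double, Double) {
    let gain = pow(2.0, ev)
    func clamp(_ v: Double) -> Double { min(max(v, 0.0), 1.0) }
    return (clamp(rgb.0 * gain), clamp(rgb.1 * gain), clamp(rgb.2 * gain))
}
```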

CVPixelBuffer to CIImage always returning nil

Submitted by 我们两清 on 2019-12-04 05:15:23
I am trying to convert a pixel buffer extracted from an AVPlayerItemVideoOutput to a CIImage, but I always get nil. The code:

```objc
if ([videoOutput_ hasNewPixelBufferForItemTime:player_.internalPlayer.currentItem.currentTime]) {
    CVPixelBufferRef pixelBuffer = [videoOutput_ copyPixelBufferForItemTime:player_.internalPlayer.currentItem.currentTime
                                                         itemTimeForDisplay:nil];
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer]; // image is always nil
    CIFilter *filter = [FilterCollection filterSepiaForImage:image];
    image = filter.outputImage;
    CIContext *context = [CIContext contextWithOptions:nil];
    […]
```
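One thing worth ruling out here: `copyPixelBufferForItemTime:` can return NULL when no new frame is actually available, and the buffer it does return is a +1-retained object the caller owns and must release. The guard-and-release shape can be sketched generically in Swift; `MockBuffer` below is purely illustrative, standing in for a `CVPixelBufferRef`:

```swift
// Illustrative stand-in for a Core Foundation-style buffer that the
// caller owns and must release exactly once.
final class MockBuffer {
    private(set) var released = false
    func release() { released = true }
}

// Stand-in for copyPixelBufferForItemTime:: returns nil when no new
// frame is available, otherwise a caller-owned buffer.
func copyBuffer(available: Bool) -> MockBuffer? {
    return available ? MockBuffer() : nil
}

// Guard against the nil case before wrapping the buffer in an image,
// and balance the copy with a release once it has been consumed.
func processFrame(available: Bool) -> Bool {
    guard let buffer = copyBuffer(available: available) else {
        return false  // nothing to render this tick
    }
    defer { buffer.release() }
    // ... create the CIImage from `buffer` here ...
    return true
}
```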

How can I fix a Core Image's CILanczosScaleTransform filter border artifact?

Submitted by 时光毁灭记忆、已成空白 on 2019-12-04 05:06:57
I want to implement an image-downscaling algorithm for iOS. After reading that Core Image's CILanczosScaleTransform was a great fit for it, I implemented it the following way:

```swift
public func resizeImage(_ image: UIImage, targetWidth: CGFloat) -> UIImage? {
    assert(targetWidth > 0.0)
    let scale = Double(targetWidth) / Double(image.size.width)
    guard let ciImage = CIImage(image: image) else {
        fatalError("Couldn't create CIImage from image in input")
    }
    guard let filter = CIFilter(name: […]
```
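The usual fix for the Lanczos border artifact is to clamp the image to an infinite extent before scaling and crop the result back afterwards (for example with `clampedToExtent()` followed by a crop to the scaled extent), so the Lanczos sampling window never reads transparent pixels past the edge. The scale and output-size arithmetic itself is plain math and can be checked in isolation; the function name below is my own:

```swift
// Compute the uniform scale factor and the resulting integral pixel
// size for a requested target width, as the resize function above does.
func scaledSize(width: Double, height: Double, targetWidth: Double) -> (scale: Double, w: Int, h: Int) {
    precondition(targetWidth > 0 && width > 0 && height > 0)
    let scale = targetWidth / width
    // Round to the nearest pixel rather than truncating.
    return (scale, Int((width * scale).rounded()), Int((height * scale).rounded()))
}
```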

“unrecognized selector” when attempting to access CIFilter's outputImage

Submitted by ぃ、小莉子 on 2019-12-04 03:58:14
I'm experimenting with Core Image (on OS X 10.7.3) for the first time and am running into a brick wall. I'm certain this is something silly I'm doing, and I just need someone more familiar with the framework to point it out. Consider the following code (let's stipulate that imageURL is a valid file URL pointing to a JPG on disk):

```objc
CIImage *inputImage = [CIImage imageWithContentsOfURL:imageURL];
CIFilter *filter = [CIFilter filterWithName:@"CIAreaAverage"
                              keysAndValues:kCIInputImageKey, […]
```
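For context on what the filter should produce: CIAreaAverage reduces a region of interest to a single pixel holding the mean color of that region. The reduction it performs can be sketched per channel in pure Swift over a grayscale region (types and names are my own):

```swift
// Average the values inside a rectangular region of a grayscale image:
// the per-channel reduction that CIAreaAverage performs.
func areaAverage(_ image: [[Double]], x: Int, y: Int, width: Int, height: Int) -> Double {
    precondition(width > 0 && height > 0)
    var sum = 0.0
    for row in y..<(y + height) {
        for col in x..<(x + width) {
            sum += image[row][col]
        }
    }
    return sum / Double(width * height)
}
```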

CITemperatureAndTint for image in iOS

Submitted by 冷暖自知 on 2019-12-04 03:02:27
Is there any sample code or an example for CITemperatureAndTint? I have read its documentation, but I need an example to implement it.

Bala:

```objc
CIFilter *yourFilter = [CIFilter filterWithName:@"CITemperatureAndTint"];
[yourFilter setValue:yourInputImage forKey:@"inputImage"];
// inputNeutral default value: [6500, 0]; identity: [6500, 0]
[yourFilter setValue:[CIVector vectorWithX:6500 Y:500] forKey:@"inputNeutral"];
// inputTargetNeutral default value: [6500, 0]; identity: [6500, 0]
[yourFilter setValue:[CIVector vectorWithX:1000 Y:630] forKey:@"inputTargetNeutral"];
CIImage *resultImage = [yourFilter valueForKey:@"outputImage"];
```

Setting UIImageView content mode after applying a CIFilter

Submitted by 拈花ヽ惹草 on 2019-12-04 01:42:52
Thanks for looking. Here's my code:

```objc
CIImage *result = _vignette.outputImage;
self.mainImageView.image = nil;
//self.mainImageView.contentMode = UIViewContentModeScaleAspectFit;
self.mainImageView.image = [UIImage imageWithCIImage:result];
self.mainImageView.contentMode = UIViewContentModeScaleAspectFit;
```

Here _vignette is a correctly set-up filter, and the effect is applied to the image correctly. I'm using a source image with a resolution of 500x375. My image view has almost the iPhone screen's resolution, so to avoid stretching I'm using aspect fit. But after applying the effect, when I'm assigning the […]
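As an aside, the scaling that UIViewContentModeScaleAspectFit applies is easy to state: scale by the smaller of the two width/height ratios so the whole image fits inside the view. A pure-Swift sketch of that computation (names are my own):

```swift
// Compute the size of an image scaled to fit inside a bounding box
// while preserving aspect ratio, as aspect-fit content mode does.
func aspectFitSize(image: (w: Double, h: Double), box: (w: Double, h: Double)) -> (w: Double, h: Double) {
    // The smaller ratio is the one that keeps both dimensions inside the box.
    let scale = min(box.w / image.w, box.h / image.h)
    return (image.w * scale, image.h * scale)
}
```

Note that this math only takes effect if the view actually draws the image; a CIImage-backed UIImage (from `imageWithCIImage:`) is not always renderable by UIImageView, and rendering through a CIContext into a CGImage first is the more reliable path.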

Using a CIImage from CIColor in a CIFilter: getting empty image

Submitted by 不想你离开。 on 2019-12-03 19:40:34
I'm trying to create a CIFilter with a blend mode (like overlay or multiply). The relevant code:

```objc
// Let's try a filter here
// Get the data
NSData *imageData = UIImageJPEGRepresentation(image, 0.85);
// Create a CIImage
CIImage *beginImage = [CIImage imageWithData:imageData];
CIImage *overlay = [CIImage imageWithColor:[CIColor colorWithRed:0.7 green:0.75 blue:0.9 alpha:0.75]];
// Create a context
CIContext *context = [CIContext contextWithOptions:nil];
// Create filter
CIFilter *filter = [CIFilter […]
```
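One pitfall specific to this setup: an image made with `imageWithColor:` has an infinite extent, so the blended result must be cropped to the base image's extent before rendering, otherwise the output can come back empty. As for what compositing a semi-transparent color does per pixel, one ingredient of such blends is the standard source-over rule, which can be sketched in pure Swift (straight, non-premultiplied alpha assumed):

```swift
// Composite a semi-transparent overlay value over a background channel
// value using the source-over rule: out = src*a + dst*(1 - a).
func sourceOver(overlay: Double, alpha: Double, background: Double) -> Double {
    precondition((0.0...1.0).contains(alpha))
    return overlay * alpha + background * (1.0 - alpha)
}
```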

Recording videos with real-time filters in Swift

Submitted by 随声附和 on 2019-12-03 16:34:36
I am new to Swift and trying to build a camera app which can apply real-time filters and save the video with the filters applied. So far I can preview in real time with the filters applied, but when I save the video, it is all black.

```swift
import UIKit
import AVFoundation
import AssetsLibrary
import CoreMedia
import Photos

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    var captureSession: AVCaptureSession!
    @IBOutlet weak var previewView: UIView!
    @IBOutlet weak var recordButtton: UIButton!
    @IBOutlet weak var imageView: UIImageView!
    var assetWriter: AVAssetWriter?
    var […]
```