core-image

Unable to Decrease CIVignette effect using UISlider

六月ゝ 毕业季﹏ submitted on 2019-12-06 14:40:10
I am using the CIVignette effect in my photo-editing app. It is applied from the slider's value-changed event. With the following code, the vignette appears when I increase the slider's value, but it does not weaken when I decrease the value. Please help.

@IBAction func slider(_ sender: UISlider) {
    let startImage = CIImage(image: imgEdited!)!
    let vignetteFilter = CIFilter(name: "CIVignette")!
    let radius = 5
    vignetteFilter.setValue(startImage, forKey: kCIInputImageKey)
    vignetteFilter.setValue(sender.value, forKey: "inputIntensity")
    vignetteFilter
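The usual culprit behind "increases but never decreases" is filtering an already-filtered image, or never rendering the new output back to the view. A minimal sketch that always filters the untouched original and renders a fresh result; `originalImage` and `imageView` are assumed names, not from the post:

import UIKit
import CoreImage

class EditorViewController: UIViewController {
    @IBOutlet var imageView: UIImageView!
    var originalImage: UIImage!        // the unfiltered source (assumed)
    let context = CIContext()          // reuse one CIContext; creating one per event is slow

    @IBAction func sliderChanged(_ sender: UISlider) {
        guard let input = CIImage(image: originalImage) else { return }
        let filter = CIFilter(name: "CIVignette")!
        filter.setValue(input, forKey: kCIInputImageKey)
        filter.setValue(sender.value, forKey: kCIInputIntensityKey)
        filter.setValue(1.5, forKey: kCIInputRadiusKey)

        guard let output = filter.outputImage,
              let cgImage = context.createCGImage(output, from: input.extent) else { return }
        imageView.image = UIImage(cgImage: cgImage)  // always derived from the original
    }
}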

Color Balance with Core Image

孤街醉人 submitted on 2019-12-06 14:06:40
Question: I'm trying to recreate a "filter" from Photoshop with Core Image. I have the easier adjustments down, like exposure, vibrance, and tone curve, but I'm not sure how to replicate a color balance with separate shadows, midtones, and highlights. I've tried CIColorMatrix, but it doesn't adjust the colors of the respective shadows/midtones/highlights the same way. CIHighlightShadowAdjust also does not produce the same color effect as Photoshop's color balance. What can I use to replicate this Photoshop color
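Photoshop's color balance behaves like per-channel shifts weighted by luminance bands, which no single built-in CIFilter reproduces. A rough sketch with a custom CIColorKernel; the smoothstep luminance weights are my assumption, not Photoshop's exact math:

import CoreImage

// Approximate color balance: weight per-channel shifts by how dark or
// bright each pixel is. Not Photoshop's exact formula.
let source = """
kernel vec4 colorBalance(__sample s, vec3 shadows, vec3 midtones, vec3 highlights) {
    float lum = dot(s.rgb, vec3(0.299, 0.587, 0.114));
    float sw = 1.0 - smoothstep(0.0, 0.5, lum);  // weight for dark pixels
    float hw = smoothstep(0.5, 1.0, lum);        // weight for bright pixels
    float mw = 1.0 - sw - hw;                    // remainder goes to midtones
    vec3 rgb = s.rgb + sw * shadows + mw * midtones + hw * highlights;
    return vec4(clamp(rgb, 0.0, 1.0), s.a);
}
"""

func applyColorBalance(to image: CIImage,
                       shadows: CIVector, midtones: CIVector, highlights: CIVector) -> CIImage? {
    guard let kernel = CIColorKernel(source: source) else { return nil }
    return kernel.apply(extent: image.extent,
                        arguments: [image, shadows, midtones, highlights])
}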

CIImage drawing EXC_BAD_ACCESS

北城以北 submitted on 2019-12-06 13:42:45
So, I have a CIImage that I'm attempting to draw in an NSView's -drawRect: method. This is the line of code I call to draw the image:

[outputCoreImage drawInRect: [self bounds] fromRect: originalBounds operation: NSCompositeSourceOver fraction: 1];

outputCoreImage, originalBounds, and [self bounds] are all non-nil and indeed hold their expected values. On Lion (OS X 10.7) this worked fine, but on Mountain Lion (OS X 10.8) I receive an EXC_BAD_ACCESS on this line. If I walk up the stack, I find that the internal call that breaks is CGLGetPixelFormat. frame
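Since the crash is inside CI's implicit GL context lookup, one workaround is to draw through an explicit CIContext created from the view's current Core Graphics context rather than CIImage's drawInRect convenience. A sketch in Swift; note NSGraphicsContext.current?.cgContext is the modern accessor (10.10+), the 10.8-era equivalent was graphicsPort:

import Cocoa
import CoreImage

// Draw a CIImage via an explicit CIContext bound to the view's CGContext.
// `outputCoreImage` stands in for the image from the question.
class ImageView: NSView {
    var outputCoreImage: CIImage?

    override func draw(_ dirtyRect: NSRect) {
        guard let image = outputCoreImage,
              let cgContext = NSGraphicsContext.current?.cgContext else { return }
        let ciContext = CIContext(cgContext: cgContext, options: nil)
        ciContext.draw(image, in: bounds, from: image.extent)
    }
}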

Convenience initialiser of CIFilter is giving a strange exception

感情迁移 submitted on 2019-12-06 13:40:30
Question: Here is the code I am trying:

typealias Parameters = Dictionary<String, AnyObject>
extension CIFilter {
    convenience init(name: String, parameters: Parameters) {
        self.init(name: "CIGloom")
        setDefaults()
        for (key, value: AnyObject) in parameters {
            setValue(value, forKey: key)
        }
    }
    var outPutImage: CIImage {
        return self.valueForKey(kCIOutputImageKey) as CIImage
    }
}

The exception occurs during self.init(name: "CIGloom"); I tried a different filter name, but the result is the same. 2014-11-11 15:08
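One likely explanation is that CIFilter(name:) behaves like a class factory that returns a private concrete subclass, which does not mix well with a Swift convenience initializer delegating to it. A sketch that sidesteps the extension entirely by using CIFilter's own parameterized initializer (the filter and parameter values here are just examples):

import CoreImage

// Let CIFilter set the input parameters itself instead of delegating
// to the factory-style CIFilter(name:) from a convenience init.
func makeFilter(name: String, parameters: [String: Any]) -> CIFilter? {
    return CIFilter(name: name, parameters: parameters)
}

let gloom = makeFilter(name: "CIGloom",
                       parameters: [kCIInputRadiusKey: 10.0,
                                    kCIInputIntensityKey: 0.75])
let output = gloom?.outputImage   // CIFilter already exposes outputImage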

How to apply HSB color filters to UIImage

旧街凉风 submitted on 2019-12-06 13:14:23
I've been struggling for a few days with a project on UIImage colorization. The idea is that the app will ship with a set of images that I have to colorize with values retrieved from a web service (some sort of themes, if you like). The designer I work with gave me a background image along with all of his Photoshop values. The first problem is that Photoshop uses HSL while iOS uses HSB, so the first challenge was translating the values from Photoshop. Photoshop HSL: -28 (range -180 => +180), 100 (range -100 => +100), 25 (range -100 => +100). Luckily I found some code online; here it is. //adapted from
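For reference, the standard HSL to HSB (HSV) conversion keeps hue unchanged and derives brightness and saturation from lightness. A minimal sketch; the slider normalization at the end is my assumption, since Photoshop's -180/+180 and -100/+100 values are relative adjustments whose intended mapping depends on the designer:

import CoreGraphics

// HSL -> HSB: B = L + S·min(L, 1-L); S_hsb = 2(1 - L/B). Inputs in 0...1.
func hsbFromHSL(hue: CGFloat, saturation s: CGFloat, lightness l: CGFloat)
    -> (hue: CGFloat, saturation: CGFloat, brightness: CGFloat) {
    let b = l + s * min(l, 1 - l)
    let sat: CGFloat = b == 0 ? 0 : 2 * (1 - l / b)
    return (hue, sat, b)
}

// Example normalization for the question's values (-28, 100, 25),
// treating the offset ranges as absolute positions (an assumption):
let hue = (CGFloat(-28) + 180) / 360   // -180...180 -> 0...1
let sat = (CGFloat(100) + 100) / 200   // -100...100 -> 0...1
let light = (CGFloat(25) + 100) / 200  // -100...100 -> 0...1
let hsb = hsbFromHSL(hue: hue, saturation: sat, lightness: light)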

How to detect hardware acceleration for Core Image?

旧巷老猫 submitted on 2019-12-06 12:47:55
Question: In my app I'm using a "CIMotionBlur" CIFilter during a CALayer animation. The problem is that the filter does not work properly when hardware acceleration is unavailable:
- In OS X Safe Mode the layer becomes invisible during the animation;
- Under VMware Fusion the animation is unbearably slow, which makes testing the app harder;
- The animation works fine without the filter.
I'd like to apply the filter only when hardware acceleration is available. What's the highest-level API that would let me know when
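One option is querying CGL renderer info for the accelerated flag; a much simpler modern proxy, offered here as an assumption rather than the thread's answer, is asking Metal for a device, since Metal only vends GPU-backed devices:

import Metal

// Heuristic, not an official "is Core Image accelerated" API: a nil Metal
// device (e.g. Safe Mode, or a VM without GPU support) suggests falling
// back to the animation without the filter.
func hardwareAccelerationLikelyAvailable() -> Bool {
    return MTLCreateSystemDefaultDevice() != nil
}

let useMotionBlur = hardwareAccelerationLikelyAvailable()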

Core Image CIPerspectiveTransform Filter: How to use CIVectors?

こ雲淡風輕ζ submitted on 2019-12-06 12:47:54
Question: I am having a very hard time finding any documentation online that clearly explains how to use Core Image's CIPerspectiveTransform filter properly. In particular, when setting CIVector values for inputTopLeft, inputTopRight, inputBottomRight, and inputBottomLeft, what are these vectors doing to the image? (I.e., what is the math behind how these vectors warp my image?) Currently this is the code I am using. It doesn't crash, but it doesn't show an image:

CIImage *myCIImage = [
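Each vector is the destination position, in Core Image's coordinate space (origin at the bottom-left, units in pixels), that the corresponding source corner is mapped to; Core Image derives the projective (homography) transform carrying the four corners to those points. A minimal sketch in Swift; the corner offsets are arbitrary example values:

import CoreImage

// Map each corner of the image to a new position; pulling the top corners
// inward produces a "lean back" perspective effect.
func perspectiveTransformed(_ image: CIImage) -> CIImage? {
    let filter = CIFilter(name: "CIPerspectiveTransform")!
    let extent = image.extent
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(CIVector(x: extent.minX + 100, y: extent.maxY), forKey: "inputTopLeft")
    filter.setValue(CIVector(x: extent.maxX - 100, y: extent.maxY), forKey: "inputTopRight")
    filter.setValue(CIVector(x: extent.maxX, y: extent.minY), forKey: "inputBottomRight")
    filter.setValue(CIVector(x: extent.minX, y: extent.minY), forKey: "inputBottomLeft")
    return filter.outputImage
}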

Making CIContext.render(CIImage, CVPixelBuffer) work with AVAssetWriter

為{幸葍}努か submitted on 2019-12-06 07:47:34
Question: I want to use Core Image to process a bunch of CGImage objects and turn them into a QuickTime movie on macOS. The following code demonstrates what's needed, but the output contains a lot of blank (black) frames:

import AppKit
import AVFoundation
import CoreGraphics
import Foundation
import CoreVideo
import Metal

// Video output url.
let url: URL = try! FileManager.default.url(for: .downloadsDirectory, in: .userDomainMask, appropriateFor: nil, create: false).appendingPathComponent("av
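Blank frames in this setup usually come from appending before the writer input is ready for more data, or reusing a pool buffer that was never rendered into. A sketch of a write loop that waits for readiness and renders each frame into a fresh buffer from the adaptor's pool; the surrounding writer/input/adaptor setup is assumed to match the question:

import Foundation
import AVFoundation
import CoreImage

// Render one CIImage per frame into a pool buffer and append it only when
// the input can accept data. `frames`, `input`, and `adaptor` are assumed
// to be configured elsewhere, as in the question.
func write(frames: [CIImage],
           input: AVAssetWriterInput,
           adaptor: AVAssetWriterInputPixelBufferAdaptor,
           context: CIContext,
           fps: Int32) {
    for (index, frame) in frames.enumerated() {
        // Spin until ready (production code should use requestMediaDataWhenReady).
        while !input.isReadyForMoreMediaData { usleep(1000) }

        guard let pool = adaptor.pixelBufferPool else { return }
        var pixelBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &pixelBuffer)
        guard let buffer = pixelBuffer else { return }

        context.render(frame, to: buffer)  // draw the frame into the fresh buffer
        let time = CMTime(value: CMTimeValue(index), timescale: fps)
        if !adaptor.append(buffer, withPresentationTime: time) { return }
    }
}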

Save image with the correct orientation - Swift & Core Image

ⅰ亾dé卋堺 submitted on 2019-12-06 07:32:43
Question: I'm using Core Image in Swift for editing photos, and I have a problem when I save the photo: it is not saved with the correct orientation. When I get the picture from the Photo Library, I store its orientation in a variable as UIImageOrientation, but I don't know how to set it back before saving the edited photo to the Photo Library. Any ideas how? Saving the orientation:

var orientation: UIImageOrientation = .Up
orientation = gotImage.imageOrientation

Saving the edited photo to the Photo
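A CGImage rendered from a CIContext carries no orientation of its own, so one fix is to reattach the stored orientation when wrapping the result back into a UIImage. A sketch, assuming `filteredImage` is the filter's output and `orientation` was captured as above:

import UIKit
import CoreImage

// Restore the original orientation when converting the rendered CGImage
// back into a UIImage before saving.
func save(filteredImage: CIImage, orientation: UIImage.Orientation) {
    let context = CIContext()
    guard let cgImage = context.createCGImage(filteredImage, from: filteredImage.extent) else { return }
    let result = UIImage(cgImage: cgImage,
                         scale: UIScreen.main.scale,
                         orientation: orientation)  // reattach the stored orientation
    UIImageWriteToSavedPhotosAlbum(result, nil, nil, nil)
}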

CIAreaHistogram gives me all 0s except the last element?

☆樱花仙子☆ submitted on 2019-12-06 05:57:52
Question: I want to calculate the histogram of an NSImage, so naturally I turned to CIFilter. There's a filter named CIAreaHistogram that does what I want. Here's my code:

NSBitmapImageRep *rep = [image bitmapImageRepresentation];
CIImage* hImage = nil;
@autoreleasepool {
    CIImage *input = [[CIImage alloc] initWithBitmapImageRep:rep];
    CIFilter *histogramFilter = [CIFilter filterWithName:@"CIAreaHistogram"];
    [histogramFilter setDefaults];
    [histogramFilter setValue:input forKey:kCIInputImageKey];
    [histogramFilter
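CIAreaHistogram's output is a 256x1 image whose bin values are small floats; reading it back through an 8-bit bitmap rounds nearly everything down to zero, which matches the "all zeros except the last element" symptom. A sketch, in Swift, that renders into a float buffer instead; the inputScale value of 1.0 is an assumption:

import CoreImage

// Read the 256x1 histogram output as RGBA floats rather than 8-bit bytes.
func histogram(of image: CIImage, context: CIContext) -> [Float]? {
    let filter = CIFilter(name: "CIAreaHistogram")!
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(CIVector(cgRect: image.extent), forKey: "inputExtent")
    filter.setValue(256, forKey: "inputCount")
    filter.setValue(1.0, forKey: "inputScale")
    guard let output = filter.outputImage else { return nil }

    var bins = [Float](repeating: 0, count: 256 * 4)  // RGBA per bin
    bins.withUnsafeMutableBytes { buffer in
        context.render(output,
                       toBitmap: buffer.baseAddress!,
                       rowBytes: 256 * 4 * MemoryLayout<Float>.size,
                       bounds: CGRect(x: 0, y: 0, width: 256, height: 1),
                       format: .RGBAf,
                       colorSpace: nil)
    }
    return bins
}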