core-image

“Performing a costly unpadding operation!” — what is it, and how to fix it?

℡╲_俬逩灬. submitted on 2019-12-10 05:21:20
Question: The debug console for my Core Filters test application is showing this message: CGImageRef 0x7a0e890 has row byte padding. Performing a costly unpadding operation! I couldn't find a hit for that exact message (minus the pointer info) in the headers or in a Google search. My questions are: (1) what does that mean, and (2) how can I rectify the situation? The following is an example of how I am generating a filtered UIImage using a CIFilter. - (UIImage*)sepia { CIImage *beginImage = [CIImage
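The warning means the CGImage's bytesPerRow is larger than width × bytes-per-pixel, so Core Image must copy the pixels into a tightly packed buffer before filtering. A common workaround (a sketch, not from the original question; the helper name is hypothetical) is to redraw the source into a bitmap context with no row padding before building the CIImage:

```objc
// Hypothetical helper: redraw a CGImage into a bitmap context whose
// bytesPerRow is exactly width * 4, so Core Image needn't unpad it.
- (CGImageRef)newTightlyPackedImageFromImage:(CGImageRef)image {
    size_t width  = CGImageGetWidth(image);
    size_t height = CGImageGetHeight(image);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(NULL, width, height,
                                             8,          // bits per component
                                             width * 4,  // no row padding
                                             colorSpace,
                                             kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), image);
    CGImageRef packed = CGBitmapContextCreateImage(ctx);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
    return packed; // caller is responsible for CGImageRelease
}
```

Feeding the packed CGImage to [CIImage imageWithCGImage:] should silence the warning, at the cost of one up-front copy instead of one per render.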

Core Image filter CISourceOverCompositing not appearing as expected with alpha overlay

依然范特西╮ submitted on 2019-12-10 04:19:32
Question: I'm using CISourceOverCompositing to overlay text on top of an image, and I'm getting unexpected results when the text image is not fully opaque. Dark colors are not dark enough and light colors are too light in the output image. I recreated the issue in a simple Xcode project. It creates an image with orange, white, and black text drawn at 0.3 alpha, and that looks correct. I even threw that image into Sketch, placing it on top of the background image, and it looks great. The image at the bottom

Using transition CIFilters for CATransition

China☆狼群 submitted on 2019-12-10 03:39:50
Question: I'm trying to use CATransition's filter property with the new iOS 6 transition animations (CIBarsSwipeTransition, CICopyMachineTransition, etc.). The CIFilter documentation says that these are available on iOS 6, and nothing in the CATransition documentation says that the filter property cannot be used. But I can't seem to get them to work. I don't know if Apple just failed to mention the unavailability of the functionality or I'm just missing something to make it work. Here's how I set it

Creating UIImage from CIImage

末鹿安然 submitted on 2019-12-10 02:19:23
Question: I am using some Core Image filters to process an image. Applying the filter to my input image results in an output image called filterOutputImage of type CIImage. I now wish to display that image, and tried doing: self.modifiedPhoto = [UIImage imageWithCIImage:filterOutputImage]; self.photoImageView.image = self.modifiedPhoto; The view, however, is blank - nothing is being displayed. If I add logging statements that print out details about both filterOutputImage and self.modifiedPhoto, those
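The usual fix (a hedged sketch, not taken from the original thread): a UIImage created with imageWithCIImage: has no bitmap backing, so a plain UIImageView often draws nothing. Rendering through a CIContext first produces a real CGImage-backed UIImage. This assumes filterOutputImage has a finite extent:

```objc
// Render the CIImage into a CGImage before handing it to UIImageView.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:filterOutputImage
                                   fromRect:[filterOutputImage extent]];
self.modifiedPhoto = [UIImage imageWithCGImage:cgImage];
self.photoImageView.image = self.modifiedPhoto;
CGImageRelease(cgImage); // UIImage retains its own copy
```

Creating the CIContext is relatively expensive, so for repeated filtering it is worth caching it rather than rebuilding it per frame.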

CITemperatureAndTint for image in iOS

六眼飞鱼酱① submitted on 2019-12-09 15:46:51
Question: Is there any sample code or example for CITemperatureAndTint? I have read its documentation but I need some example to implement it.

Answer 1:

CIFilter *yourFilter = [CIFilter filterWithName:@"CITemperatureAndTint"];
[yourFilter setValue:yourInputImage forKey:@"inputImage"];
[yourFilter setValue:[CIVector vectorWithX:6500 Y:500] forKey:@"inputNeutral"]; // Default value: [6500, 0] Identity: [6500, 0]
[yourFilter setValue:[CIVector vectorWithX:1000 Y:630] forKey:@"inputTargetNeutral"]; // Default
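Putting the answer into a complete pipeline, a minimal end-to-end sketch (names like someUIImage are placeholders; the neutral/target values here simply shift the white point warmer and are illustrative, not the answer's exact numbers):

```objc
// Build a CIImage, apply CITemperatureAndTint, and render the result.
CIImage *inputImage = [CIImage imageWithCGImage:someUIImage.CGImage];

CIFilter *tempTint = [CIFilter filterWithName:@"CITemperatureAndTint"];
[tempTint setValue:inputImage forKey:kCIInputImageKey];
// X is color temperature in kelvin, Y is tint; [6500, 0] is the identity.
[tempTint setValue:[CIVector vectorWithX:6500 Y:0] forKey:@"inputNeutral"];
[tempTint setValue:[CIVector vectorWithX:4500 Y:0] forKey:@"inputTargetNeutral"]; // warmer

CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cg = [context createCGImage:tempTint.outputImage
                              fromRect:[tempTint.outputImage extent]];
UIImage *result = [UIImage imageWithCGImage:cg];
CGImageRelease(cg);
```

The filter remaps colors so that the source white point (inputNeutral) becomes the target white point (inputTargetNeutral).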

setting UIImageView content mode after applying a CIFIlter

陌路散爱 submitted on 2019-12-09 14:41:00
Question: Thanks for looking. Here's my code:

CIImage *result = _vignette.outputImage;
self.mainImageView.image = nil;
//self.mainImageView.contentMode = UIViewContentModeScaleAspectFit;
self.mainImageView.image = [UIImage imageWithCIImage:result];
self.mainImageView.contentMode = UIViewContentModeScaleAspectFit;

Here _vignette is a correctly set-up filter, and the image effect is applied to the image correctly. I'm using a source image with resolution 500x375. My imageView has almost iPhone screen's

How can I use CIFilter in iOS?

感情迁移 submitted on 2019-12-09 11:57:41
Question: Apple says that CIFilter is available in iOS. However, on my Mac I couldn't find a CoreImage framework to link against. filter An optional Core Image filter object that provides the transition. @property(retain) CIFilter *filter i.e. when I try to do something like this, it crashes because CIFilter is unknown: [transition setFilter:[CIFilter filterWithName:@"CIShapedWaterRipple"]]; I linked against: #import <UIKit/UIKit.h> #import <QuartzCore/QuartzCore.h> #import <CoreGraphics/CoreGraphics
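On iOS, CIFilter lives in its own CoreImage.framework rather than in QuartzCore as on the Mac, so QuartzCore alone won't resolve the class. A minimal sketch, assuming the target links CoreImage.framework (available since iOS 5):

```objc
#import <CoreImage/CoreImage.h>

// Use a filter that is actually present on iOS; CIShapedWaterRipple
// appears to be a Mac-only filter, so filterWithName: would return nil
// for it on the device even with CoreImage linked.
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setValue:@0.8 forKey:kCIInputIntensityKey];
```

A quick way to check what a given iOS version supports is [CIFilter filterNamesInCategory:kCICategoryBuiltIn], which lists every built-in filter at runtime.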

iOS6 : How to use the conversion feature of YUV to RGB from cvPixelBufferref to CIImage

旧巷老猫 submitted on 2019-12-09 10:03:40
Question: Since iOS 6, Apple has provided native YUV-to-CIImage support through the call initWithCVPixelBuffer:options: In the Core Image Programming Guide, they mention this feature: Take advantage of the support for YUV images in iOS 6.0 and later. Camera pixel buffers are natively YUV, but most image processing algorithms expect RGBA data. There is a cost to converting between the two. Core Image supports reading YUV from CVPixelBuffer objects and applying the appropriate color
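A minimal capture-delegate sketch showing the call the question refers to (this assumes an AVCaptureVideoDataOutput whose videoSettings request a 420 YpCbCr biplanar pixel format; the delegate method is the standard AVFoundation callback):

```objc
// Wrap the camera's native YUV buffer directly in a CIImage -
// no manual YUV->RGB conversion needed on iOS 6+.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer
                                                    options:nil];
    // ... feed `image` into a CIFilter chain and render with a CIContext ...
}
```

Core Image applies the color conversion internally when the image is rendered, which is what lets it avoid an eager full-frame YUV-to-RGBA copy.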

iOS face detector orientation and setting of CIImage orientation

好久不见. submitted on 2019-12-09 07:00:14
Question: EDIT: found this code that helped with front camera images: http://blog.logichigh.com/2008/06/05/uiimage-fix/ Hope others have had a similar issue and can help me out. Haven't found a solution yet. (It may seem a bit long, but it's just a bunch of helper code.) I'm using the iOS face detector on images acquired from the camera (front and back) as well as images from the gallery (I'm using the UIImagePicker - for both image capture by camera and image selection from the gallery - not using AVFoundation

Core Image CIColorControls brightness filter creates wrong effect. How do I change my image's luminance?

偶尔善良 submitted on 2019-12-08 22:50:58
Question: I'm creating a color picker for iOS. I would like to enable the user to select the brightness (luminance) and have the color wheel reflect this change. I'm using Core Image to modify the brightness with the CIColorControls filter. Here's my code:

-(CIImage *)oldPhoto:(CIImage *)img withBrightness:(float)intensity {
    CIFilter *lighten = [CIFilter filterWithName:@"CIColorControls"];
    [lighten setValue:img forKey:kCIInputImageKey];
    [lighten setValue:@((intensity * 2.0) - 1.0) forKey:@
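For reference, a completed sketch of the truncated method above (the 0..1 → -1..1 mapping of intensity is the asker's choice, kept as-is). Note that CIColorControls' brightness adds a constant offset to every channel, which tends to wash colors toward white rather than scaling their luminance - that mismatch is likely the "wrong effect" in the title:

```objc
// Completed version of the method in the question.
// kCIInputBrightnessKey accepts roughly -1 (black) .. 1 (white); 0 is identity.
- (CIImage *)oldPhoto:(CIImage *)img withBrightness:(float)intensity {
    CIFilter *lighten = [CIFilter filterWithName:@"CIColorControls"];
    [lighten setValue:img forKey:kCIInputImageKey];
    [lighten setValue:@((intensity * 2.0) - 1.0) forKey:kCIInputBrightnessKey];
    return lighten.outputImage;
}
```

To scale luminance instead of biasing it, one option (an assumption, not from the question) is a multiplicative filter such as CIColorMatrix with equal R/G/B vector scales, which darkens and lightens proportionally.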