core-image

How to free memory under ARC for a high-memory-usage graphics render?

Submitted by 落花浮王杯 on 2019-12-06 05:50:05
Question: First off, thank you to everyone on this site... it's been INCREDIBLY helpful in getting into the grit of iOS programming. My current issue: I have an app that renders a very stylized version of a photo. It uses some Core Image filters for some of it, but needs a bunch of Core Graphics to get the heavy image processing done. The proxy-size renders work out great, but when I render a full-resolution version of my image, it sometimes crashes because of high memory usage. The problem is that I need
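One common way to keep the full-resolution pass from spiking memory is to render the final CIImage in strips, flushing Core Image's intermediates inside an @autoreleasepool and compositing each strip into a single CGBitmapContext. A minimal sketch, assuming the finished filter chain is a CIImage named fullResolutionImage (a hypothetical name) whose extent starts at (0, 0), and a 512-pixel strip height:

```objc
CIContext *ciContext = [CIContext contextWithOptions:nil];
CGRect full = fullResolutionImage.extent;                // assumes origin (0, 0)
size_t pixelsWide = (size_t)CGRectGetWidth(full);
size_t pixelsHigh = (size_t)CGRectGetHeight(full);

CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef bitmap = CGBitmapContextCreate(NULL, pixelsWide, pixelsHigh, 8, 0, colorSpace,
                                            (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(colorSpace);

CGFloat tileHeight = 512.0;   // smaller strips keep Core Image's working set smaller
for (CGFloat y = 0; y < pixelsHigh; y += tileHeight) {
    @autoreleasepool {
        CGRect tile = CGRectMake(0, y, pixelsWide, MIN(tileHeight, pixelsHigh - y));
        CGImageRef cgTile = [ciContext createCGImage:fullResolutionImage fromRect:tile];
        // CGBitmapContext and CIImage both use a bottom-left origin, so the source
        // rect doubles as the destination rect.
        CGContextDrawImage(bitmap, tile, cgTile);
        CGImageRelease(cgTile);
    }
}

CGImageRef assembledRef = CGBitmapContextCreateImage(bitmap);
UIImage *fullResResult = [UIImage imageWithCGImage:assembledRef];
CGImageRelease(assembledRef);
CGContextRelease(bitmap);
```

The full output bitmap still has to exist once, but the per-strip rendering keeps the filter chain's intermediate buffers bounded to one strip at a time.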

Color-Specific Hue/Saturation from Photoshop to iOS

Submitted by 血红的双手。 on 2019-12-06 05:15:00
I'm trying to use GPUImage and CIFilter to map this filter. Please note, I need help mapping the color-specific (Reds) adjustment (note: NOT Master, just Reds) Photoshop element to iOS. Does anyone know how to manipulate a CIFilter or GPUImage class to get the Photoshop effect below in iOS? You could use GPUImage with the lookup filter: GPUImageLookupFilter uses an RGB color lookup image to remap the colors in an image. First, use your favourite photo editing application to apply a filter to lookup.png from GPUImage/framework/Resources. For this to work properly, each pixel color must not depend on other
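A minimal sketch of that lookup-table workflow with GPUImage, assuming the edited table was exported as lookup_reds.png and the source photo is photo.jpg (both hypothetical file names):

```objc
#import <GPUImage/GPUImage.h>

UIImage *sourceImage = [UIImage imageNamed:@"photo.jpg"];        // assumption
UIImage *lookupImage = [UIImage imageNamed:@"lookup_reds.png"];  // assumption: edited copy of lookup.png

GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:sourceImage];
GPUImagePicture *lookup = [[GPUImagePicture alloc] initWithImage:lookupImage];

GPUImageLookupFilter *lookupFilter = [[GPUImageLookupFilter alloc] init];
[source addTarget:lookupFilter];   // first input: the photo
[lookup addTarget:lookupFilter];   // second input: the lookup table

[lookupFilter useNextFrameForImageCapture];
[source processImage];
[lookup processImage];

UIImage *filtered = [lookupFilter imageFromCurrentFramebuffer];
```

Whatever you did to lookup.png in Photoshop (here, the Reds-only Hue/Saturation move) is then replayed on every frame or photo that goes through the filter.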

Multiple CIFilters in one CIImage?

Submitted by 那年仲夏 on 2019-12-06 05:07:47
Question: I have two CIFilters, exposure and hue. I need to combine the filters over one UIImage. How should I go about this? Below is some code that I have so far... CIFilter *hueFilter; CIFilter *exposureFilter; CIImage *adjustedImage; hueFilter = [CIFilter filterWithName:@"CIHueAdjust"]; exposureFilter = [CIFilter filterWithName:@"CIExposureAdjust"]; [hueFilter setValue:[NSNumber numberWithFloat:5] forKey: @"inputAngle"]; [exposureFilter setValue:[NSNumber numberWithFloat:5] forKey: @"inputEV"];
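A minimal sketch of chaining the two filters, which is the usual pattern: the hue filter's outputImage becomes the exposure filter's inputImage, and only the final image is rendered. The source UIImage name is an assumption:

```objc
CIImage *inputImage = [[CIImage alloc] initWithImage:sourceUIImage]; // sourceUIImage is assumed

CIFilter *hueFilter = [CIFilter filterWithName:@"CIHueAdjust"];
[hueFilter setValue:inputImage forKey:@"inputImage"];
[hueFilter setValue:@(1.0f) forKey:@"inputAngle"];

CIFilter *exposureFilter = [CIFilter filterWithName:@"CIExposureAdjust"];
[exposureFilter setValue:hueFilter.outputImage forKey:@"inputImage"]; // chain: hue feeds exposure
[exposureFilter setValue:@(0.5f) forKey:@"inputEV"];

// Render once, at the end of the chain.
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *result = exposureFilter.outputImage;
CGImageRef cgImage = [context createCGImage:result fromRect:result.extent];
UIImage *finalImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
```

Because CIImages are just recipes, nothing is actually computed until createCGImage:fromRect:, so stacking more filters this way stays cheap.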

How to use a CIFilter on the layerClass instance of a UIView?

Submitted by ⅰ亾dé卋堺 on 2019-12-06 04:23:23
Question: My UIView is using an instance of TBPaperLayer for its layer. +(Class)layerClass { return [TBPaperLayer class]; } I would like to create a CIFilter to modify the appearance of this layer, specifically to apply a blur filter to it. How can I use this code to blur a part of this layer? (code from: Blur CALayer's Superlayer) CALayer *blurLayer = [CALayer layer]; CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"]; [blur setDefaults]; blurLayer.backgroundFilters = [NSArray arrayWithObject
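Note that CALayer's backgroundFilters and filters properties are documented as unsupported on iOS, so the macOS snippet above will not take effect there. One workaround (an assumption, not the question's own code) is to snapshot the layer, blur the snapshot with CIGaussianBlur, and show the cropped result in a sublayer; paperLayer and blurRegion (a rect in layer points) are hypothetical names:

```objc
// Snapshot the layer at scale 1 so layer points and image pixels line up.
UIGraphicsBeginImageContextWithOptions(paperLayer.bounds.size, NO, 1.0);
[paperLayer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setValue:[[CIImage alloc] initWithImage:snapshot] forKey:@"inputImage"];
[blur setValue:@8.0 forKey:@"inputRadius"];

// Core Image uses a bottom-left origin, so flip the region before cropping.
CGRect ciRegion = CGRectMake(blurRegion.origin.x,
                             paperLayer.bounds.size.height - CGRectGetMaxY(blurRegion),
                             blurRegion.size.width,
                             blurRegion.size.height);
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef blurredRegion = [context createCGImage:blur.outputImage fromRect:ciRegion];

CALayer *blurLayer = [CALayer layer];
blurLayer.frame = blurRegion;                     // back in layer coordinates
blurLayer.contents = (__bridge id)blurredRegion;
[paperLayer addSublayer:blurLayer];
CGImageRelease(blurredRegion);
```

The snapshot has to be refreshed whenever the layer's content changes, which is the main cost of this approach.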

iOS - Cannot process image using CIFilter

Submitted by 强颜欢笑 on 2019-12-06 04:18:45
Question: I am trying to process an image using Core Image. I have created a UIImage category to do it. I have added the QuartzCore and CoreImage frameworks to the project, imported CoreImage/CoreImage.h, and used this code: CIImage *inputImage = self.CIImage; CIFilter *exposureAdjustmentFilter = [CIFilter filterWithName:@"CIExposureAdjust"]; [exposureAdjustmentFilter setDefaults]; [exposureAdjustmentFilter setValue:inputImage forKey:@"inputImage"]; [exposureAdjustmentFilter setValue:[NSNumber numberWithFloat:5.0f]
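A frequent cause of this producing nothing is that UIImage's CIImage property is nil unless the UIImage was originally created from a CIImage, so self.CIImage gives the filter no input. A minimal sketch of the same exposure adjustment built from the CGImage instead, written as a hypothetical category method:

```objc
- (UIImage *)imageByAdjustingExposure:(CGFloat)ev
{
    // Wrap the bitmap data directly; this works for any UIImage backed by a CGImage.
    CIImage *inputImage = [CIImage imageWithCGImage:self.CGImage];

    CIFilter *exposureFilter = [CIFilter filterWithName:@"CIExposureAdjust"];
    [exposureFilter setValue:inputImage forKey:@"inputImage"];
    [exposureFilter setValue:@(ev) forKey:@"inputEV"];

    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *output = exposureFilter.outputImage;
    CGImageRef cgImage = [context createCGImage:output fromRect:output.extent];

    // Keep the original scale and orientation so the result matches the source.
    UIImage *result = [UIImage imageWithCGImage:cgImage
                                          scale:self.scale
                                    orientation:self.imageOrientation];
    CGImageRelease(cgImage);
    return result;
}
```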

How do I draw onto a CVPixelBufferRef that is planar/ycbcr/420f/yuv/NV12/not rgb?

Submitted by 空扰寡人 on 2019-12-06 04:15:10
Question: I have received a CMSampleBufferRef from a system API that contains CVPixelBufferRefs that are not RGBA (linear pixels). The buffer contains planar pixels (such as 420f, aka kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, aka YCbCr, aka YUV). I would like to do some manipulation of this video data before sending it off to VideoToolbox to be encoded to h264 (drawing some text, overlaying a logo, rotating the image, etc.), but I'd like for it to be efficient and real-time. Buuuut planar
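One approach, sketched below on the assumption that Core Image is fast enough for the use case: CIImage can wrap a 420f biplanar buffer directly, and CIContext can render the composited result back into the same buffer, so the format VideoToolbox expects is preserved. overlayImage and sharedCIContext are hypothetical names; the CIContext should be created once and reused, since creating one per frame is expensive:

```objc
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];      // reads 420f directly
CIImage *composited = [overlayImage imageByCompositingOverImage:frame];

// Render straight back into the 420f buffer: no RGBA round trip before encoding.
[sharedCIContext render:composited toCVPixelBuffer:pixelBuffer];
```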

How to determine whether an image has noise or geometric distortion?

Submitted by 南楼画角 on 2019-12-06 03:29:29
Question: I need to make an iPhone application that calculates noise, geometric deformation, and other distortions in an image. How can I do this? I have done some image processing work with OpenCV on iPhone, but I don't know how to calculate these parameters. 1) How do I calculate noise in an image? 2) What is geometric deformation, and how do I calculate the geometric deformation of an image? 3) Are geometric deformation and distortion the same parameter in terms of image filters? Or any other distortions
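For the noise part of the question, and since OpenCV is already in the project, one practical estimator is Immerkær's fast noise-variance method; geometric distortion, by contrast, can only be quantified against a reference such as a calibration grid or a known lens model. A minimal Objective-C++ sketch of the noise estimate (the file must have a .mm extension):

```objc
// Fast noise-sigma estimate (Immerkær, 1996). Requires the OpenCV framework.
#import <opencv2/opencv.hpp>
#include <cmath>

static double EstimatedNoiseSigma(const cv::Mat &gray)   // expects an 8-bit, single-channel image
{
    CV_Assert(gray.type() == CV_8UC1 && gray.cols > 2 && gray.rows > 2);

    // Difference-of-Laplacians kernel: suppresses image structure, leaves mostly noise.
    cv::Mat kernel = (cv::Mat_<double>(3, 3) <<  1, -2,  1,
                                                -2,  4, -2,
                                                 1, -2,  1);
    cv::Mat response;
    cv::filter2D(gray, response, CV_64F, kernel);

    double absSum = cv::sum(cv::abs(response))[0];
    return absSum * std::sqrt(M_PI / 2.0) / (6.0 * (gray.cols - 2) * (gray.rows - 2));
}
```

Higher returned sigma means more noise; thresholds for "too noisy" have to be picked empirically for the camera and use case.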

CVPixelBuffer to CIImage always returning nil

Submitted by 那年仲夏 on 2019-12-06 02:17:29
Question: I am trying to convert a pixelBuffer extracted from AVPlayerItemVideoOutput to a CIImage but always get nil. The code: if([videoOutput_ hasNewPixelBufferForItemTime:player_.internalPlayer.currentItem.currentTime]) { CVPixelBufferRef pixelBuffer = [videoOutput_ copyPixelBufferForItemTime:player_.internalPlayer.currentItem.currentTime itemTimeForDisplay:nil]; CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer]; // Always image === nil CIFilter *filter = [FilterCollection
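imageWithCVPixelBuffer: can return nil when the buffer's pixel format is one Core Image will not wrap, so a common fix is to ask AVPlayerItemVideoOutput for 32BGRA buffers when it is created. A minimal sketch that mirrors the ivar names in the question; the IOSurface attribute is an assumption, not a requirement:

```objc
NSDictionary *attributes = @{
    (id)kCVPixelBufferPixelFormatTypeKey   : @(kCVPixelFormatType_32BGRA),
    (id)kCVPixelBufferIOSurfacePropertiesKey : @{}   // assumption: keeps buffers GPU-friendly
};
videoOutput_ = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];
[player_.internalPlayer.currentItem addOutput:videoOutput_];

// Buffers vended by copyPixelBufferForItemTime: are now 32BGRA, which
// [CIImage imageWithCVPixelBuffer:] accepts. Remember to CVPixelBufferRelease
// each buffer, since "copy" returns a +1 reference.
```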

iOS 8 Core Image: saving a section of an image via Swift

Submitted by 落爺英雄遲暮 on 2019-12-06 02:10:40
Question: I'm using the Core Image framework for detecting a business card. When I detect a rectangle (CIDetectorTypeRectangle), I draw an overlay using this method: func drawOverlay(image: CIImage, topLeft: CGPoint, topRight: CGPoint, bottomLeft: CGPoint, bottomRight: CGPoint) -> CIImage { var overlay = CIImage(color: CIColor(red: 0, green: 0, blue: 1.0, alpha: 0.3)) overlay = overlay.imageByCroppingToRect(image.extent()) overlay = overlay.imageByApplyingFilter("CIPerspectiveTransformWithExtent",
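Beyond drawing the overlay, the same four corners can be used to pull the de-skewed card out of the frame with CIPerspectiveCorrection. A minimal Objective-C sketch, where feature is assumed to be the CIRectangleFeature returned by the detector and image is the source CIImage:

```objc
CIFilter *correction = [CIFilter filterWithName:@"CIPerspectiveCorrection"];
[correction setValue:image forKey:@"inputImage"];
[correction setValue:[CIVector vectorWithCGPoint:feature.topLeft]     forKey:@"inputTopLeft"];
[correction setValue:[CIVector vectorWithCGPoint:feature.topRight]    forKey:@"inputTopRight"];
[correction setValue:[CIVector vectorWithCGPoint:feature.bottomLeft]  forKey:@"inputBottomLeft"];
[correction setValue:[CIVector vectorWithCGPoint:feature.bottomRight] forKey:@"inputBottomRight"];

CIImage *card = correction.outputImage;   // the flattened business card, ready to render and save
```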

iOS UIImage being rotated after going through CIFilter

Submitted by 只愿长相守 on 2019-12-06 02:07:47
I'm working with filtering images that I'm taking with the camera. I pass the image I get from the camera through the method below, and the returned UIImage is sent to a UIImageView. For some reason, when it passes through this method, the image gets rotated. What am I doing wrong? - (UIImage *) applyFilterToImage:(UIImage *)image withFilter:(NSString *)filterName { beginImage = [[[CIImage alloc] initWithImage:image] autorelease]; context = [CIContext contextWithOptions:nil]; filter = [CIFilter filterWithName:@"CISepiaTone" keysAndValues:kCIInputImageKey, beginImage, @
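The rotation usually comes from the orientation metadata being dropped: CIImage carries no UIImageOrientation, so a UIImage rebuilt from the filtered CGImage defaults to the "up" orientation. A minimal sketch of finishing the method by reattaching the original scale and orientation (variable names follow the question's snippet):

```objc
// Render the filter output and rebuild the UIImage with the source image's metadata.
CIImage *outputImage = [filter outputImage];
CGImageRef cgImage = [context createCGImage:outputImage fromRect:outputImage.extent];
UIImage *filteredImage = [UIImage imageWithCGImage:cgImage
                                             scale:image.scale
                                       orientation:image.imageOrientation];
CGImageRelease(cgImage);
return filteredImage;
```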