core-image

Make a UIImage from a CMSampleBuffer

Submitted by 限于喜欢 on 2019-12-29 03:21:08
Question: This is not the same as the countless questions about converting a CMSampleBuffer to a UIImage. I'm simply wondering why I can't convert it like this:

```objc
CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *imageFromCoreImageLibrary = [CIImage imageWithCVPixelBuffer:pixelBuffer];
UIImage *imageForUI = [UIImage imageWithCIImage:imageFromCoreImageLibrary];
```

It seems a lot simpler because it works for YCbCr color spaces as well as RGBA and …

How to apply cool- and warm-tone CIFilters in iOS?

Submitted by 人盡茶涼 on 2019-12-25 19:57:41
Question: I am working on a video-based application in Swift for iOS, where I use AVPlayer to play video and apply CIFilters to it via AVVideoComposition. I need to apply cool- and warm-tone filter effects to my videos (please see the images below). I have tried all the Core Image filters listed in the Apple documentation below, but none of them looks like a cool- or warm-tone effect. https://developer.apple.com/library/archive/documentation/GraphicsImaging/Reference/CoreImageFilterReference
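One built-in filter worth trying for this effect is CITemperatureAndTint, which shifts the image's white point. A minimal Swift sketch; the 6500 K reference and the example value are illustrative assumptions, and if warm/cool comes out reversed on your content, swap the two vectors and tune by eye:

```swift
import CoreImage

// Shift an image's white point to fake a warm or cool tone.
func toned(_ image: CIImage, neutralTemperature: CGFloat) -> CIImage? {
    guard let filter = CIFilter(name: "CITemperatureAndTint") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    // Treat the source white point as `neutralTemperature` and map it to D65 (6500 K).
    // Values above 6500 tend to push the result warm, values below push it cool.
    filter.setValue(CIVector(x: neutralTemperature, y: 0), forKey: "inputNeutral")
    filter.setValue(CIVector(x: 6500, y: 0), forKey: "inputTargetNeutral")
    return filter.outputImage
}

// Usage (values are guesses to tune by eye):
// let warm = toned(sourceImage, neutralTemperature: 8000)
// let cool = toned(sourceImage, neutralTemperature: 5000)
```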

iOS: How to efficiently blur an image?

Submitted by 被刻印的时光 ゝ on 2019-12-25 03:22:38
Question: I have an image that is stored in RGBA format in memory. I've written a blur routine that works fine for small blur radii, but large ones such as 16 points take forever. Is there an efficient way to blur an image using Core Image, and will using it cause any loss in image quality?

Answer 1: Have you tried the built-in CIGaussianBlur? One thing you could do to improve your performance is the following: use the CILanczosScaleTransform filter to scale down your image, then perform a low-radius blur using …
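The downscale-then-blur trick the answer describes could be sketched like this in Swift. The 0.25 scale factor and the proportional radius scaling are assumptions to adjust for your own quality/speed trade-off:

```swift
import CoreImage

// Blur cheaply: shrink the image, blur at a proportionally smaller radius,
// then scale back up. The upsampling hides most of the resolution loss.
func fastBlur(_ image: CIImage, radius: Double, downscale: Double = 0.25) -> CIImage? {
    guard let scale = CIFilter(name: "CILanczosScaleTransform"),
          let blur = CIFilter(name: "CIGaussianBlur") else { return nil }
    scale.setValue(image, forKey: kCIInputImageKey)
    scale.setValue(downscale, forKey: kCIInputScaleKey)
    blur.setValue(scale.outputImage, forKey: kCIInputImageKey)
    blur.setValue(radius * downscale, forKey: kCIInputRadiusKey)
    // Scale the blurred result back to the original size.
    let up = CGAffineTransform(scaleX: CGFloat(1 / downscale), y: CGFloat(1 / downscale))
    return blur.outputImage?.transformed(by: up)
}
```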

Swift 4, Subclassing CIFilter crashes only with “input” instance variables

Submitted by 拜拜、爱过 on 2019-12-24 12:11:39
Question: How do you subclass CIFilter now? In Swift 3 I could do this as a simple example:

```swift
class CustomFilter: CIFilter {
    var inputImage: CIImage?
    var inputOrigin: CIVector?
    var inputAnotherVar: String?
}
```

But in Swift 4 I get an NSException. If I remove "input" from each variable name it works fine, and I could just do that, but I feel like I'm missing something important and I can't find anything explaining this behaviour. This compiles fine in Swift 4:

```swift
class CustomFilter: CIFilter {
    var image: …
```
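A likely cause: Swift 4 stopped inferring `@objc` for most properties, while CIFilter's machinery uses Key-Value Coding to reach keys with the reserved "input" prefix through the Objective-C runtime. A hedged sketch of one commonly suggested fix, exposing the properties explicitly:

```swift
import CoreImage

class CustomFilter: CIFilter {
    // Marking the "input"-prefixed properties @objc dynamic restores the
    // Objective-C KVC visibility that Swift 4 no longer infers, which is
    // what CIFilter expects for keys named input*.
    @objc dynamic var inputImage: CIImage?
    @objc dynamic var inputOrigin: CIVector?
}
```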

Using CIEdgeWork Filters in iOS

Submitted by 本小妞迷上赌 on 2019-12-24 07:17:38
Question: I am using Core Image filters and trying to use the CIEdgeWork filter. When the filter is applied, the image turns black. Am I initializing the CIFilter correctly?

```objc
CIFilter *edgeWork = [CIFilter filterWithName:@"CIEdgeWork"
                                keysAndValues:kCIInputImageKey, filterPreviewImage,
                                              @"inputRadius", [NSNumber numberWithFloat:3.0],
                                              nil];
```

Answer 1: CIEdgeWork is not available in Core Image on iOS as of iOS 5.x, so it's no surprise that you're seeing a black image when trying to use it. However, you can use the …
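Situations like this can be caught defensively: the built-in filter list reports what the running OS actually ships, and `CIFilter(name:)` returns nil for unknown names. A Swift sketch:

```swift
import CoreImage

// Check availability before relying on a filter that may not exist on this OS.
let builtIn = CIFilter.filterNames(inCategory: kCICategoryBuiltIn)
if builtIn.contains("CIEdgeWork"), let edgeWork = CIFilter(name: "CIEdgeWork") {
    edgeWork.setValue(3.0, forKey: kCIInputRadiusKey)
    // safe to set the input image and render here
} else {
    // fall back to another edge/sketch effect (e.g. a GPUImage filter)
}
```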

CoreImage very high memory usage

Submitted by 一笑奈何 on 2019-12-24 02:23:42
Question: When working with CIImage, my memory usage jumps quite drastically, by 50-60 MB, when creating the CIImage, and my program then crashes when I try to create the resulting UIImage:

```objc
CIImage *image = [CIImage imageWithData:data]; // +50MB memory increase (data is JPEG data)
```

It doesn't happen every time, but it happens more frequently with larger images (3264x2448) and while the app is in background mode, which leads to frequent crashes. Any ideas?

Answer 1: Core Image has severe limits on image size. iOS 8 …
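One way to avoid decoding a full-resolution JPEG into memory is to let ImageIO produce a downsampled bitmap before Core Image ever sees it. A sketch; the 1024-pixel cap is an arbitrary assumption to size for your UI:

```swift
import ImageIO
import Foundation

// Decode `data` into a CGImage no larger than `maxPixel` on its longest side,
// without first materializing the full-resolution bitmap.
func downsampledImage(from data: Data, maxPixel: Int) -> CGImage? {
    let sourceOptions: [CFString: Any] = [kCGImageSourceShouldCache: false]
    guard let source = CGImageSourceCreateWithData(data as CFData,
                                                   sourceOptions as CFDictionary) else { return nil }
    let thumbOptions: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true, // honor EXIF orientation
        kCGImageSourceThumbnailMaxPixelSize: maxPixel
    ]
    return CGImageSourceCreateThumbnailAtIndex(source, 0, thumbOptions as CFDictionary)
}
```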

Applying a CIFilter to a masked portion of an image

Submitted by 别来无恙 on 2019-12-23 22:46:00
Question: I'm looking for a way to apply a CIFilter to a portion of an image defined by a mask. Given an image (source: http://imageshack.us/scaled/landing/213/browserpreviewtmp1p.jpg) and a mask, I apply some variation of:

```objc
CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"
                              keysAndValues:kCIInputImageKey, beginImage,
                                            @"inputSaturation", @0,
                                            @"inputContrast", @1,
                                            @"inputBrightness", @0,
                                            nil];
```

and get this (result: http://imageshack.us/a/img689/5297/browserpreviewtmpd.jpg). How …
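One way to restrict a filter to a masked region is CIBlendWithMask: run the filter over the whole image, then blend the filtered and original versions through the mask. A Swift sketch, assuming a grayscale mask where white selects the filtered result:

```swift
import CoreImage

// Desaturate only the white region of `mask`; leave the rest untouched.
func desaturateMasked(_ image: CIImage, mask: CIImage) -> CIImage? {
    guard let desaturate = CIFilter(name: "CIColorControls"),
          let blend = CIFilter(name: "CIBlendWithMask") else { return nil }
    desaturate.setValue(image, forKey: kCIInputImageKey)
    desaturate.setValue(0.0, forKey: kCIInputSaturationKey)
    blend.setValue(desaturate.outputImage, forKey: kCIInputImageKey) // shown where mask is white
    blend.setValue(image, forKey: kCIInputBackgroundImageKey)        // shown where mask is black
    blend.setValue(mask, forKey: kCIInputMaskImageKey)
    return blend.outputImage
}
```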

How do you run Core Animation and Core Image together on the CPU?

Submitted by 给你一囗甜甜゛ on 2019-12-23 22:25:58
Question: Apple's Core Image Programming Guide, under the section "Getting the Best Performance", says: "Avoid Core Animation animations while rendering CIImage objects with a GPU context. If you need to use both simultaneously, you can set up both to use the CPU." Can anyone explain this statement? Why would it be more efficient to run Core Animation and Core Image together on the CPU rather than on the GPU, and how do you set up Core Animation to run on the CPU?

Answer 1: I had the same question and came …
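For the Core Image half, a CPU-bound context can be requested explicitly with the software-renderer option. Note this is documented as a hint rather than a guarantee; a sketch:

```swift
import CoreImage

// Ask Core Image for a CPU (software) rendering context instead of a GPU one,
// so it does not contend with Core Animation for the GPU.
let cpuContext = CIContext(options: [.useSoftwareRenderer: true])
```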

How do you add a CIPixellate Core Image Filter to a Sprite Kit scene?

Submitted by 北城以北 on 2019-12-23 10:27:48
Question: How do you add a CIPixellate Core Image filter to a Sprite Kit scene? I have a SpriteKit scene that is an SKScene (or a subclass of it), and I want to add a Core Image filter to it, specifically a CIPixellate filter, so I can have 8-bit game heaven for free. How do I do that?

Answer 1: It turns out this is not hard at all. It's just that the Core Image filter docs are old and crufty, and in the case of SpriteKit the docs are flat-out misleading or incomplete, including the SKEffectNode docs. The …
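Since SKScene inherits from SKEffectNode, the scene itself can carry the filter. A minimal Swift sketch; the pixel scale of 8.0 is an arbitrary assumption:

```swift
import SpriteKit
import CoreImage

class PixelScene: SKScene {
    override func didMove(to view: SKView) {
        // SKScene is an SKEffectNode, so it can apply a Core Image
        // filter to everything it renders.
        let pixellate = CIFilter(name: "CIPixellate")
        pixellate?.setValue(8.0, forKey: kCIInputScaleKey)
        filter = pixellate
        shouldEnableEffects = true // effects are off by default
    }
}
```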

Caricature in iOS [closed]

Submitted by 隐身守侯 on 2019-12-23 06:34:08
Question (closed as unclear 6 years ago): I am trying to apply a caricature effect to photos in iOS. I googled many things but found very little. I have checked https://github.com/BradLarson/GPUImage to get a sketch of an image, so that I …