core-image

Applying filter to real time camera preview - Swift

孤街浪徒 submitted on 2019-12-04 14:05:56

Question: I'm trying to follow the answer given here: https://stackoverflow.com/a/32381052/8422218 to create an app which uses the back-facing camera, adds a filter, and displays the result on the screen in real time. Here is my code:

```swift
//
//  ViewController.swift
//  CameraFilter
//

import UIKit
import AVFoundation

class ViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    var captureSession = AVCaptureSession()
    var backCamera: AVCaptureDevice?
    var frontCamera: AVCaptureDevice?
    var
```
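The usual shape of the answer linked above is a `captureOutput` delegate method that wraps each frame in a `CIImage`, runs it through a filter, and pushes the result to the screen. A minimal sketch follows; `ciContext`, `imageView`, and the choice of `CIComicEffect` are assumptions, not part of the original code:

```swift
import UIKit
import AVFoundation
import CoreImage

extension ViewController {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let cameraImage = CIImage(cvPixelBuffer: pixelBuffer)

        // Any CIFilter works here; CIComicEffect is just an example.
        guard let filter = CIFilter(name: "CIComicEffect") else { return }
        filter.setValue(cameraImage, forKey: kCIInputImageKey)
        guard let result = filter.outputImage else { return }

        // `ciContext` is assumed to be a long-lived property; creating a
        // CIContext per frame is far too slow for a live preview.
        guard let cgImage = self.ciContext.createCGImage(result, from: result.extent) else { return }
        DispatchQueue.main.async {
            // `imageView` is a hypothetical UIImageView showing the preview.
            self.imageView.image = UIImage(cgImage: cgImage)
        }
    }
}
```

For better throughput, rendering the `CIImage` straight into an `MTKView` avoids the `CGImage` round-trip entirely.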

Making CIContext.render(CIImage, CVPixelBuffer) work with AVAssetWriter

五迷三道 submitted on 2019-12-04 12:44:10

I want to use Core Image to process a bunch of CGImage objects and turn them into a QuickTime movie on macOS. The following code demonstrates what's needed, but the output contains a lot of blank (black) frames:

```swift
import AppKit
import AVFoundation
import CoreGraphics
import Foundation
import CoreVideo
import Metal

// Video output URL.
let url: URL = try! FileManager.default
    .url(for: .downloadsDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
    .appendingPathComponent("av.mov")
try? FileManager.default.removeItem(at: url)

// Video frame size, total frame count, frame rate
```
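A common cause of black frames in this setup is appending a pool-backed pixel buffer before (or without) rendering into it, or appending while the writer input is not ready. A hedged sketch of the safe ordering — `adaptor`, `context`, and `frameTime` are assumed to be wired up elsewhere:

```swift
import AVFoundation
import CoreImage

// Render a CIImage into a pool-backed pixel buffer, then append it.
func append(_ image: CIImage,
            at frameTime: CMTime,
            adaptor: AVAssetWriterInputPixelBufferAdaptor,
            context: CIContext) -> Bool {
    guard adaptor.assetWriterInput.isReadyForMoreMediaData,
          let pool = adaptor.pixelBufferPool else { return false }

    var buffer: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(nil, pool, &buffer)
    guard let pixelBuffer = buffer else { return false }

    // Render synchronously into the buffer *before* appending it; appending
    // a buffer that has not been filled yet is one way to get black frames.
    context.render(image, to: pixelBuffer)
    return adaptor.append(pixelBuffer, withPresentationTime: frameTime)
}
```

Note that `adaptor.pixelBufferPool` is nil until `startSession(atSourceTime:)` has been called on the writer, which is another frequent source of silently dropped or blank frames.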

CIDetector won't release memory - swift

扶醉桌前 submitted on 2019-12-04 12:43:02

After the face detection is done the memory is not released; is there a way I could release it? (The memory stays at 300 MB after the process is done.)

```swift
autoreleasepool {
    manager.requestImageData(for: asset, options: option) { (data, responseString, imageOriet, info) in
        if data != nil {
            //let faces = (faceDetector?.features(in: CIImage(data: data!)!))
            guard let faces = self.faceDetector?.features(in: CIImage(data: data!)!) else { return }
            completionHandler((faces.count))
        } else {
            print(info)
        }
    }
}
```

Source: https://stackoverflow.com/questions/39869331/cidetector-wont-release-memory-swift
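Note that the `autoreleasepool` above wraps the *request*, not the completion handler, so the detection work drains into whatever pool is active when the callback fires. A sketch of scoping the per-image work instead, while reusing a single detector (class and method names here are illustrative, not from the question):

```swift
import Foundation
import CoreImage

final class FaceCounter {
    // Reusing one detector avoids re-creating its internal resources per image;
    // Core Image also caches aggressively behind a live detector reference.
    private lazy var detector = CIDetector(
        ofType: CIDetectorTypeFace,
        context: nil,
        options: [CIDetectorAccuracy: CIDetectorAccuracyLow])

    func countFaces(in data: Data) -> Int {
        // The pool wraps the detection itself, so temporaries created by
        // `features(in:)` are released as soon as this call returns.
        return autoreleasepool {
            guard let image = CIImage(data: data),
                  let faces = detector?.features(in: image) else { return 0 }
            return faces.count
        }
    }
}
```

When the whole batch is finished, dropping the owning object (and with it the `CIDetector`) is the only way to let Core Image discard its caches.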

How to create simple custom filter for iOS using Core Image Framework?

a 夏天 submitted on 2019-12-04 12:25:27

Question: I want to use a custom filter in my app. I know that I need to use the Core Image framework, but I'm not sure that is the right way. Core Image is available on Mac OS and, as of iOS 5.0, on iOS, but I'm not sure it can be used for custom CIFilter effects. Can you help me with this issue? Thanks all!

Answer 1: OUTDATED You can't create your own custom kernels/filters in iOS yet. See http://developer.apple.com/library/mac/#documentation/graphicsimaging/Conceptual/CoreImaging/ci_intro/ci_intro.html, specifically
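As the answer itself flags, this is outdated: custom kernels became available on iOS with iOS 8, and Metal-based `CIKernel`s arrived later in iOS 11. A minimal sketch of a `CIFilter` subclass wrapping a `CIColorKernel` written in the Core Image Kernel Language (the older, now-deprecated route, shown here because it needs no Metal setup):

```swift
import CoreImage

final class InvertFilter: CIFilter {
    @objc dynamic var inputImage: CIImage?

    // A one-line color kernel: invert RGB, keep alpha.
    private static let kernel = CIColorKernel(source:
        "kernel vec4 invert(__sample s) { return vec4(1.0 - s.rgb, s.a); }")

    override var outputImage: CIImage? {
        guard let input = inputImage else { return nil }
        return InvertFilter.kernel?.apply(extent: input.extent, arguments: [input])
    }
}
```

On current systems the preferred path is the same structure with a kernel compiled from Metal Shading Language via `CIKernel(functionName:fromMetalLibraryData:)`.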

How to free memory in ARC for high memory usage graphics render?

泪湿孤枕 submitted on 2019-12-04 11:25:23

First off, thank you to everyone on this site... it's been INCREDIBLY helpful in getting into the grit of iOS programming. My current issue: I have an app that renders a very stylized version of a photo. It uses some Core Image filters for some of it, but needs a bunch of Core Graphics to get the heavy image processing done. The proxy-size renders work out great, but when I render a full-resolution version of my image, it sometimes crashes because of high memory usage. The problem is that I need to be able to have several full-resolution (3264x2448) buffers in memory when rendering. I don't know
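One standard way out of this is to never hold the whole full-resolution result at once: process the image in horizontal bands, with an `autoreleasepool` per band so only one band's buffer is alive at a time. A sketch under those assumptions (the band height of 256 is just a tunable guess):

```swift
import Foundation
import CoreGraphics

// Process an image in horizontal bands to cap peak memory.
func renderInBands(size: CGSize, bandHeight: Int = 256,
                   draw: (CGContext, CGRect) -> Void) {
    let width = Int(size.width)
    var y = 0
    while y < Int(size.height) {
        autoreleasepool {
            let h = min(bandHeight, Int(size.height) - y)
            let ctx = CGContext(data: nil, width: width, height: h,
                                bitsPerComponent: 8, bytesPerRow: 0,
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
            // Shift the CTM so the closure can draw in full-image coordinates.
            ctx.translateBy(x: 0, y: -CGFloat(y))
            draw(ctx, CGRect(x: 0, y: y, width: width, height: h))
            // Hand the band off here, e.g. to a CGImageDestination, before
            // the pool drains and the context is released.
            y += h
        }
    }
}
```

Each band of a 3264x2448 RGBA image is roughly `3264 * 256 * 4 ≈ 3.3 MB` instead of ~32 MB for the whole frame, which is usually enough headroom to avoid the crash.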

BSXPCMessage received error for message: Connection interrupted on CIContext with iOS 8

萝らか妹 submitted on 2019-12-04 11:25:01

I have some problems in my app right now. I would like to create a CIContext with:

```objc
CIContext *myContext = [CIContext contextWithOptions:nil];
```

But when starting the app, this line produces the following message in the console: "BSXPCMessage received error for message: Connection interrupted". The message appears when I launch the app on iOS 8 (simulator or device), but not on an iOS 7 simulator (I don't have a device to try). I tried many things to solve this, like trying it in another project, on another Mac, calling this method from another file... I think it comes from iOS 8. It doesn't seem to change my

Horizontal Flip of a frame in Objective-C

时间秒杀一切 submitted on 2019-12-04 10:36:50

I am trying to create a filter for my program (which streams a webcam) that flips the frame horizontally, making the webcam act like a mirror. However, while it compiles and runs, the filter does not seem to have any effect. Here is the code:

```objc
CIImage *resultImage = image;
CIFilter *flipFilter = [CIFilter filterWithName:@"CIAffineTransform"];
[flipFilter setValue:resultImage forKey:@"inputTransform"];
NSAffineTransform *flipTransform = [NSAffineTransform transform];
[flipTransform scaleXBy:-1.0 yBy:1.0]; // horizontal flip
[flipFilter setValue:flipTransform forKey:@"inputTransform"];
```
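A likely culprit in snippets like this is the first `setValue:` call, which stores the *image* under the `inputTransform` key instead of `kCIInputImageKey`, so the filter never receives an input image (the second `setValue:` then overwrites it with the transform). A hedged Swift sketch of the intended mirror, using `CIImage`'s own transform API rather than the filter:

```swift
import CoreImage

// Mirror a CIImage horizontally. The scale maps x to -x; the translation by
// the image width moves the flipped extent back to start at the origin.
func mirrored(_ image: CIImage) -> CIImage {
    let flip = CGAffineTransform(scaleX: -1, y: 1)
        .concatenating(CGAffineTransform(translationX: image.extent.width, y: 0))
    return image.transformed(by: flip)
}
```

Without the translation step the flipped image lands in negative x coordinates, which also reads as "the filter has no effect" if the renderer only draws the original extent.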

Multiple CIFilters in one CIImage?

自作多情 submitted on 2019-12-04 10:29:08

I have two CIFilters, exposure and hue. I need to combine the filters over one UIImage. How should I go about this? Below is some code that I have so far...

```objc
CIFilter *hueFilter;
CIFilter *exposureFilter;
CIImage *adjustedImage;

hueFilter = [CIFilter filterWithName:@"CIHueAdjust"];
exposureFilter = [CIFilter filterWithName:@"CIExposureAdjust"];
[hueFilter setValue:[NSNumber numberWithFloat:5] forKey:@"inputAngle"];
[exposureFilter setValue:[NSNumber numberWithFloat:5] forKey:@"inputEV"];
adjustedImage = [CIImage imageWithCGImage:inputCGImage];
[hueFilter setValue:adjustedImage forKey:@
```
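Chaining is just feeding the first filter's `outputImage` into the second filter's input; nothing is rendered until the final image is drawn. A Swift sketch of that pattern (the parameter values are placeholders):

```swift
import CoreImage

// Chain CIHueAdjust into CIExposureAdjust over one input image.
func adjusted(_ input: CIImage, hueAngle: Float, ev: Float) -> CIImage? {
    guard let hue = CIFilter(name: "CIHueAdjust"),
          let exposure = CIFilter(name: "CIExposureAdjust") else { return nil }
    hue.setValue(input, forKey: kCIInputImageKey)
    hue.setValue(hueAngle, forKey: "inputAngle")
    // The chaining step: the hue output becomes the exposure input.
    exposure.setValue(hue.outputImage, forKey: kCIInputImageKey)
    exposure.setValue(ev, forKey: "inputEV")
    return exposure.outputImage
}
```

Because `CIImage` is a recipe rather than pixels, an arbitrarily long chain still costs a single render pass at the end.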

Xcode: compositing with alpha using core image

丶灬走出姿态 submitted on 2019-12-04 09:53:28

Question: I'd like to create a Core Image filter chain, and to be able to control the "intensity" of each filter in the chain by compositing its individual effect with alpha or opacity settings, but I am not seeing a way to composite with alpha or opacity in the docs. I could jump out of the Core Image filter chain and composite with a Core Graphics context, I guess.

Answer 1: The CIColorMatrix filter can be used to alter the alpha component of a CIImage, which you can then composite onto a background image:
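A sketch of that answer's approach in Swift: scale the filtered image's alpha with `CIColorMatrix`'s `inputAVector`, then source-over composite it onto the unfiltered original, so `opacity` acts as the filter's intensity dial:

```swift
import CoreImage

// Blend `foreground` over `background` at the given opacity.
func blend(_ foreground: CIImage, over background: CIImage, opacity: CGFloat) -> CIImage? {
    guard let matrix = CIFilter(name: "CIColorMatrix"),
          let composite = CIFilter(name: "CISourceOverCompositing") else { return nil }
    matrix.setValue(foreground, forKey: kCIInputImageKey)
    // Alpha' = opacity * alpha; RGB vectors keep their defaults (identity).
    matrix.setValue(CIVector(x: 0, y: 0, z: 0, w: opacity), forKey: "inputAVector")
    composite.setValue(matrix.outputImage, forKey: kCIInputImageKey)
    composite.setValue(background, forKey: kCIInputBackgroundImageKey)
    return composite.outputImage
}
```

Applied per stage, this gives each filter in the chain its own independent intensity without leaving Core Image for Core Graphics.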

CIAreaHistogram gives me all 0 except the last element?

ε祈祈猫儿з submitted on 2019-12-04 09:52:37

I want to calculate the histogram of an NSImage, so I naturally turned to CIFilter. There's a filter named CIAreaHistogram that does what I want. Here's my code:

```objc
NSBitmapImageRep *rep = [image bitmapImageRepresentation];
CIImage *hImage = nil;
@autoreleasepool {
    CIImage *input = [[CIImage alloc] initWithBitmapImageRep:rep];
    CIFilter *histogramFilter = [CIFilter filterWithName:@"CIAreaHistogram"];
    [histogramFilter setDefaults];
    [histogramFilter setValue:input forKey:kCIInputImageKey];
    [histogramFilter setValue:[CIVector vectorWithCGRect:[input extent]] forKeyPath:@"inputExtent"];
    [histogramFilter
```
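One plausible explanation for the all-zeros symptom: CIAreaHistogram emits *normalized* bin values multiplied by `inputScale`, so with the default scale of 1 most bins quantize to 0 when the 1-pixel-tall output is read back through an 8-bit buffer. A hedged sketch that raises the scale and reads the output as float data instead:

```swift
import CoreImage

// Run CIAreaHistogram and read the bins back as RGBAf components.
func histogram(of image: CIImage, bins: Int, context: CIContext) -> [Float] {
    let filter = CIFilter(name: "CIAreaHistogram")!
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(CIVector(cgRect: image.extent), forKey: "inputExtent")
    filter.setValue(bins, forKey: "inputCount")
    // Scale up the normalized counts so small bins don't vanish.
    filter.setValue(bins, forKey: "inputScale")

    // The output is a bins x 1 image; read it into a float RGBA buffer.
    var data = [Float](repeating: 0, count: bins * 4)
    context.render(filter.outputImage!,
                   toBitmap: &data,
                   rowBytes: bins * 4 * MemoryLayout<Float>.size,
                   bounds: CGRect(x: 0, y: 0, width: bins, height: 1),
                   format: .RGBAf,
                   colorSpace: nil)
    return data
}
```

Passing `colorSpace: nil` skips color matching on the readback, which matters here because the "image" is histogram data, not colors.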