core-image

Render dynamic text onto CVPixelBufferRef while recording video

北战南征 submitted on 2019-12-21 01:12:30
Question: I'm recording video and audio using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput, and in the captureOutput:didOutputSampleBuffer:fromConnection: delegate method I want to draw text onto each individual sample buffer I receive from the video connection. The text changes with roughly every frame (it's a stopwatch label), and I want it recorded on top of the captured video data. Here's what I've been able to come up with so far: //1. CVPixelBufferRef pixelBuffer =
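One common approach to this (a sketch, not the asker's code) is to lock the pixel buffer and draw into its memory with a CGBitmapContext before handing the buffer to the writer. This assumes the data output is configured for 32BGRA via its videoSettings; the function name and coordinates are illustrative.

```swift
import AVFoundation
import UIKit

// Sketch: draw a stopwatch label into each incoming pixel buffer.
// Assumes kCVPixelFormatType_32BGRA was requested on the video data output.
func drawLabel(_ text: String, on pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    let width  = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    guard let context = CGContext(
        data: CVPixelBufferGetBaseAddress(pixelBuffer),
        width: width, height: height, bitsPerComponent: 8,
        bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
            | CGBitmapInfo.byteOrder32Little.rawValue) else { return }

    UIGraphicsPushContext(context)
    // Core Graphics contexts are flipped relative to UIKit drawing.
    context.translateBy(x: 0, y: CGFloat(height))
    context.scaleBy(x: 1, y: -1)
    (text as NSString).draw(at: CGPoint(x: 20, y: 20),
        withAttributes: [.font: UIFont.boldSystemFont(ofSize: 32),
                         .foregroundColor: UIColor.white])
    UIGraphicsPopContext()
}
```

Because the drawing mutates the buffer in place, the overlaid text is what the asset writer records.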

Adaptive Threshold CIKernel/CIFilter iOS

試著忘記壹切 submitted on 2019-12-21 01:06:32
Question: I have searched all over for a kernel that performs adaptive thresholding on iOS. Unfortunately I don't understand the kernel language or the logic behind it. Below is a routine I found that performs (global) thresholding (https://gist.github.com/xhruso00/a3f8a9c8ae7e33b8b23d): static NSString * const kKernelSource = @"kernel vec4 thresholdKernel(sampler image)\n" "{\n" " float inputThreshold = 0.05;\n" " float pass = 1.0;\n" " float fail = 0.0;\n" " const vec4 vec_Y = vec4( 0.299, 0
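The gist above uses a fixed threshold; an *adaptive* threshold compares each pixel to its local neighborhood instead. One way to sketch that with Core Image is to approximate the local mean with a box blur and pass both images into a custom CIKernel (the bias value and radius below are illustrative assumptions):

```swift
import CoreImage

// Sketch: adaptive threshold = compare each pixel's luminance to the
// local mean, approximated here by a box blur of the same image.
let kernelSource = """
kernel vec4 adaptiveThreshold(sampler image, sampler localMean) {
    const vec4 toLuma = vec4(0.299, 0.587, 0.114, 0.0);
    float luma = dot(sample(image, samplerCoord(image)), toLuma);
    float mean = dot(sample(localMean, samplerCoord(localMean)), toLuma);
    // Small bias keeps flat regions from flickering between pass/fail.
    float v = luma > (mean - 0.05) ? 1.0 : 0.0;
    return vec4(v, v, v, 1.0);
}
"""

func adaptiveThreshold(_ input: CIImage) -> CIImage? {
    guard let kernel = CIKernel(source: kernelSource) else { return nil }
    let localMean = input.applyingFilter("CIBoxBlur",
                                         parameters: [kCIInputRadiusKey: 10])
    return kernel.apply(extent: input.extent,
                        roiCallback: { _, rect in rect },
                        arguments: [input, localMean])
}
```

The blur radius controls the neighborhood size; larger radii tolerate slower illumination gradients across the image.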

Using CIFilter with AVFoundation (iOS)

放肆的年华 submitted on 2019-12-20 10:09:50
Question: I am trying to apply filters (e.g. blur, pixelate, sepia) to a video composition created with AVFoundation on iOS. I need to both apply the effects in real time and be able to render the composited video out to disk, though I'm happy to start with just one or the other. Unfortunately, I can't seem to figure this one out. Here's what I can do: I can add an animation layer to the UIView that's playing the movie, but it's not clear to me if I can process the incoming video
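On iOS 9 and later, one way to cover both requirements with the same code is the AVVideoComposition initializer that applies a CIFilter per frame; the resulting composition can be attached to a player item for live playback or to an export session for rendering to disk. A minimal sketch (the sepia filter is just a placeholder):

```swift
import AVFoundation
import CoreImage

// Sketch: one composition serves both playback and export (iOS 9+).
func makeFilteredComposition(for asset: AVAsset) -> AVVideoComposition {
    let filter = CIFilter(name: "CISepiaTone")!
    return AVVideoComposition(asset: asset) { request in
        // Clamp so edge pixels extend infinitely, then crop back after filtering.
        filter.setValue(request.sourceImage.clampedToExtent(),
                        forKey: kCIInputImageKey)
        let output = filter.outputImage!.cropped(to: request.sourceImage.extent)
        request.finish(with: output, context: nil)
    }
}
// Playback: playerItem.videoComposition = makeFilteredComposition(for: asset)
// Export:   exportSession.videoComposition = the same composition
```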

Tinting a grayscale NSImage (or CIImage)

…衆ロ難τιáo~ submitted on 2019-12-20 08:18:16
Question: I have a grayscale image that I want to use for drawing Cocoa controls. The image has various levels of gray. Where it is darkest, I want it to draw a specified tint color darkest; I want it to be transparent where the source image is white. Basically, I want to reproduce the behavior of tintColor seen in UINavigationBar on the iPhone. So far, I have explored several options: Draw the tint color over the grayscale image using SourceOver composition -> This requires a non-opaque tint color ->
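One Core Image sketch of the desired mapping (dark source → opaque tint, white source → transparent) is to invert the grayscale image, convert its luminance to an alpha channel, and composite the tint color source-in over that mask. The helper name is illustrative:

```swift
import CoreImage

// Sketch: darker source pixels become more opaque tint; white → transparent.
func tinted(_ grayscale: CIImage, with color: CIColor) -> CIImage {
    // 1. Invert: dark areas become bright, so they gain high alpha below.
    let inverted = grayscale.applyingFilter("CIColorInvert")
    // 2. Convert luminance into an alpha channel.
    let mask = inverted.applyingFilter("CIMaskToAlpha")
    // 3. Keep the tint color only where the mask is opaque (source-in).
    let tint = CIImage(color: color).cropped(to: grayscale.extent)
    return tint.applyingFilter("CISourceInCompositing",
                               parameters: [kCIInputBackgroundImageKey: mask])
}
```

Intermediate grays come out partially transparent, which matches the graded tinting the question asks for.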

How to change kCIInputBrightnessKey using a slider value in Swift; I'm getting either a white or black picture whatever value the slider has

旧街凉风 submitted on 2019-12-20 04:57:28
Question: The minimum value of the slider is -1 and the maximum value is +1. @IBAction func changeContrast(_ sender: UISlider) { DispatchQueue.main.async { let beginImage = CIImage(image: self.myImageView.image!) self.filter = CIFilter(name: "CIColorControls") self.filter?.setValue(beginImage, forKey: kCIInputImageKey) self.filter.setValue(sender.value, forKey: kCIInputBrightnessKey) print("Current value of the slider - \(sender.value)") self.filteredImage = self.filter?.outputImage self.myImageView.image
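A likely cause of the all-white/all-black result is that the handler filters the image currently in the image view, so each slider movement re-brightens an already-brightened image until it saturates. A sketch of a corrected handler keeps the original image around and re-filters *that* each time (property names are assumptions):

```swift
import UIKit
import CoreImage

class ViewController: UIViewController {
    @IBOutlet var myImageView: UIImageView!
    var originalImage: UIImage!            // set once, e.g. in viewDidLoad
    let context = CIContext()
    let filter = CIFilter(name: "CIColorControls")!

    @IBAction func changeBrightness(_ sender: UISlider) {
        // Always start from the pristine original, not the displayed image.
        let beginImage = CIImage(image: originalImage)!
        filter.setValue(beginImage, forKey: kCIInputImageKey)
        filter.setValue(sender.value, forKey: kCIInputBrightnessKey) // -1 … +1
        guard let output = filter.outputImage,
              let cgImage = context.createCGImage(output, from: output.extent)
        else { return }
        myImageView.image = UIImage(cgImage: cgImage)
    }
}
```

Rendering through a shared CIContext (rather than UIImage(ciImage:)) also avoids repeated context creation on every slider tick.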

Apply visual effect to images pixel by pixel in Swift

倾然丶 夕夏残阳落幕 submitted on 2019-12-20 00:22:52
Question: I have a university assignment to create visual effects and apply them to video frames captured through the device's camera. I can currently get the image and display it, but I can't change the pixel color values. I transform the sample buffer into the imageRef variable, and if I convert it to UIImage everything is fine. But now I want to take that imageRef and change its color values pixel by pixel, in this example changing to negative colors (I have to do more complicated stuff, so I can't use
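For per-pixel work it is usually easier to operate on the CVPixelBuffer bytes directly than on a CGImageRef. A sketch of an in-place negative for a 32BGRA buffer (the pixel format is an assumption; it must match the data output's videoSettings):

```swift
import CoreVideo

// Sketch: invert a 32BGRA pixel buffer in place, leaving alpha untouched.
func invertColors(of pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }

    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let base = CVPixelBufferGetBaseAddress(pixelBuffer)!
        .assumingMemoryBound(to: UInt8.self)

    for row in 0..<height {
        for col in 0..<width {
            let p = row * bytesPerRow + col * 4   // bytes: B, G, R, A
            base[p]     = 255 - base[p]           // blue
            base[p + 1] = 255 - base[p + 1]       // green
            base[p + 2] = 255 - base[p + 2]       // red
        }
    }
}
```

The same loop structure generalizes to any per-pixel transform; only the arithmetic inside the inner loop changes.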

How to perform Bump Distortion in iOS 5.0?

落爺英雄遲暮 submitted on 2019-12-19 11:48:06
Question: I need to perform bump distortion in iOS 5.0. Xcode doesn't show any errors, but I'm also not getting any output; when I trace and print the bump filter instance, it prints a null value. Any idea about that? Some posts suggest it doesn't work in iOS 5.0; is there another way to perform the bump distortion? Thanks in advance. Regards, Spynet. My code: context = [CIContext contextWithOptions:nil]; CIFilter *bumpDistortion = [CIFilter filterWithName:@"CIBumpDistortion"];
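The null instance is the expected symptom: filterWithName: returns nil when the requested filter isn't registered on that OS, and CIBumpDistortion only became available on iOS with iOS 6. A sketch of using it defensively on a supported OS (parameter defaults are illustrative):

```swift
import CoreImage

// Sketch: guard against filters that aren't registered on the current OS.
// CIFilter(name:) returns nil for unknown names, which is why the
// question's instance prints as null on iOS 5.
func bumpDistorted(_ input: CIImage, center: CGPoint,
                   radius: CGFloat = 300, scale: CGFloat = 0.5) -> CIImage? {
    guard let filter = CIFilter(name: "CIBumpDistortion") else {
        return nil  // filter unavailable on this OS version
    }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(CIVector(x: center.x, y: center.y),
                    forKey: kCIInputCenterKey)
    filter.setValue(radius, forKey: kCIInputRadiusKey)
    filter.setValue(scale, forKey: kCIInputScaleKey)
    return filter.outputImage
}
```

On iOS 5 itself, the practical alternatives were an OpenGL ES shader or a CPU-side warp, since the built-in distortion filters weren't shipped there.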

Flip NSImage on both axes

喜你入骨 submitted on 2019-12-19 07:26:25
Question: I'm trying to flip an NSImage created with an NSBitmapImageRep representation. After some digging (Flipping Quicktime preview & capture and Mirroring CIImage/NSImage) I tried two ways via a CIImage, applying a scaling transform with -1 for both factors. First, using CIImage imageByApplyingTransform: NSBitmapImageRep *imgRep = ... CGImageRef cgi = [imgRep CGImage]; CIImage *cii = [CIImage imageWithCGImage:cgi]; CGAffineTransform at = CGAffineTransformTranslate(CGAffineTransformMakeScale(
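The usual pitfall with a scale(-1, -1) transform is that the result's extent ends up with a negative origin, so the image appears empty when rendered at the old origin. A sketch that applies the flip and then translates the extent back into place:

```swift
import CoreImage

// Sketch: flip a CIImage on both axes (equivalent to a 180° rotation),
// then shift the result so its extent origin is non-negative again.
func flippedBothAxes(_ image: CIImage) -> CIImage {
    let flipped = image.transformed(by: CGAffineTransform(scaleX: -1, y: -1))
    return flipped.transformed(by: CGAffineTransform(
        translationX: -flipped.extent.origin.x,
        y: -flipped.extent.origin.y))
}
```

The re-translation step is what the truncated CGAffineTransformTranslate call in the question is reaching for: composing the translation with the scale in a single affine transform.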

Key differences between Core Image and GPUImage

坚强是说给别人听的谎言 submitted on 2019-12-18 10:35:48
Question: What are the major differences between the Core Image and GPUImage frameworks (besides GPUImage being open source)? At a glance their interfaces seem pretty similar: applying a series of filters to an input to create an output. I see a few small differences, such as the easy-to-use LookupFilter that GPUImage has. I am trying to figure out why someone would choose one over the other for a photo-filtering application. Answer 1: As the author of GPUImage, you may want to take what I say with a