core-image

A valid image not being returned from CoreImage in MonoTouch for a valid Asset

独自空忆成欢 submitted on 2019-12-14 03:59:27
Question: I am trying to get an image using this:

img = new UIImage(new MonoTouch.CoreImage.CIImage(validAssetObject), 1.0f, UIImageOrientation.Up);

A CIImage is returned from the CIImage call, but the new UIImage has its CIImage property set to null. Is the constructor for UIImage not working as expected? Any ideas from the MonoTouch community?

Answer 1: I wrote a quick test and this seems to work fine:

string file = Path.Combine (NSBundle.MainBundle.ResourcePath, "image.png");
using (var url = NSUrl
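
Since the answer above is cut off, here is a minimal Swift sketch of the usual workaround when a UIImage wrapping a CIImage refuses to draw or report a backing image: render the CIImage through a CIContext into a CGImage first. The function name and the ciImage parameter are illustrative, not from the original post.

import UIKit
import CoreImage

// Render a CIImage into a bitmap-backed UIImage via a CIContext.
// Assumes `ciImage` is a valid CIImage (e.g. created from an asset's image data).
func bitmapBackedImage(from ciImage: CIImage) -> UIImage? {
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage, scale: 1.0, orientation: .up)
}

The resulting UIImage has a CGImage backing, so it behaves like any other bitmap image when assigned to a UIImageView.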

BSXPCMessage received error for message: Connection interrupted on CIContext with iOS 8

倾然丶 夕夏残阳落幕 submitted on 2019-12-14 03:42:41
Question: I have some problems in my app right now. I would like to create a CIContext with:

CIContext *myContext = [CIContext contextWithOptions:nil];

But when the app starts, this line produces the following message in the console: "BSXPCMessage received error for message: Connection interrupted". The message appears when I launch the app on iOS 8 (simulator or device), but not on an iOS 7 simulator (I don't have a device to try). I tried many things to solve this, like trying it in another project, on
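
The question is truncated, but the workaround most often suggested for this log message on iOS 8 (where the message itself is generally harmless) is to create the context with the software-renderer option. A hedged Swift sketch:

import CoreImage

// One commonly suggested workaround: opt into the software renderer when
// creating the context; the BSXPCMessage log comes from the GPU/XPC path.
let options: [CIContextOption: Any] = [.useSoftwareRenderer: true]
let context = CIContext(options: options)

Note that this trades GPU rendering for CPU rendering, so it is usually better to simply ignore the log unless rendering actually fails.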

CIImage filter is stretching my image vertically

|▌冷眼眸甩不掉的悲伤 submitted on 2019-12-14 00:42:38
Question: I have an image which I'm using in a custom UITableViewCell. The image is a black triangle on a transparent background, and I'm using a CIColorInvert filter to make it a white triangle on a transparent background. The filtered (inverted) image is being stretched vertically; how can I stop this from happening? If I load the UIImage directly without any filtering it renders at the correct size, but when I apply the CIFilter the result is stretched. Here's my code:

// create the white triangle
CIImage
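
The code is cut off above, but a typical way to apply CIColorInvert without changing the image's geometry is to render the filter output back through a CIContext using the input image's extent, and to keep the source image's scale and orientation when rebuilding the UIImage; a mismatch in any of these is a common cause of apparent stretching. A Swift sketch (the function and the source parameter are illustrative, not the asker's code):

import UIKit
import CoreImage

// Invert a UIImage with CIColorInvert while preserving its pixel extent,
// scale, and orientation.
func invertedImage(from source: UIImage) -> UIImage? {
    guard let input = CIImage(image: source),
          let filter = CIFilter(name: "CIColorInvert") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)

    let context = CIContext()
    guard let output = filter.outputImage,
          let cgImage = context.createCGImage(output, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: source.scale, orientation: source.imageOrientation)
}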

Center vector of CIFilter

戏子无情 submitted on 2019-12-13 07:35:34
Question: I'm having a hard time understanding the meaning of the center vector used by many different CIFilters. Take CIRadialGradient, for example: what does inputCenter mean? What is its range of values? How does its coordinate system behave, and so on?

Answer 1: As Brian said, it's just a type that encodes a position where the effect is centered. HOWEVER, Core Image coordinates are inverted on the Y axis compared to what you're used to on iOS: the (0,0) point is the lower-left corner and the y value increases
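
A short Swift sketch of the coordinate flip the answer describes, assuming a point given in UIKit coordinates (origin at the top-left) and the image's extent; the helper name is illustrative:

import UIKit
import CoreImage

// Convert a UIKit-style point (y grows downward from the top-left) into a
// Core Image center vector (y grows upward from the bottom-left).
func centerVector(for point: CGPoint, in extent: CGRect) -> CIVector {
    CIVector(x: point.x, y: extent.height - point.y)
}

// Example: center a CIRadialGradient on that point.
let extent = CGRect(x: 0, y: 0, width: 300, height: 200)
let gradient = CIFilter(name: "CIRadialGradient")
gradient?.setValue(centerVector(for: CGPoint(x: 150, y: 50), in: extent), forKey: "inputCenter")

In general, inputCenter is expressed in image (pixel) coordinates rather than 0-1 normalized values, so its useful range is the image extent.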

How to auto draw an UIImage with frame and color?

一笑奈何 submitted on 2019-12-13 04:24:58
Question: I am trying to create a UIImage with a frame and color, but I don't know the correct way. Is the line below correct, and how do I set the image size? Please help!

UIImage *aImage = [UIImage imageWithCIImage:[CIImage imageWithColor:[CIColor colorWithCGColor:[UIColor redColor].CGColor]]];

Answer 1: Make a UIImage category, such as UIImage(Color):

+ (UIImage *)imageWithColor:(UIColor *)color andSize:(CGSize)size {
    UIGraphicsBeginImageContextWithOptions(size, YES, 0);
    [color set];
    UIBezierPath * path =
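
The answer's Objective-C category is cut off above; an equivalent Swift sketch using UIGraphicsImageRenderer (a modern alternative to the UIGraphicsBeginImageContext approach, not the answerer's exact code):

import UIKit

// Create a solid-color image of the requested size by filling a bitmap context.
func image(with color: UIColor, size: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: size)
    return renderer.image { context in
        color.setFill()
        context.fill(CGRect(origin: .zero, size: size))
    }
}

// Usage: a 44x44 red square.
let redSquare = image(with: .red, size: CGSize(width: 44, height: 44))

This also answers the size question: the size is fixed when the bitmap context is created, whereas a CIImage made with imageWithColor: has infinite extent and must be cropped before it can become a drawable UIImage.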

how to optimized this image processing replace all pixels on image with closest available RGB?

谁都会走 submitted on 2019-12-13 01:27:30
Question: I'm trying to replace every pixel of an input image with the closest available RGB color. I have an array containing the colors, and an input image. Here is my code; it gives me the output image I expect, BUT it takes a very long time (about a minute) to process one image. Can anybody help me improve the code? Any other suggestions are welcome too.

UIGraphicsBeginImageContextWithOptions(CGSizeMake(CGImageGetWidth(sourceImage), CGImageGetHeight(sourceImage)), NO, 0.0f); // Context size kept the same as the original
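
The question's code is truncated, but the usual speed-up for this kind of per-pixel work is to read the bitmap once into a raw byte buffer and rewrite the bytes in place, instead of issuing drawing calls per pixel. A hedged Swift sketch of that approach; the palette format and function names are assumptions, not the asker's code, and alpha handling is ignored for simplicity:

import UIKit
import CoreGraphics

// Map every pixel to the nearest color in `palette` (RGB triples, 0-255),
// working directly on raw RGBA bytes.
func quantize(_ image: UIImage, palette: [(UInt8, UInt8, UInt8)]) -> UIImage? {
    guard let cgImage = image.cgImage, !palette.isEmpty else { return nil }
    let width = cgImage.width
    let height = cgImage.height
    let bytesPerRow = width * 4
    var pixels = [UInt8](repeating: 0, count: height * bytesPerRow)

    let output: CGImage? = pixels.withUnsafeMutableBytes { buffer -> CGImage? in
        guard let context = CGContext(data: buffer.baseAddress, width: width, height: height,
                                      bitsPerComponent: 8, bytesPerRow: bytesPerRow,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return nil }
        context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

        // Replace the RGB components of each RGBA pixel with the nearest palette color.
        for i in stride(from: 0, to: buffer.count, by: 4) {
            let (r, g, b) = (Int(buffer[i]), Int(buffer[i + 1]), Int(buffer[i + 2]))
            let nearest = palette.min {
                squaredDistance($0, r, g, b) < squaredDistance($1, r, g, b)
            }!
            buffer[i] = nearest.0
            buffer[i + 1] = nearest.1
            buffer[i + 2] = nearest.2
        }
        return context.makeImage()
    }
    return output.map { UIImage(cgImage: $0) }
}

// Squared RGB distance between a palette entry and a pixel.
func squaredDistance(_ p: (UInt8, UInt8, UInt8), _ r: Int, _ g: Int, _ b: Int) -> Int {
    let dr = Int(p.0) - r, dg = Int(p.1) - g, db = Int(p.2) - b
    return dr * dr + dg * dg + db * db
}

If the palette is small and fixed, precomputing a lookup table (or moving the work to vImage or Metal) can cut the time further.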

How to Draw an Image in an NSOpenGLView with Swift?

久未见 submitted on 2019-12-12 21:12:22
Question: Basically, I want to create an image view which uses OpenGL for rendering. My eventual plan is to use this as a base for a video player with CIFilters. I followed a tutorial which emphasized using OpenGL to take advantage of the GPU. The tutorial was for iOS; I mapped it to Cocoa. I have no idea where I am failing, but all I get is a blank screen. Here is the view:

import Cocoa
import OpenGL.GL3

class CoreImageView: NSOpenGLView {
    var coreImageContext: CIContext?
    var image: CIImage?
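
The class above is truncated; here is a hedged Swift sketch of how the draw path could look, creating the CIContext from the view's CGL context and drawing the CIImage into the view's bounds. It is a sketch of the general technique, not the asker's or the tutorial's actual code.

import Cocoa
import OpenGL.GL3
import CoreImage

class CoreImageGLView: NSOpenGLView {
    var coreImageContext: CIContext?
    var image: CIImage? {
        didSet { needsDisplay = true }          // redraw when a new image arrives
    }

    override func draw(_ dirtyRect: NSRect) {
        guard let glContext = openGLContext,
              let cglContext = glContext.cglContextObj else { return }
        glContext.makeCurrentContext()

        // Build the Core Image context once, tied to this view's GL context.
        if coreImageContext == nil {
            coreImageContext = CIContext(cglContext: cglContext,
                                         pixelFormat: pixelFormat?.cglPixelFormatObj,
                                         colorSpace: CGColorSpaceCreateDeviceRGB(),
                                         options: nil)
        }

        glClearColor(0, 0, 0, 1)
        glClear(GLbitfield(GL_COLOR_BUFFER_BIT))

        if let image = image {
            // CIContext draws in pixels, so convert the view bounds from points.
            let scale = window?.backingScaleFactor ?? 1.0
            let destRect = CGRect(x: 0, y: 0,
                                  width: bounds.width * scale,
                                  height: bounds.height * scale)
            coreImageContext?.draw(image, in: destRect, from: image.extent)
        }
        glContext.flushBuffer()
    }
}

A blank screen often comes down to the image never triggering a redraw or the destination rectangle being in points instead of pixels, which is why the sketch handles both.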

How to resize CIImage?

眉间皱痕 submitted on 2019-12-12 19:52:11
Question: I need to resize many images inside one "for" loop, but if I use UIGraphicsGetImageFromCurrentImageContext I run out of memory, because the images sit in the autorelease pool and are only released when the loop finishes. I need another method for resizing. Any ideas? Thanks.

-(UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    CGSize targetSize = newSize;
    CGSize imageSize = image.size;
    CGFloat width = imageSize.width;
    CGFloat height = imageSize.height;
    CGFloat targetWidth =
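
The snippet above is cut off; a Swift sketch of two related ideas for the same problem: scaling at the CIImage level with CILanczosScaleTransform, and wrapping each loop iteration in autoreleasepool so temporaries are released per image instead of when the whole loop finishes. The URL list and scale factor are placeholders, not from the original question.

import UIKit
import CoreImage

// Scale a CIImage with CILanczosScaleTransform (high-quality resampling).
func resized(_ image: CIImage, scale: CGFloat) -> CIImage? {
    guard let filter = CIFilter(name: "CILanczosScaleTransform") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(scale, forKey: kCIInputScaleKey)
    filter.setValue(1.0, forKey: kCIInputAspectRatioKey)
    return filter.outputImage
}

// Bound peak memory by draining temporaries after each image.
let imageURLs: [URL] = []                    // placeholder: your image file URLs
let context = CIContext()
for url in imageURLs {
    autoreleasepool {
        guard let source = CIImage(contentsOf: url),
              let scaled = resized(source, scale: 0.25),
              let cgImage = context.createCGImage(scaled, from: scaled.extent) else { return }
        let thumbnail = UIImage(cgImage: cgImage)
        _ = thumbnail                        // save or display the result here
    }
}

Reusing a single CIContext across the loop matters as much as the pool: creating a context per image is itself expensive.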

CIDetector trackingID never present

夙愿已清 submitted on 2019-12-12 15:08:59
Question: I'm working on some face detection code on OS X Mavericks and I'm trying to take advantage of the newish (as of 10.8) face tracking across multiple stills that CIDetector offers. I have basic face detection working fine, like so:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image =
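
The handler above is truncated, but the detail that usually matters for trackingID is that the CIDetectorTracking option is passed per call to featuresInImage:options:, not only when the detector is created. A Swift sketch of that usage, with frame standing in for the CIImage built from the sample buffer:

import CoreImage

// Create the detector once; reuse it for every frame.
let faceDetector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

func detectFaces(in frame: CIImage) {
    // Tracking is requested per call; without it, hasTrackingID stays false.
    let features = faceDetector?.features(in: frame, options: [CIDetectorTracking: true]) ?? []
    for case let face as CIFaceFeature in features where face.hasTrackingID {
        print("face \(face.trackingID) at \(face.bounds)")
    }
}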

Speed up UIImage creation from SpriteSheet

扶醉桌前 submitted on 2019-12-12 13:25:58
Question: I'm not really sure I got the title exactly right, because I'm not sure exactly where my problem is. I need to load an array of UIImages from a sprite sheet, which I'll then use as an animation in a UIImageView. The sprite sheet is generated with TexturePacker, which produces a huge atlas (2048x2048) and a JSON file with the sprite descriptions. Until now I've had it working without issues, even loading 100 frames in just 0.5-0.8 seconds, which I was really happy with. The problem is that now I need to load
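
The question is cut off, but the usual fast path for slicing frames out of an atlas is CGImage's cropping(to:), which references the atlas's backing data instead of re-rendering each frame. A Swift sketch with hypothetical frame rectangles (in practice these come from the TexturePacker JSON):

import UIKit

// Slice animation frames out of a sprite-sheet atlas. cropping(to:) keeps a
// reference to the atlas bitmap rather than copying pixels, so per-frame cost stays low.
func frames(from atlas: UIImage, rects: [CGRect]) -> [UIImage] {
    guard let atlasCG = atlas.cgImage else { return [] }
    return rects.compactMap { rect in
        atlasCG.cropping(to: rect).map {
            UIImage(cgImage: $0, scale: atlas.scale, orientation: .up)
        }
    }
}

// Usage with rectangles parsed from the sprite descriptions:
// imageView.animationImages = frames(from: atlasImage, rects: frameRects)

One caveat: because the cropped images share the atlas's memory, the full 2048x2048 bitmap stays alive as long as any frame does.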