CGImageRef

NSImage to cv::Mat and vice versa

南楼画角 submitted on 2019-12-17 17:59:06
Question: While working with OpenCV, I need to convert an NSImage to an OpenCV multi-channel 2D matrix (cv::Mat) and vice versa. What's the best way to do it? Greets, Dom

Answer 1: Here's my outcome, which works pretty well. NSImage+OpenCV.h:

    //
    // NSImage+OpenCV.h
    //
    #import <AppKit/AppKit.h>

    @interface NSImage (NSImage_OpenCV) {
    }

    +(NSImage*)imageWithCVMat:(const cv::Mat&)cvMat;
    -(id)initWithCVMat:(const cv::Mat&)cvMat;

    @property(nonatomic, readonly) cv::Mat CVMat;
    @property(nonatomic, readonly) cv::Mat …
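The answer's implementation is cut off above. As a rough illustration of the NSImage-to-cv::Mat direction only, here is a minimal sketch (mine, not the answer's body) that renders the image straight into the Mat's buffer through a CGBitmapContext; the category name matches the header, everything else is an assumption:

    // NSImage+OpenCV.mm (sketch) -- compile as Objective-C++
    #import "NSImage+OpenCV.h"

    @implementation NSImage (NSImage_OpenCV)

    -(cv::Mat)CVMat
    {
        CGImageRef cgImage = [self CGImageForProposedRect:NULL context:NULL hints:NULL];
        size_t cols = CGImageGetWidth(cgImage);
        size_t rows = CGImageGetHeight(cgImage);
        cv::Mat mat((int)rows, (int)cols, CV_8UC4); // 4 channels, 8 bits each
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        // Draw the image directly into the Mat's own pixel buffer.
        CGContextRef context = CGBitmapContextCreate(mat.data, cols, rows, 8, mat.step[0],
                                                     colorSpace, kCGImageAlphaNoneSkipLast);
        CGContextDrawImage(context, CGRectMake(0, 0, cols, rows), cgImage);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        return mat;
    }

    @end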

Getting NSImage from CGImageRef

旧街凉风 submitted on 2019-12-12 08:19:59
Question: I am trying to process an image in Core Graphics and then return the processed image back to an NSImage for saving and displaying. There are ample resources on how to do this in iOS, but the helper methods seem to be missing from NSImage. In iOS the class method is imageWithCGImage:; how can you do this in Mac OS?

Answer 1: The matching method in NSImage is initWithCGImage:size:. The second argument takes the image's size in points. The factor between the size in pixels (of the CGImage) …
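A minimal usage sketch of the method the answer names; the scale factor's source and the variable names are illustrative assumptions (e.g. a window's backingScaleFactor on a Retina display):

    // cgImage: a CGImageRef produced by your Core Graphics processing.
    CGFloat scale = window.backingScaleFactor; // assumed pixels-per-point factor
    NSSize sizeInPoints = NSMakeSize(CGImageGetWidth(cgImage) / scale,
                                     CGImageGetHeight(cgImage) / scale);
    NSImage *result = [[NSImage alloc] initWithCGImage:cgImage size:sizeInPoints];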

How can I manipulate the pixel values in a CGImageRef in Xcode

回眸只為那壹抹淺笑 submitted on 2019-12-11 05:49:22
Question: I have some CGImageRef cgImage = "something". Is there a way to manipulate the pixel values of this cgImage? For example, if this image contains values between 0.0001 and 3000, then when I try to view the image in an NSImageView (see "How can I show an image in an NSView using a CGImageRef image"), I get a black image; all pixels are black. I think it has to do with mapping the pixel value range to a different color map (I don't know). I want to be able to manipulate or change the …
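One common pattern for this kind of problem (a sketch of mine, not from the thread) is to draw the CGImage into a bitmap context you own, rescale the raw bytes in place, and rebuild a CGImage from the context:

    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    size_t bytesPerRow = width * 4;
    uint8_t *pixels = calloc(height, bytesPerRow);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, bytesPerRow,
                                             colorSpace, kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);

    // Stretch the intensity range in place (the factor of 2 is just an example).
    for (size_t i = 0; i < height * bytesPerRow; i += 4) {
        pixels[i]     = MIN(255, pixels[i]     * 2); // R
        pixels[i + 1] = MIN(255, pixels[i + 1] * 2); // G
        pixels[i + 2] = MIN(255, pixels[i + 2] * 2); // B
    }

    CGImageRef adjusted = CGBitmapContextCreateImage(ctx);
    // ... display `adjusted`, then CGImageRelease() it when done.
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
    free(pixels);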

Why AVAssetImageGenerator generateCGImagesAsynchronouslyForTimes crashes the app

[亡魂溺海] submitted on 2019-12-11 05:36:42
Question: I am trying to extract 2 frames per second from a video using generateCGImagesAsynchronouslyForTimes:, but my app crashes. I am monitoring the memory usage, and it never goes above 14 MB. Here is the code:

    - (void)createImagesFromVideoURL:(NSURL *)videoUrl
                               atFPS:(int)requiredFPS
                     completionBlock:(void (^)(NSMutableArray *frames, CGSize frameSize))block
    {
        NSMutableArray *requiredFrames = [[NSMutableArray alloc] init];
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:videoUrl …
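The question's code is truncated above. For orientation, a minimal sketch of driving this API follows; one detail worth checking in crashes like this (my observation, not the thread's confirmed diagnosis) is that the CGImageRef handed to the completion handler is not owned by the caller, so wrap or retain it before storing it:

    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];

    // Request two frames per second across the whole duration.
    NSMutableArray *times = [NSMutableArray array];
    Float64 duration = CMTimeGetSeconds(asset.duration);
    for (Float64 t = 0; t < duration; t += 0.5) {
        [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(t, 600)]];
    }

    [generator generateCGImagesAsynchronouslyForTimes:times
                                    completionHandler:^(CMTime requestedTime, CGImageRef image,
                                                        CMTime actualTime,
                                                        AVAssetImageGeneratorResult result,
                                                        NSError *error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            // UIImage retains the CGImage, so it is safe to keep past the callback.
            [requiredFrames addObject:[UIImage imageWithCGImage:image]];
        }
    }];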

How can I show an image in an NSView using a CGImageRef image

前提是你 submitted on 2019-12-11 04:16:51
Question: I want to show an image in an NSView or in an NSImageView. In my header file I have:

    @interface FVView : NSView
    {
        NSImageView *imageView;
    }
    @end

Here is what I have been trying to do in my implementation file:

    - (void)drawRect:(NSRect)dirtyRect
    {
        [super drawRect:dirtyRect];
        // (Here I get an image called fitsImage ... then I do)

        // Here I make the image
        CGImageRef cgImage = CGImageRetain([fitsImage CGImageScaledToSize:maxSize]);
        NSImage *imageR = [self imageFromCGImageRef:cgImage];
        [imageR lockFocus] …
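The question's code is cut off above. For comparison, the usual pattern (a sketch of mine, not the thread's answer) avoids drawing in drawRect: and instead hands the image to the NSImageView ivar the header already declares:

    // During view setup, not inside drawRect:.
    NSImage *image = [[NSImage alloc] initWithCGImage:cgImage
                                                 size:NSZeroSize]; // NSZeroSize: use the CGImage's pixel size
    imageView = [[NSImageView alloc] initWithFrame:self.bounds];
    imageView.image = image;
    [self addSubview:imageView];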

Getting UIImage from CIImage does not work properly

夙愿已清 submitted on 2019-12-10 17:19:18
Question: I am having trouble getting a UIImage from a CIImage. The line of code below works fine on iOS 6 (outputImage is a CIImage):

    self.imageView = [UIImage imageWithCIImage:outputImage];

or

    [self.imageView setImage:[UIImage imageWithCIImage:outputImage]];

When I run this same line of code on a device running iOS 5, the image view is blank. If I log the size property of the UIImage, it is correct, but the image never displays on the screen. When I use a CGImageRef (as shown below) it …
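The "as shown below" code is truncated; the workaround it refers to is most likely the standard route through a CIContext (a sketch under that assumption):

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:outputImage fromRect:[outputImage extent]];
    [self.imageView setImage:[UIImage imageWithCGImage:cgImage]];
    CGImageRelease(cgImage); // createCGImage: follows the Create rule, so release it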

iOS: can't get correct pixel color of CGImageRef

∥☆過路亽.° submitted on 2019-12-08 14:39:02
Question: I made a context, then a gradient. After that I draw into the context. Then I get a gradient image from the context, and it is correct; I can see it in the debugger. But when I try to get the pixel color at a specific point, it isn't correct. I would appreciate any help, thanks. Here is the code:

    - (UIColor *)getColorForPoint:(CGPoint)point imageWidth:(NSInteger)imageHeight
    {
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        // create a gradient
        CGPoint startPoint = CGPointMake(0, …
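For reference, a common way to sample a single pixel from a CGImage (a sketch of mine; it assumes an 8-bit premultiplied RGBA layout and a point in Core Graphics coordinates, origin at the bottom-left):

    - (UIColor *)colorAtPoint:(CGPoint)point inImage:(CGImageRef)image
    {
        uint8_t pixel[4] = {0};
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        // A 1x1 context, with the image offset so the requested pixel lands at (0, 0).
        CGContextRef ctx = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace,
                                                 kCGImageAlphaPremultipliedLast);
        CGContextDrawImage(ctx, CGRectMake(-point.x, -point.y,
                                           CGImageGetWidth(image), CGImageGetHeight(image)), image);
        CGContextRelease(ctx);
        CGColorSpaceRelease(colorSpace);
        // If your point uses UIKit's top-left origin, flip first: y = height - 1 - y.
        return [UIColor colorWithRed:pixel[0] / 255.0 green:pixel[1] / 255.0
                                blue:pixel[2] / 255.0 alpha:pixel[3] / 255.0];
    }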

CGImageCreate Test Pattern is not working (iOS)

感情迁移 submitted on 2019-12-08 08:03:43
Question: I'm trying to create a UIImage test pattern for an iOS 5.1 device. The target UIImageView is 320x240 in size, but I was trying to create a 160x120 UIImage test pattern (future, non-test-pattern images will be this size). I wanted the top half of the box to be blue and the bottom half to be red, but I get what looks like uninitialized memory corrupting the bottom of the image. The code is as follows:

    int width = 160;
    int height = 120;
    unsigned int testData[width * height];
    for (int k = 0; k < …
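The question's buffer is a stack array, and a frequent pitfall with CGImageCreate is that the data provider keeps referencing that memory after the function returns (my observation, not the thread's confirmed diagnosis). A sketch that sidesteps buffer lifetime entirely by letting a CGBitmapContext own the pixels:

    int width = 160, height = 120;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    // Passing NULL lets Core Graphics allocate and manage the pixel buffer.
    CGContextRef ctx = CGBitmapContextCreate(NULL, width, height, 8, width * 4,
                                             colorSpace, kCGImageAlphaPremultipliedLast);
    // CG's origin is bottom-left, so the "top half" is the upper rect.
    CGContextSetRGBFillColor(ctx, 0, 0, 1, 1); // blue
    CGContextFillRect(ctx, CGRectMake(0, height / 2, width, height / 2));
    CGContextSetRGBFillColor(ctx, 1, 0, 0, 1); // red
    CGContextFillRect(ctx, CGRectMake(0, 0, width, height / 2));
    CGImageRef cgImage = CGBitmapContextCreateImage(ctx);
    UIImage *testPattern = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);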

iOS: How to split a UIImage into parts

青春壹個敷衍的年華 submitted on 2019-12-08 04:13:53
Question: In one of my applications I need to split a UIImage into multiple parts. The following is the code I am using to split it. My problem is that I am unable to load the image view after adding the image to the UIImageView.

    - (void)viewDidLoad
    {
        UIImage *image = [UIImage imageNamed:@"monalisa.png"];
        NSMutableArray *splitImages = [self splitImageIntoRects:(__bridge CGImageRef)(image)];
        printf("\n count; %d", [splitImages count]);
        CALayer *layer = [splitImages objectAtIndex:5];
        CGImageRef imgRef = (__bridge CGImageRef)(layer.contents);
        UIImage *img = [[UIImage alloc] initWithCGImage:imgRef];
        UIImageView * …
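The splitImageIntoRects: implementation is not included in the excerpt (and, as an aside, passing a __bridge-cast UIImage * where a CGImageRef is expected looks suspect; CG-level code normally takes image.CGImage). A minimal sketch of the usual tiling approach, with illustrative names:

    // Split `image` into a rows x cols grid of tile images.
    NSMutableArray *tiles = [NSMutableArray array];
    CGImageRef source = image.CGImage;
    size_t tileW = CGImageGetWidth(source) / cols;
    size_t tileH = CGImageGetHeight(source) / rows;
    for (int row = 0; row < rows; row++) {
        for (int col = 0; col < cols; col++) {
            CGRect rect = CGRectMake(col * tileW, row * tileH, tileW, tileH);
            CGImageRef tileRef = CGImageCreateWithImageInRect(source, rect);
            [tiles addObject:[UIImage imageWithCGImage:tileRef]];
            CGImageRelease(tileRef); // Create rule: we own tileRef
        }
    }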
