NSImage

Cocoa: Read pixel color of NSImage

那年仲夏 submitted on 2020-01-01 11:49:17
Question: I have an NSImage. I would like to read the NSColor for a pixel at some x and y. Xcode seems to think that there is a colorAtX:y: method on NSImage, but calling it crashes at runtime with an error saying that there is no such method on NSImage. I have seen some examples where you create an NSBitmapImageRep and call the same method on that, but I have not been able to successfully convert my NSImage to an NSBitmapImageRep. The pixels on the NSBitmapImageRep are different for some reason. There must be a simple…
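A minimal sketch of the usual approach: NSImage is a container of representations rather than a pixel buffer, so the pixel read goes through an NSBitmapImageRep built from the image's TIFF data. The helper name and bounds check are this sketch's own, not from the question.

```swift
import AppKit

// Hypothetical helper: read the color at pixel (x, y) of an NSImage by
// first converting it to an NSBitmapImageRep, which does have colorAt(x:y:).
func pixelColor(of image: NSImage, x: Int, y: Int) -> NSColor? {
    guard let tiffData = image.tiffRepresentation,
          let rep = NSBitmapImageRep(data: tiffData) else { return nil }
    // colorAt(x:y:) uses the bitmap's own pixel coordinates (top-left origin),
    // and pixelsWide/pixelsHigh may differ from image.size (which is in
    // points) on Retina displays — a common reason the pixels "look different".
    guard x >= 0, y >= 0, x < rep.pixelsWide, y < rep.pixelsHigh else { return nil }
    return rep.colorAt(x: x, y: y)
}
```

The point-versus-pixel mismatch is worth checking first when the bitmap rep's pixels seem wrong: the rep may simply be at 2x the resolution of the image's point size.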

Getting bounds of an NSImage within an NSImageView

谁说我不能喝 submitted on 2020-01-01 09:08:10
Question: I've got an NSImageView that takes up the full extent of a window. There's no border to the image view, and it's set to display in the lower left, so the origin of the view matches the origin of the actual image no matter how the window is resized. Also, the image is much larger than what I can reasonably fit at full scale on the screen, so I also have the image view set to proportionally scale down the image. However, I can't seem to find this scale factor anywhere. My…
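AppKit does not expose the effective scale factor of a proportionally scaled NSImageView, but for the setup described (lower-left alignment, scale-down-only) it can be recomputed from the view and image sizes. A sketch, with the function name being this example's own:

```swift
import AppKit

// Hypothetical helper: compute the rect the image actually occupies inside
// an NSImageView configured with .scaleProportionallyDown and lower-left
// alignment, by re-deriving the scale factor AppKit applied.
func displayedImageRect(in imageView: NSImageView) -> NSRect? {
    guard let image = imageView.image else { return nil }
    let viewSize = imageView.bounds.size
    let imageSize = image.size
    // Proportional scale-down never enlarges, so cap the factor at 1.0.
    let scale = min(viewSize.width / imageSize.width,
                    viewSize.height / imageSize.height,
                    1.0)
    // Lower-left alignment keeps the image origin at the view origin.
    return NSRect(x: 0, y: 0,
                  width: imageSize.width * scale,
                  height: imageSize.height * scale)
}
```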

Memory Continues to Increase when Loading and Releasing NSImage

别说谁变了你拦得住时间么 submitted on 2019-12-30 11:22:16
Question: I have a problem where my application aggressively consumes memory, to the point of thrashing, with successive image file loads. For example, consider the following code, which repeatedly loads and releases a 15 MB JPEG file (a large file chosen for test purposes): NSURL *inputUrl = [NSURL URLWithString:@"file:///Users/me/Desktop/15MBjpeg.jpg"]; for(int i=0; i<1000; i++) { NSImage *image = [[NSImage alloc] initWithContentsOfURL:inputUrl]; [image release]; } It performs quickly for the first several…
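The typical cause of this pattern is autoreleased objects created during image decoding: they are not reclaimed until the enclosing autorelease pool drains, so a tight loop accumulates many decoded copies. A sketch of the usual fix, draining a pool per iteration (shown in Swift; the Objective-C equivalent wraps the loop body in @autoreleasepool):

```swift
import AppKit

// Sketch: drain an autorelease pool every iteration so the temporary
// objects created while decoding each image are released immediately,
// instead of piling up until the loop finishes.
let inputURL = URL(fileURLWithPath: "/Users/me/Desktop/15MBjpeg.jpg")
for _ in 0..<1000 {
    autoreleasepool {
        let image = NSImage(contentsOf: inputURL)
        _ = image  // use the image; the pool reclaims its temporaries here
    }
}
```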

Cannot load NSImageView image from file with NSImage(byReferencingFile:)

白昼怎懂夜的黑 submitted on 2019-12-25 18:33:18
Question: When I want to set an image from my disk (/Users/me/Desktop/image.png) as the image of my NSImageView with NSImage(byReferencingFile:), it does not show. When I try NSImage(named:) with an image stored in the assets folder, it works. The code within the view controller: class ViewController: NSViewController { @IBOutlet var imageView: NSImageView! override func viewDidLoad() { super.viewDidLoad() imageView.image = NSImage(byReferencingFile: "/Users/me/Desktop/image.png") } } The imageView…
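Two things commonly go wrong here: NSImage(byReferencingFile:) defers loading until draw time and fails silently, and a sandboxed app has no entitlement to read files on the Desktop at all. A diagnostic sketch using the eager initializer instead, so failure is visible as nil (the path is the one from the question):

```swift
import AppKit

// NSImage(byReferencingFile:) defers loading and reports no error, so a bad
// path or a sandbox denial just draws nothing. NSImage(contentsOfFile:)
// loads eagerly and returns nil on failure, which is easier to diagnose.
let path = "/Users/me/Desktop/image.png"
if let image = NSImage(contentsOfFile: path) {
    // imageView.image = image
    print("Loaded \(image.size)")
} else {
    // nil here usually means a wrong path, or — in a sandboxed app —
    // no permission to read files outside the app's container.
    print("Could not load image at \(path)")
}
```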

macOS and Swift 3 with CIAffineClamp filter

江枫思渺然 submitted on 2019-12-25 08:29:48
Question: I need to use CIAffineClamp to extend the image and prevent Gaussian blur from blurring out the edges of the image. I have the following code working in Swift 2: let transform = CGAffineTransformIdentity let clampFilter = CIFilter(name: "CIAffineClamp") clampFilter.setValue(inputImage, forKey: "inputImage") clampFilter.setValue(NSValue(CGAffineTransform: transform), forKey: "inputTransform") In Swift 3, CGAffineTransformIdentity was renamed to CGAffineTransform.identity. My code compiles…
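The deeper issue is that NSValue(cgAffineTransform:) is an iOS-only initializer: it does not exist on macOS in any Swift version. On macOS, transform inputs to Core Image filters are passed as NSAffineTransform instead. A sketch, assuming inputImage is a CIImage defined elsewhere:

```swift
import AppKit
import CoreImage

// Sketch: on macOS, CIAffineClamp's inputTransform takes an NSAffineTransform
// (a freshly initialized one is the identity), not an NSValue-wrapped
// CGAffineTransform as on iOS.
func clampedToExtent(_ inputImage: CIImage) -> CIImage? {
    guard let clampFilter = CIFilter(name: "CIAffineClamp") else { return nil }
    clampFilter.setValue(inputImage, forKey: kCIInputImageKey)
    clampFilter.setValue(NSAffineTransform(), forKey: kCIInputTransformKey)
    return clampFilter.outputImage
}
```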

Alternative to canDrawSubviewsIntoLayer prior to OS X 10.9

喜欢而已 submitted on 2019-12-24 21:36:05
Question: I needed to implement the following code in my image view to make a GIF animate properly: self.homeView.radarImageView.animates = YES; self.homeView.radarImageView.canDrawSubviewsIntoLayer = YES; self.homeView.radarImageView.image = currentData.radarImage; Unfortunately, the canDrawSubviewsIntoLayer property is not available on OS X prior to 10.9. Is there an alternative approach I can use that will make this work on OS X 10.7 and higher? Answer 1: avoid layer-based views under 10.8 and below…
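When a deployment target reaches back before an API's introduction, the conventional pattern is a runtime check so the call is skipped on older systems, combined with the fallback the answer suggests (no layer backing on 10.8 and below). A sketch, with radarImageView standing in for the view from the question:

```swift
import AppKit

// Sketch: gate the 10.9-only property behind a runtime responds(to:) check
// so the same binary runs on older systems, where the property is simply
// never set (and layer backing should be avoided, per the answer above).
let radarImageView = NSImageView()
radarImageView.animates = true
if radarImageView.responds(to: #selector(setter: NSView.canDrawSubviewsIntoLayer)) {
    radarImageView.canDrawSubviewsIntoLayer = true
}
```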

How come I can't successfully load an NSImage from its full path? Swift 2

情到浓时终转凉″ submitted on 2019-12-24 16:04:41
Question: I'm trying to load an image from an absolute path into an NSImage, and even though the same full path works in other scenarios, when I use it in this context the variable just ends up being nil. I've tried using both the file path and an NSURL. //: Playground - noun: a place where people can play import Cocoa import AppKit print ("Starting") /** Attempt to do via NSURL **/ // The workspace var workspace = NSWorkspace.sharedWorkspace() // Main screen var screen = NSScreen…
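Since the string-based NSImage initializers fail silently, a useful debugging step is to check file reachability separately from image decoding, so the nil can be attributed to the path, permissions, or the image data. A modern-Swift sketch with a hypothetical path standing in for the asker's:

```swift
import AppKit

// Sketch: separate "is the file there and readable?" from "can it be
// decoded as an image?" to narrow down where the nil comes from.
let url = URL(fileURLWithPath: "/Users/me/Desktop/image.png")  // hypothetical path
if (try? url.checkResourceIsReachable()) == true {
    if let image = NSImage(contentsOf: url) {
        print("Loaded image of size \(image.size)")
    } else {
        print("File exists but could not be decoded as an image")
    }
} else {
    print("File is missing or unreadable (check the path and sandbox)")
}
```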

NSImage size is wrong

一个人想着一个人 submitted on 2019-12-24 13:52:48
Question: I think I'm missing something really basic here. If I do this with a valid URL/path which I know exists: NSImage* img = [[NSImage alloc] initWithContentsOfFile:[[selectedItem url] path]]; NSLog(@"Image width: %d height: %d", [img size].width, [img size].height); then the console reports that the width is -2080177216 and the height 0, although I know that the width is actually 50 and the height 50. I tried calling isValid and it returns YES, and I also tried checking the size of the…
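The garbage values point at the format string rather than NSImage: -size returns an NSSize whose width and height are CGFloat (a double on 64-bit), and reading a double with %d is undefined behavior, so the Objective-C fix is %f (or NSStringFromSize). A Swift sketch of the same read, where string interpolation sidesteps format specifiers entirely (the path is hypothetical):

```swift
import AppKit

// Sketch: NSImage.size fields are CGFloat, so they must never be logged
// with integer format specifiers; interpolation handles the type correctly.
if let img = NSImage(contentsOfFile: "/Users/me/Desktop/icon.png") {  // hypothetical path
    print("Image width: \(img.size.width) height: \(img.size.height)")
    // Also note: size is in points, not pixels. For pixel dimensions,
    // inspect a bitmap representation's pixelsWide/pixelsHigh instead.
}
```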

NSImage returns nil even when png file is in Resource folder

时光总嘲笑我的痴心妄想 submitted on 2019-12-24 05:06:35
Question: I'm trying to load an image using NSImage's imageNamed: method, but with no success. I have copied the image into the project folder (Project > Resources) and added it to the project using "Add files to project ...": NSImage* image = [NSImage imageNamed:@"logosH"]; if (image == nil){ NSLog(@"NULL :/"); } NSImageView *accessory = [[NSImageView alloc] initWithFrame:NSMakeRect(0,0,200,55)]; [accessory setImageScaling:NSScaleToFit]; [accessory setImage:[NSImage imageNamed:@"logosH.png"]]; [myAlert…
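imageNamed: searches the built app's main bundle, so the file has to be in the target's "Copy Bundle Resources" build phase — merely sitting in the project folder on disk is not enough. A quick diagnostic sketch (shown in Swift; the resource name is the one from the question):

```swift
import AppKit

// Sketch: ask the bundle directly whether the resource was actually copied
// in; if this path lookup fails, imageNamed: will return nil too.
if let path = Bundle.main.path(forResource: "logosH", ofType: "png") {
    print("Resource bundled at \(path)")
    let image = NSImage(named: "logosH")  // extension is omitted by convention
    _ = image
} else {
    print("logosH.png is not in the bundle; check Copy Bundle Resources")
}
```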

Hiding the loupe icon of an NSSearchField

我的梦境 submitted on 2019-12-22 11:28:44
Question: I'm using an NSSearchField (not subclassed). When it's not focused, the placeholderString is centered, but the loupe icon on the left appears a bit offset, so the whole thing appears not to be centered. Is it possible to hide the loupe icon? Answer 1: There is no direct access to the icon, so a workaround is to first access the NSSearchField cell (cast as shown), and then access its button cell. self in this example is an instance of NSSearchField: [(NSButtonCell *)[…
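A Swift sketch of the workaround the answer describes: the search-field cell exposes the button cell that draws the magnifier, and clearing that cell's images hides the icon. This is a reconstruction of the described cell-chain approach, not the answer's exact (truncated) code:

```swift
import AppKit

// Sketch: there is no public switch for the loupe icon, but the
// NSSearchFieldCell exposes its searchButtonCell, and clearing that
// button cell's images removes the icon from both states.
let searchField = NSSearchField()
if let cell = searchField.cell as? NSSearchFieldCell {
    cell.searchButtonCell?.image = nil           // normal state
    cell.searchButtonCell?.alternateImage = nil  // pressed state
}
```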