How to read the physical screen size of OSX?

长发绾君心 2020-12-01 06:42

I would like to know the physical screen size under Mac OS X, but NSDeviceResolution always reports the wrong value (72), so the calculated resolution is incorrect.

2 Answers
  • 2020-12-01 07:36

    You can use CGDisplayScreenSize to get the physical size of a screen in millimetres. From that you can compute the DPI given that you already know the resolution.

    So e.g.

    NSScreen *screen = [NSScreen mainScreen];
    NSDictionary *description = [screen deviceDescription];
    NSSize displayPixelSize = [[description objectForKey:NSDeviceSize] sizeValue];
    // Physical size of the display in millimetres, via Quartz
    CGSize displayPhysicalSize = CGDisplayScreenSize(
                [[description objectForKey:@"NSScreenNumber"] unsignedIntValue]);
    
    NSLog(@"DPI is %0.2f", 
             (displayPixelSize.width / displayPhysicalSize.width) * 25.4f); 
             // there being 25.4 mm in an inch
    

    That @"NSScreenNumber" thing looks dodgy but is the explicit documented means of obtaining a CGDirectDisplayID from an NSScreen.

  • 2020-12-01 07:39

    Tommy's answer above is excellent. I've ported it to Swift for my own use and am posting it here as a reference, but Tommy's should be considered canonical.

    import Cocoa
    
    public extension NSScreen {
        var unitsPerInch: CGSize {
            let millimetersPerInch: CGFloat = 25.4
            let screenDescription = deviceDescription
            if let displayUnitSize = (screenDescription[NSDeviceDescriptionKey.size] as? NSValue)?.sizeValue,
                let screenNumber = (screenDescription[NSDeviceDescriptionKey("NSScreenNumber")] as? NSNumber)?.uint32Value {
                let displayPhysicalSize = CGDisplayScreenSize(screenNumber)
                return CGSize(width: millimetersPerInch * displayUnitSize.width / displayPhysicalSize.width,
                              height: millimetersPerInch * displayUnitSize.height / displayPhysicalSize.height)
            } else {
                return CGSize(width: 72.0, height: 72.0) // this is the same as what CoreGraphics assumes if no EDID data is available from the display device — https://developer.apple.com/documentation/coregraphics/1456599-cgdisplayscreensize?language=objc
            }
        }
    }
    
    if let screen = NSScreen.main {
        print("main screen units per inch \(screen.unitsPerInch)")
    }
    

    Please note that the value returned is effectively the 'points per inch' (though not under every definition; see below) and almost never the 'pixels per inch': on modern Macs the number of pixels per point depends on the current "Resolution" setting in System Preferences and on the inherent resolution of the device (Retina displays have many more pixels). If you need an approximate pixels-per-inch figure, see the sketch that follows.
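
    As a rough sketch (my own addition, assuming the unitsPerInch extension above), multiplying by backingScaleFactor gives backing-store pixels per inch; note that this matches the panel's native pixels per inch only when a non-scaled Resolution setting is selected:

    import Cocoa
    
    // Sketch: backing-store pixels per inch = units per inch * backing scale factor.
    // backingScaleFactor is 2.0 on most Retina displays and 1.0 otherwise.
    if let screen = NSScreen.main {
        let scale = screen.backingScaleFactor
        let pixelsPerInch = CGSize(width: screen.unitsPerInch.width * scale,
                                   height: screen.unitsPerInch.height * scale)
        print("approximate pixels per inch: \(pixelsPerInch)")
    }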

    What you do know about the return value is that if you draw a line with code like

    CGRect(origin: .zero, size: CGSize(width: 10, height: 1)).fill()
    

    the line will be 1 / unitsPerInch.height inches tall and 10 / unitsPerInch.width inches wide if you measure it with a very precise ruler held up to your screen. (A one-inch-square variant is sketched below.)
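
    For example, here is a hedged sketch (again assuming the unitsPerInch extension above, and not part of the original answer) of a square that should measure one physical inch on each side, whatever the current Resolution setting:

    import Cocoa
    
    // Sketch: a rect whose sides equal unitsPerInch should come out as one
    // physical inch on the glass. Fill it inside an NSView's draw(_:) and
    // check it with a ruler held up to the screen.
    if let screen = NSScreen.main {
        let oneInchSquare = CGRect(origin: .zero,
                                   size: CGSize(width: screen.unitsPerInch.width,
                                                height: screen.unitsPerInch.height))
        print("1-inch square in view units: \(oneInchSquare.size)")
    }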

    (For a long time, graphics frameworks have defined a 'point' both as "1/72nd of an inch in the real world" and as "whatever the width or height of a 1 x 1 unit box ends up being on the current monitor at the current resolution", two definitions that are usually in conflict with each other.)

    So for this code I use the word ‘unit’ to make it clear we’re not dealing with 1/72nd of an inch, nor 1 physical pixel.
