8

Given a CGDirectDisplayID returned from

CGError error = CGGetActiveDisplayList(8, directDisplayIDs, &displayCount);

for the built-in screen on a Retina MacBook Pro, I would expect to fetch the native pixel dimensions using

size_t pixelWidth = CGDisplayPixelsWide(directDisplayID);
size_t pixelHeight = CGDisplayPixelsHigh(directDisplayID);

However, these calls only return the dimensions of the currently selected mode; if I change the screen resolution, I get back different values. I was expecting to get back 2880 x 1800 on a 15" rMBP.

How do I fetch the native pixel dimensions of a Retina MacBook Pro screen?

Vadim Kotov
Chris Miles

6 Answers

3

I think the best approach is to enumerate all of the display modes (including the 1x modes) and find the biggest 1x mode's dimensions.

You would use CGDisplayCopyAllDisplayModes() and pass a dictionary with the key kCGDisplayShowDuplicateLowResolutionModes mapped to kCFBooleanTrue as the options to get all of the modes. You can test that CGDisplayModeGetPixelWidth() is equal to CGDisplayModeGetWidth() to determine which are 1x.
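A sketch of this approach (untested; assumes `displayID` is a valid `CGDirectDisplayID`, e.g. from `CGGetActiveDisplayList()`):

```objc
// Enumerate all modes, including duplicate low-resolution ones, and keep
// the largest mode whose pixel size equals its point size (a 1x mode).
CFStringRef keys[] = { kCGDisplayShowDuplicateLowResolutionModes };
CFBooleanRef values[] = { kCFBooleanTrue };
CFDictionaryRef options = CFDictionaryCreate(kCFAllocatorDefault,
    (const void **)keys, (const void **)values, 1,
    &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);

CFArrayRef modes = CGDisplayCopyAllDisplayModes(displayID, options);
size_t bestWidth = 0, bestHeight = 0;
for (CFIndex i = 0; i < CFArrayGetCount(modes); i++) {
    CGDisplayModeRef mode = (CGDisplayModeRef)CFArrayGetValueAtIndex(modes, i);
    // A 1x mode reports the same width in pixels and in points.
    if (CGDisplayModeGetPixelWidth(mode) == CGDisplayModeGetWidth(mode) &&
        CGDisplayModeGetPixelWidth(mode) > bestWidth) {
        bestWidth = CGDisplayModeGetPixelWidth(mode);
        bestHeight = CGDisplayModeGetPixelHeight(mode);
    }
}
CFRelease(modes);
CFRelease(options);
// bestWidth x bestHeight should now be the native pixel dimensions.
```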

Ken Thomases
3

CGDisplayModeGetIOFlags can tell you some information about a display mode. Native resolutions have kDisplayModeNativeFlag set. The following sets ns to the native resolution of the screen that the window win is on.

// Get the CGDirectDisplayID of the screen that the window is on.
CGDirectDisplayID sid = ((NSNumber *)[win.screen.deviceDescription
    objectForKey:@"NSScreenNumber"]).unsignedIntValue;
CFArrayRef ms = CGDisplayCopyAllDisplayModes(sid, NULL);
CFIndex n = CFArrayGetCount(ms);
NSSize ns = NSZeroSize; // stays zero if no mode has the native flag set
for (CFIndex i = 0; i < n; ++i) {
    CGDisplayModeRef m = (CGDisplayModeRef)CFArrayGetValueAtIndex(ms, i);
    if (CGDisplayModeGetIOFlags(m) & kDisplayModeNativeFlag) {
        ns.width = CGDisplayModeGetPixelWidth(m);
        ns.height = CGDisplayModeGetPixelHeight(m);
        break;
    }
}
CFRelease(ms);
jxy
  • Unfortunately, not all displays have resolutions where the `kDisplayModeNativeFlag` bit is set. This is the case for the iMac 5K 2017 built-in display, for example. That's why I'd prefer Ken's answer, maybe in combination with checking the `kDisplayModeNativeFlag` bit. – bfx Apr 01 '20 at 13:25
1

I would go a different route. Instead of querying the screen dimensions, I would fetch the model identifier of the Mac the program is running on and map that model to its known screen dimensions. Maintaining the table of model numbers and corresponding screen sizes may be tedious, but that's the only way I can think of. Hope this helps.
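A minimal sketch of the lookup idea (macOS only; the model-to-resolution table is a hypothetical placeholder you would have to maintain yourself):

```objc
#include <stdio.h>
#include <string.h>
#include <sys/sysctl.h>

int main(void) {
    // Fetch the Mac model identifier, e.g. "MacBookPro11,3".
    char model[64];
    size_t len = sizeof(model);
    if (sysctlbyname("hw.model", model, &len, NULL, 0) != 0) {
        return 1;
    }
    // Hand-maintained table mapping model identifiers to native resolutions.
    if (strncmp(model, "MacBookPro11,", 13) == 0) {
        printf("%s: 2880 x 1800\n", model);
    } else {
        printf("%s: unknown, add to table\n", model);
    }
    return 0;
}
```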

Yo_Its_Az
0

If using NSScreen is an option, you could do something like this on OS X 10.7 and later:

NSRect framePixels = [screen convertRectToBacking:[screen frame]];

where framePixels.size is your display's pixel resolution and screen is a pointer to NSScreen. For example, this code would print the pixel resolution of all active displays to console:

for (NSScreen* screen in [NSScreen screens])
{
    NSRect framePixels = [screen convertRectToBacking:[screen frame]];
    NSLog(@"framePixels: (%f, %f)", framePixels.size.width, framePixels.size.height);
}
  • 3
    Unfortunately, this doesn't work either. It only returns the pixel dimensions of the backing store used for the current display mode. Not the physical display's native pixel dimensions. If the mode is changed, a different size is returned. For example, on 15" Retina display, in Retina (1440x900 2x) mode, this returns `framePixels: (2880.000000, 1800.000000)`. However, change mode to largest "More Space" (1920x1200 2x) and it returns `framePixels: (3840.000000, 2400.000000)`. – Chris Miles Feb 13 '13 at 21:40
  • The only thing is - that's actually correct. That's the resolution that the screen is running in! The user set the resolution to that size on purpose. – Chris Sherlock Jan 31 '16 at 03:57
  • 1
    I think you've missed the key bit of information. Here's an example. I have a screen with a native res of 3840 x 2160. the backingScaleFactor is either 1.0f, for no scaling at all, or 2.0f, for any scaling. If I have the system set to a scaling factor of 1.5f (the middle setting in the Display Scaling menu), then this code will tell me my screen is 6016x3384. This is clearly incorrect. – burito Jan 29 '19 at 19:25
0

system_profiler SPDisplaysDataType | grep Resolution:

On a two display machine, I get this output:

      Resolution: 2880 x 1800 Retina
      Resolution: 2560 x 1440 (QHD/WQHD - Wide Quad High Definition)

I heard about this from a similar question: How to get the physical display resolution on MacOS?
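If you need just the numbers, the output is easy to parse, e.g. with awk (a sketch; the echoed sample line below stands in for real system_profiler output, since system_profiler only exists on macOS):

```shell
# Extract "WxH" from each "Resolution:" line.
echo "      Resolution: 2880 x 1800 Retina" | awk '/Resolution:/ {print $2 "x" $4}'
```

On a real machine you would pipe `system_profiler SPDisplaysDataType` into the same awk filter.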

theicfire
0

Modified version of Ken/jxy solution above, by default uses screen frame*backingScaleFactor if no native resolution is found:

static float getScreenScaleFactor() {
    NSRect screenFrame = [[NSScreen mainScreen] frame];

    // This may well be larger than the actual pixel dimensions of the screen,
    // as the Mac reports the backing scale factor of the render buffer,
    // not the screen, stupidly.
    float bestWidth = screenFrame.size.width * [[NSScreen mainScreen] backingScaleFactor];

    // If there's a native resolution found in this method, that's more accurate than the above.
    CFArrayRef myModes = CGDisplayCopyAllDisplayModes(CGMainDisplayID(), NULL);
    for (CFIndex i = 0; i < CFArrayGetCount(myModes); i++) {
        CGDisplayModeRef myMode = (CGDisplayModeRef)CFArrayGetValueAtIndex(myModes, i);
        if (CGDisplayModeGetIOFlags(myMode) & kDisplayModeNativeFlag) {
            bestWidth = CGDisplayModeGetPixelWidth(myMode);
            break;
        }
    }
    CFRelease(myModes); // the copied mode array must be released

    return bestWidth / screenFrame.size.width;
}
Thomas