Discrepancy between command line XRandR and own code

Posted by 不羁岁月 on 2020-01-06 03:14:11

Question


I need to programmatically get the refresh rate of a monitor.

When I type xrandr (version 1.4.1, openSUSE 13) on the command line, I get:

Screen 0: minimum 8 x 8, current 1920 x 1200, maximum 16384 x 16384
VGA-0 disconnected primary (normal left inverted right x axis y axis)
DVI-D-0 connected 1920x1200+0+0 (normal left inverted right x axis y axis) 518mm x 324mm
   1920x1200      60.0*+
   1920x1080      60.0
   1680x1050      60.0
   1600x1200      60.0
   1280x1024      60.0
   1280x960       60.0
   1024x768       60.0
   800x600        60.3
   640x480        59.9
HDMI-0 disconnected (normal left inverted right x axis y axis)

This result is confirmed by nvidia-settings -q RefreshRate, among other things.

But... when I run the following code (origin: https://github.com/raboof/xrandr/blob/master/xrandr.c), compiled with g++ 4.8.1 (linking with -lX11 -lXext -lXrandr):

#include <cstdio>
#include <X11/Xlib.h>
#include <X11/extensions/Xrandr.h>

int main() {
    int nsize;
    int nrate;
    short *rates;
    XRRScreenSize *sizes;
    Display *dpy = XOpenDisplay(NULL);
    Window root = DefaultRootWindow(dpy);

    // Query the screen configuration (RandR 1.1-style calls)
    XRRScreenConfiguration *conf = XRRGetScreenInfo(dpy, root);
    printf("Current rate: %d\n", XRRConfigCurrentRate(conf));

    // List every size the configuration reports, with its rates
    sizes = XRRConfigSizes(conf, &nsize);
    printf(" SZ:    Pixels          Refresh\n");
    for (int i = 0; i < nsize; i++) {
        printf("%-2d %5d x %-5d", i, sizes[i].width, sizes[i].height);
        rates = XRRConfigRates(conf, i, &nrate);
        if (nrate)
            printf("  ");
        for (int j = 0; j < nrate; j++)
            printf("%-4d", rates[j]);
        printf("\n");
    }

    XRRFreeScreenConfigInfo(conf);
    XCloseDisplay(dpy);
    return 0;
}

I get:

Current rate: 50
SZ:    Pixels       Refresh
0   1920 x 1200   50
1   1920 x 1080   51
2   1680 x 1050   52
3   1600 x 1200   53
4   1280 x 1024   54
5   1280 x 960    55
6   1024 x 768    56
7    800 x 600    57
8    640 x 480    58
9   1440 x 900    59
10  1366 x 768    60
11  1280 x 800    61
12  1280 x 720    62

Why am I getting this result? What am I doing wrong?

The software uses OpenGL with GLEW. Can this have any influence? We do call glXQueryDrawable(dpy, drawable, GLX_SWAP_INTERVAL_EXT, &val), but only afterwards, and I do not think it should have any influence.


Answer 1:


I found the answer:

If the XRandR server supports version 1.2 of the protocol, then the appropriate 1.2 functions need to be used (which I plan to do by copying snippets of code from https://github.com/raboof/xrandr/blob/master/xrandr.c, from the paths where has_1_2 is true).

My code in the question uses functions for version 1.1 of the protocol, and therefore only the metamodes are returned.

As a simple check, I tried the following two commands:

xrandr --q1

xrandr --q12

And indeed, the first one gives me the same result I get programmatically.

Credits go to http://www.ogre3d.org/forums/viewtopic.php?f=4&t=65010&start=200



Source: https://stackoverflow.com/questions/37995551/discrepancy-between-command-line-xrandr-and-own-code
