Windows: How to test UI under high-dpi?


If your app's layout behaves the same at 96, 120, 144, and 150 DPI, then I think there's no need to test it at even higher DPI, since you will already have verified that it handles uneven DPI increments well.
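To make "behaves the same" concrete: the usual pattern is to design every pixel metric at 96 DPI and scale it at runtime. A minimal sketch (my illustration, not from the answer; `ScaleForDpi` is a hypothetical helper):

    #include <windows.h>

    // Scale a 96-DPI design metric to the current device DPI.
    // MulDiv rounds to nearest, which matters for the uneven
    // steps (120, 144, 150 DPI) mentioned above.
    int ScaleForDpi(HDC dc, int designPixels)
    {
        return MulDiv(designPixels, GetDeviceCaps(dc, LOGPIXELSX), 96);
    }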

Actually, there are already many high-DPI-friendly setups on the market, such as 1680x1050 15.4" or 1920x1080 16" notebook displays. These already show pixel-dependency problems at 120 DPI and are quite uncomfortable to work with at 96 DPI, so working on higher-density display support is worthwhile. Good for you!

Edit: I've been thinking. It may not be very real-time, but you could try handling the WM_PRINT or WM_PRINTCLIENT messages in your windows and render them to a file, or at least show a print preview of them using the printer's settings. Suddenly you're at 300 DPI or more. Just an idea.
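To sketch the idea (my illustration, assuming `hwnd` is a window that handles WM_PRINTCLIENT; to reach the 300 DPI case you'd pass a printer DC rather than a memory DC):

    #include <windows.h>

    // Ask a window to paint its client area into an off-screen bitmap
    // via WM_PRINTCLIENT. Only windows that handle the message will
    // render anything meaningful. Caller frees the bitmap (DeleteObject).
    HBITMAP SnapshotClient(HWND hwnd)
    {
        RECT rc;
        GetClientRect(hwnd, &rc);

        HDC screen = GetDC(NULL);
        HDC mem = CreateCompatibleDC(screen);
        HBITMAP bmp = CreateCompatibleBitmap(screen,
                                             rc.right - rc.left,
                                             rc.bottom - rc.top);
        HGDIOBJ old = SelectObject(mem, bmp);

        SendMessage(hwnd, WM_PRINTCLIENT, (WPARAM)mem,
                    PRF_CLIENT | PRF_CHILDREN | PRF_ERASEBKGND);

        SelectObject(mem, old);
        DeleteDC(mem);
        ReleaseDC(NULL, screen);
        return bmp;
    }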

I've just tried this with VirtualBox and can report the following:

  • using a Windows XP guest I can't really go beyond about 2000 x 2000; specifying anything larger simply switches back to 800x600
  • using an Ubuntu 11.04 guest I can go to at least 4000x4000.

Since Windows Vista introduced a new display driver model, I wouldn't be surprised if Vista/Windows 7 guests supported those high resolutions as well. Unfortunately, I don't have a Vista or Windows 7 guest to test this myself.

The necessary steps are the following:

  • Switch to scaled display mode (using Host+C, where Host defaults to the right Ctrl key). This draws a scaled version of the guest display, so there's no need for the RDP trick. It also ensures that a limited window size won't force the VM to reduce the screen resolution.
  • Use the command-line tool VBoxManage to specify the resolution hint:

    VBoxManage controlvm "VM Name" setvideomodehint 4000 4000 32
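Once the guest has resized, you can confirm inside it that both the resolution and the DPI setting took effect. A small check you can compile and run in the guest (my own sketch, not part of the original steps):

    #include <windows.h>
    #include <stdio.h>

    // Print the primary screen resolution and the logical DPI
    // the guest is currently running at.
    int main(void)
    {
        HDC screen = GetDC(NULL);
        printf("resolution: %d x %d\n",
               GetSystemMetrics(SM_CXSCREEN),
               GetSystemMetrics(SM_CYSCREEN));
        printf("logical DPI: %d x %d\n",
               GetDeviceCaps(screen, LOGPIXELSX),
               GetDeviceCaps(screen, LOGPIXELSY));
        ReleaseDC(NULL, screen);
        return 0;
    }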
    

You need a video card and a monitor that support 1920 x 1200. Many users have these, and they're a joy to use if you're a developer. If you have 1600 x 1200 and don't want to spend the money on a new monitor, that's fine. Beyond that, unless you're working for Pixar, I don't see the need.

As you're already aware, both NVIDIA and ATI display cards allow you to create custom resolutions, but never anywhere near 12800 x 8000. Just to give you an idea of how much memory that would take: it would require roughly 44 times as much framebuffer memory as a 1920 x 1200 display. What you could do, however, is get a big honking rig and chain numerous cards together ... but even then, 12800 x 8000 would be something better suited to custom hardware and drivers under Linux.
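For scale, at 32-bit color (4 bytes per pixel) the framebuffers work out to:

    12800 x 8000 x 4 bytes ≈ 390 MB per frame
     1920 x 1200 x 4 bytes ≈ 8.8 MB per frame   (a factor of roughly 44)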

Windows doesn't check whether your monitor actually measures up to the DPI you configure it for, so just attach the biggest monitor you can and start switching the setting.
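If you'd rather script the switching, one route (my assumption, not something this answer spells out) is the per-user LogPixels registry value, which takes effect after logging off and back on:

    reg add "HKCU\Control Panel\Desktop" /v LogPixels /t REG_DWORD /d 144 /f

Here 144 corresponds to the 150% (144 DPI) setting.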

I'm curious to know why you want to test such high resolutions, i.e. anything over 192 DPI. If you have an actual need for resolutions that high, surely you have access to the hardware that will be running them?
