Oversampling, or scaling down
Posted Nov 12, 2014 15:31 UTC (Wed) by epa (subscriber, #39769)
Parent article: High-DPI displays and Linux
You can also set this up on Windows with some Nvidia cards; again you render to a higher resolution and then the card scales it down for the display. This can be used by obsessive gamers to get the highest possible video quality (a bigger image scaled down will have, in effect, the best possible anti-aliasing) but could also be useful when your monitors have a mixture of dots-per-inch and you want things to appear roughly the same size on each.
Is this kind of oversampling supported on Linux? Could the editor's laptop 2560x1440 display be driven as a virtual 3840x2160 framebuffer and then scaled by 2/3 in hardware?
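(For reference, a minimal sketch of how this might be set up with XRandR, assuming the panel is the output named eDP1, the driver supports output transforms, and a 1.5 factor because 3840/2560 = 1.5:)

# Render the desktop at 3840x2160 and let the hardware scale it down
# to the panel's native 2560x1440 (a 2/3 downscale):
xrandr --output eDP1 --mode 2560x1440 --scale 1.5x1.5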
Posted Nov 12, 2014 15:47 UTC (Wed)
by juhah (subscriber, #32930)
[Link] (6 responses)
Posted Nov 12, 2014 17:41 UTC (Wed)
by grunch (subscriber, #16603)
[Link]
I had previously used the GNOME tweak tool to change the scaling factor to 2, but gave up too quickly when I encountered the huge magnification issue. And I hadn't even explored the use of xrandr to set the screen scale for the laptop display. Thanks for the enlightening article and comments: life's getting back to "normal" for me with this device!
Posted Nov 13, 2014 8:50 UTC (Thu)
by jengelh (guest, #33263)
[Link] (2 responses)
Posted Nov 13, 2014 12:41 UTC (Thu)
by tcourbon (guest, #60669)
[Link] (1 responses)
Posted Nov 15, 2014 18:03 UTC (Sat)
by jospoortvliet (guest, #33164)
[Link]
Posted Nov 13, 2014 10:01 UTC (Thu)
by arekm (guest, #4846)
[Link] (1 responses)
To "downgrade" 3200x1800 to 1920x1080 on dell xps 15 I need to use:
$ cat /etc/X11/kdm/Xsetup
/usr/bin/xrandr --dpi 141
Unfortunately:
so the only option now is to "downgrade" to 1920x1080
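(The 141 figure is presumably just the physical density of a 1920x1080 desktop on the XPS 15's 15.6", 16:9 panel; a quick sanity check of that assumption:)

# 1920 horizontal pixels over a ~13.6" wide panel comes out to ~141 dpi:
echo '1920 / (15.6 * 16 / sqrt(16^2 + 9^2))' | bc -l   # prints ~141.2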
Posted Nov 13, 2014 16:20 UTC (Thu)
by xbobx (subscriber, #51363)
[Link]
I think Tk will correctly honor the DPI by default, but ignore DPI if a font size is a negative value. I know this because years ago after a distro update tkdiff started to render at different sizes on my different-DPI monitors, rather than scaling to approximately the same size. I tracked it down to a "bugfix" where somebody changed the default font size to a negative size in order to work around situations where the DPI was set incorrectly (sigh).
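(That matches Tk's font semantics: a positive -size is in points and is converted to pixels using the DPI-derived scaling factor, while a negative -size is taken as raw pixels. A quick way to see what factor Tk has derived from the X server's DPI, assuming wish is installed:)

# Prints pixels-per-point: ~1.0 at 72 dpi, ~1.33 at 96 dpi, ~2.67 at 192 dpi.
echo 'puts [tk scaling]; exit' | wish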
Posted Nov 12, 2014 16:42 UTC (Wed)
by ken (subscriber, #625)
[Link] (10 responses)
Perhaps if you insist on running at 1920x1080 it renders at double that resolution and then scales back down to 2880x1800, but that mode is not something I think anybody would actually want to use.
Posted Nov 12, 2014 17:10 UTC (Wed)
by epa (subscriber, #39769)
[Link] (9 responses)
I had assumed this was the default setup but that may not be the case, sorry.
Posted Nov 12, 2014 21:07 UTC (Wed)
by luto (guest, #39314)
[Link] (8 responses)
Posted Nov 13, 2014 2:14 UTC (Thu)
by danielkza (subscriber, #66161)
[Link] (5 responses)
Posted Nov 13, 2014 2:22 UTC (Thu)
by luto (guest, #39314)
[Link] (4 responses)
If an application thinks it's rendering a 100px x 100px window, then presumably all the drawing APIs need to act as though it really is rendering a 100px x 100px window. I assume that this means that applications render into an intermediate buffer, which is then scaled.
Now if an application wants to draw text, and that text will render at a higher resolution, and the drawing API won't freak out about the pixels having subpixels, then I can imagine that rendering into a buffer at an integer multiple of the resolution could work, and that downscaling makes sense.
If not, then I still don't see why scaling 100x100 to 200x200 and then to 150x150 makes sense.
Posted Nov 13, 2014 2:35 UTC (Thu)
by danielkza (subscriber, #66161)
[Link] (3 responses)
Posted Nov 13, 2014 2:38 UTC (Thu)
by luto (guest, #39314)
[Link] (1 responses)
Posted Nov 13, 2014 15:41 UTC (Thu)
by zyga (subscriber, #81533)
[Link]
This is distinctly different from scaling individual drawing operations by 1.5.
Posted Nov 15, 2014 20:11 UTC (Sat)
by javispedro (guest, #83660)
[Link]
On a more serious matter: With Wayland, Gtk+ becomes the owner of the surface, and thus the problem of how you actually render all of your non-integer coordinate widgets becomes Gtk's problem (or OpenGL's).
So this excuse no longer works in a Wayland scenario. Qt has non-integer scaling ratios and they work quite OK.
Posted Nov 13, 2014 11:46 UTC (Thu)
by epa (subscriber, #39769)
[Link]
An integer scaling factor of 2 is much easier to handle: an icon can be exactly pixel-doubled and still look clean. It won't be high res, of course, but at least a one-pixel line in the original will always be exactly two pixels wide in the scaled version. (By contrast, if you scale by 1.5 using a not-very-intelligent algorithm, a one-pixel line could end up either one or two pixels wide depending on its position in the image.)
Then once the application has rendered at 2x scaling, which most are able to manage at least passably, the GPU scales everything by 0.75 using a reasonably good scaling method which anti-aliases and doesn't suffer artefacts from rounding to exact pixel boundaries.
At least, the above is my conjecture based on the fact that Apple chose to do it this way (at least when a 'virtual res' of 1920x1200 is selected). It matches my experience using high-dpi displays in Windows, where a font scaling of 150% looks nasty in many apps but 200% works well enough.
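(On Linux, a rough approximation of that scheme can be pieced together today; a sketch, assuming a 2880x1800 panel on output eDP1, a driver that supports output transforms, and GTK 3.10+ applications:)

# Have GTK applications render everything at 2x ...
export GDK_SCALE=2
# ... into a 3840x2400 virtual desktop, which the GPU then scales by 0.75
# down to the panel's native 2880x1800 (1.33333 = 3840/2880):
xrandr --output eDP1 --scale 1.33333x1.33333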
Posted Nov 14, 2014 6:44 UTC (Fri)
by ploxiln (subscriber, #58395)
[Link]
So when they're scaled from 2x to 1.5x, they have extra detail in the 2x pixels, which is then combined/sampled for a much smoother 1.5x scale from the "reference pixel size".
It's a pretty clever way to get more applications to support Hi-DPI displays reasonably easily, IMHO. Just support 2x, and with all that detail the OS can scale that a bit and still have it look good.
Posted Nov 13, 2014 3:22 UTC (Thu)
by roc (subscriber, #30627)
[Link] (8 responses)
Posted Nov 13, 2014 11:46 UTC (Thu)
by alankila (guest, #47141)
[Link]
Posted Nov 13, 2014 11:51 UTC (Thu)
by epa (subscriber, #39769)
[Link] (6 responses)
However there are many legacy applications whose developers never thought about the issue of scaling (let alone by fractional amounts) and which were never tested under such a setup. It would be a mistake to think that you can set a factor of 1.5 and have all of them just work. Perhaps most would, but those that don't handle it properly will turn out fugly or totally unusable.
Therefore, the approach that maximizes compatibility with older software is always to scale by integer multiples. Even then, some older programs misrender (perhaps rendering one part of the window at scaled size while another part is unscaled), but most of the time you get a usable result.
Posted Nov 14, 2014 10:04 UTC (Fri)
by roc (subscriber, #30627)
[Link] (5 responses)
Posted Nov 18, 2014 5:32 UTC (Tue)
by quotemstr (subscriber, #45331)
[Link] (4 responses)
Fractional scaling in practice works fine; thanks for implementing it in Firefox. I hope the Wayland people come around: for some hardware (like mine), fractional scaling is precisely the right thing. Not everyone owns a Macbook.
If the Wayland developers persist in supporting only integer scaling factors, I suspect desktop environments will just hardcode 1x scaling and make toolkits do fractional scaling, which will be a mess all around.
Posted Nov 19, 2014 9:28 UTC (Wed)
by epa (subscriber, #39769)
[Link] (3 responses)
On Linux, the legacy programs may often be ones using X11 server-side fonts. Currently X ships with 75dpi and 100dpi bitmap fonts. If it included 150dpi and 200dpi sets, programs like xterm could use those and so scale up nicely.
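(Until such font sets exist, one workaround is to move those programs off the server-side bitmap fonts entirely; a sketch, assuming xterm was built with Xft/FreeType support, the DejaVu fonts are installed, and the display is around 192 dpi:)

# Tell client-side (Xft) font rendering the real DPI ...
echo 'Xft.dpi: 192' | xrdb -merge
# ... and have xterm use a scalable Xft face, which then sizes itself to that DPI:
xterm -fa 'DejaVu Sans Mono' -fs 10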
Posted Nov 20, 2014 21:48 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (2 responses)

Thank you. It's frustrating how some people say that "$X is impossible!", and then when confronted with an existence proof of $X, just reiterate the claim that "$X is impossible!".
Posted Nov 20, 2014 22:16 UTC (Thu)
by dlang (guest, #313)
[Link]
A lot of web designers have been working hard ever since to undermine this design, but they have had limited success.
Posted Nov 21, 2014 11:26 UTC (Fri)
by epa (subscriber, #39769)
[Link]
I think the point is that there's an existence proof of many legacy programs from the 2000s and 1990s which produce a nasty-looking mess when asked to scale by 3/2 (let alone, say, 4/3). That does not negate the existence of a large body of carefully written programs (of which Firefox is one) which handle arbitrary scaling perfectly. But it is (I presume) the reason why Apple chose to scale by an integer factor and then resize in the GPU if necessary. I quite agree, if starting from scratch then arbitrary dpi values need to be supported.

Grab a PC running Windows 7 and set it to 150% font scaling, then try a random assortment of third-party software written before 2010. You will see the mess I am talking about - and I assume the situation with older Mac OS X programs is the same.

It is great that Firefox scales well and I quite agree it proves that scaling by an arbitrary factor is *not difficult*. That has very little bearing on whether existing programs, which exist in binary-only form and cannot be modified, implement non-integer scaling reasonably. Even if 90% do so, 10% do not.