
High-DPI displays and Linux

By Jonathan Corbet
November 12, 2014
Your editor's laptop recently started to fall apart, to the point that some sort of action was required. Duct tape was considered as a solution; it would show that LWN is making careful use of its subscription revenues, but would also convey an impression that is, arguably, not 100% professional. The alternative was to get a new laptop; that costs money, but the idea turned out to be awfully shiny and distracting. So your editor ended up with a ThinkPad X1 Carbon. This machine has a number of nice features and a few weird quirks, but one of the more interesting options it came with is a high-DPI, 2560x1440 display. That was an ideal opportunity to investigate the state of Linux support of high-DPI screens; it turns out that, while quite a bit of progress has been made, this problem has not yet been fully solved.

There are certainly a lot of advantages to high-DPI displays (generally considered to be those with at least 200 pixels per linear inch). Text is rendered in an especially clear and readable fashion. Photos from digital cameras can be displayed at something resembling their native resolution. And, for those of us with sufficiently sharp eyes, high-DPI allows more work to be crammed onto a screen, increasing productivity. But, if the desktop environment and associated applications are not aware that a high-DPI screen is in use, the result can be unreadable text and unusable, microscopic screen elements.

High-DPI display support

So how have the Linux desktop environments responded to high-DPI displays? There appear to be two different basic approaches. The one used by GNOME and some web browsers is to hide the display resolution behind pixel scaling; in essence, the environment lies to applications, then turns each application pixel into a 2x2 array of screen pixels. It's a relatively quick solution, but, as we'll see, it leads to some unfortunate behaviors.

The alternative is to have applications deal with display-independent units (points, centimeters, inches, whatever) and hide the scaling required. KDE appears to have taken this approach. It seems to work well enough as long as applications play along, but here, too, the problem is not entirely solved.

Your editor's experiments have mostly been with GNOME, so this article will have an unavoidable bias in that direction. In GNOME, pixel scaling is managed with the inevitable gsettings registry value; to turn on 2-to-1 scaling, a command like this can be used:

    gsettings set org.gnome.desktop.interface scaling-factor 2

Setting the scaling factor back to one turns off scaling. Scaling should be enabled by default if GNOME detects a high-DPI display; otherwise a command like the above can be used. This value can also be adjusted in gnome-tweak-tool. Should you run into bugs or weirdness (like only a quarter of the screen being used), setting the scaling factor to one then back to two will often straighten them out; your editor has ended up creating a fixscaling script for just this eventuality.
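What such a script might look like is easy to guess; a minimal sketch, assuming the same gsettings key as above, is simply:

    #!/bin/sh
    # fixscaling: toggle the GNOME scaling factor off and back on to clear
    # up scaling glitches (quarter-screen rendering and the like).
    gsettings set org.gnome.desktop.interface scaling-factor 1
    sleep 1
    gsettings set org.gnome.desktop.interface scaling-factor 2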

Firefox implements a similar sort of pixel scaling, but it doesn't seem to figure out when to use it on its own. The solution is to go into about:config and set the layout.css.devPixelsPerPx knob to a value like two. Alternatively, one can make things work by configuring the use of 30-point fonts, but that will make a mess of a lot of web sites.
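For those who would rather not poke at about:config by hand, the same preference can be put into the profile's user.js file; a minimal sketch (the profile directory name varies) looks like:

    // ~/.mozilla/firefox/<profile>/user.js
    // Treat each CSS pixel as two device pixels.
    user_pref("layout.css.devPixelsPerPx", "2");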

In the KDE world, there seem to be a couple of parameters to set if the desktop doesn't figure things out on its own. The "application appearance" dialog has an option to set the dots-per-inch resolution used for font rendering; setting that to a value close to your screen's pixel density (or lower; 150 seems to be a recommended value) will make text readable. The size of icons is also nicely configurable in KDE; its use of SVG icons helps to make this option work well. Obviously, one will want to make icons larger on high-DPI displays.

Difficulties

So it seems that some support for high-DPI displays is in place for Linux. But there are still a lot of nagging little details — some perhaps not so little — that pop up when using a system with such a display.

For example, what happens when one plugs an external monitor (or projector) into such a system? That display almost certainly is not also high-DPI. The techniques used to make the high-DPI screen usable have the effect of turning the lower-DPI screen into an extra-low-DPI screen. In your editor's case, this problem has made the use of external monitors on the new laptop nearly impossible. The word is that this behavior is the result of limitations built deeply into the X Window System which, after all, was developed when 75dpi was a high-DPI screen. Things will be much better under Wayland, we are told, but there doesn't appear to be a lot of relief in store before then.

Pixel scaling can also cause a lot of confusion with applications that are not prepared for it. Here is an old version of gthumb (the version shipped with Fedora 20) with scaling in effect:

[Gthumb scaled]

With scaling turned off, the result is rather more satisfying:

[Gthumb unscaled]

(It should be noted that gthumb 3.14.x handles high-DPI displays in a much better manner. Your editor will reserve comment on some of the other user interface changes made during that time, though.)

Over time your editor has found a few applications that do better with pixel scaling turned off. Happily, there is an easy way to do that on a per-application basis: simply set the GDK_SCALE environment variable to the desired scaling value.
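For example, to run the older gthumb shown above without pixel scaling, it is enough to override the variable at launch time; something like:

    # Run a single application unscaled, leaving the desktop setting alone.
    GDK_SCALE=1 gthumb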

Another problem with pixel scaling is that it can cause even the high-DPI screen to appear to have a low pixel density. As a result, for example, gnome-terminal believes it's running with 7-point fonts, something that would normally be illegibly small. The sizing is confusing, but it also makes it hard to get that perfect size: seven points is too large, but a 6-point font, being quite a bit smaller, is illegible. All of this is somewhat unfortunate, since points are supposed to be real-world physical dimensions, but that connection has been lost for now.

There is no shortage of web sites that come through badly on high-DPI displays — including this one in many cases. User-interface elements can be so small as to be unusable, and images are reduced down to postage stamps. Xkcd becomes unreadable in such a setting. Turning on pixel scaling in Firefox makes a lot of these problems go away at the cost of losing a fair amount of screen space to the expanded icon bars at the top — and to larger advertisements. The Chromium browser, surprisingly, appears to have no high-DPI awareness at all at the moment; the sad result is that even Google sites can look terrible there.

In general, any web site that is trying to manage its page layout using pixel sizes is going to run into trouble on high-DPI displays. Fixing that problem will take a long time, and involves some interesting tradeoffs. The tiny-image problem can be fixed by shipping higher-resolution images and getting the browser to scale them to the desired size, but that can increase bandwidth consumption considerably. The alternative — sensing the screen resolution and shipping appropriately sized images — is a lot more work to support.

Some closing thoughts

Naturally, the new machine came with Windows preinstalled, so your editor had the chance to see how that system copes with the high-DPI display. The situation there is perhaps a bit better, but Windows has not fully solved the problem either. The basic system user interface works well, but web pages can run into trouble there, and some applications, even popular, high-profile applications, have not yet made the adjustment.

Making applications work properly in an environment where displays can have widely varying densities — even on the same system — is not an easy problem to solve. The good news is that we are much of the way there; the bad news is that there is still a lot to be done, and, in some cases, a change of approach may be needed. In particular, approaches based on pixel scaling have all the appearance of being a short-term kludge that gets things working while a real solution is being worked on.

That real solution, it seems, almost has to involve divorcing applications from the individual pixels on the screen. Once applications are working in pixel-independent units almost all of the time (there may need to be exceptions for image editing, video playback, etc.), the underlying environment can work on rendering in a visually similar way on all possible screens.

Until then, it is good to see that developers are working on making high-DPI screens work better with Linux. A fast rate of progress is arguably not surprising; after all, desktop developers hardly seem like a crowd that would be resistant to the allure of a beautiful new screen. So we can probably count on those developers to fix up the remaining problems in relatively short order.



Oversampling, or scaling down

Posted Nov 12, 2014 15:31 UTC (Wed) by epa (subscriber, #39769) [Link] (27 responses)

One way to get the sizing you want is to render at a higher pixel size than the real size of the display, and then scale everything down (in hardware) to fit. For example, the 15" Macbook Pro, in its default configuration with Mac OS X, renders at 3840x2400 and then scales by 0.75 to output at 2880x1800. This is done so that on the software side everything can be exactly doubled, giving 'screen real estate' equivalent to an ordinary 1920x1200 display. Getting the applications to render at 1.5x scaling would cause all sorts of pixel jaggies and other nastiness; 2x is much easier to manage and then the video hardware smoothly scales down.

You can also set this up on Windows with some Nvidia cards; again you render to a higher resolution and then the card scales it down for the display. This can be used by obsessive gamers to get the highest possible video quality (a bigger image scaled down will have, in effect, the best possible anti-aliasing) but could also be useful when your monitors have a mixture of dots-per-inch and you want things to appear roughly the same size on each.

Is this kind of oversampling supported on Linux? Could the editor's laptop 2560x1440 display be driven as a virtual 3840x2160 framebuffer and then scaled by 2/3 in hardware?
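A sketch of how that might be expressed with RandR (assuming the panel shows up as eDP1; whether the final downscale really happens in hardware depends on the driver):

    # --scale multiplies the mode size to get the framebuffer size:
    # 2560x1440 * 1.5 = 3840x2160, which is then scaled back down to the panel.
    xrandr --output eDP1 --scale 1.5x1.5

The replies below use the same knob in the other direction, shrinking the framebuffer to "downgrade" an overly dense panel.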

Oversampling, or scaling down

Posted Nov 12, 2014 15:47 UTC (Wed) by juhah (subscriber, #32930) [Link] (6 responses)

xrandr --output <out> --scale 0.75x0.75

Oversampling, or scaling down

Posted Nov 12, 2014 17:41 UTC (Wed) by grunch (subscriber, #16603) [Link]

This article came at a most fortuitous moment. I recently acquired a Lenovo W540, and was on the verge of using it as mainly a compile server for my Gentoo boxes. Between the pixel density and the silly "click pad" thingy I was pretty frustrated.

I had previously used the GNOME tweak tool to change the scaling factor to 2, but gave up too quickly when I encountered the huge magnification issue. And I hadn't even explored the use of xrandr to set the screen scale for the laptop display. Thanks for the enlightening article and comments: life's getting back to "normal" for me with this device!

Oversampling, or scaling down

Posted Nov 13, 2014 8:50 UTC (Thu) by jengelh (guest, #33263) [Link] (2 responses)

xrandr --dpi 120

Oversampling, or scaling down

Posted Nov 13, 2014 12:41 UTC (Thu) by tcourbon (guest, #60669) [Link] (1 responses)

This never seemed to work for me; it didn't change anything on the display when I tried it. It looked like xrandr did not take the new value into account.

Oversampling, or scaling down

Posted Nov 15, 2014 18:03 UTC (Sat) by jospoortvliet (guest, #33164) [Link]

Xrandr works, but I believe GNOME ignores dpi settings. KDE apps obey it so you get a message if you use both. But you can configure dpi for KDE apps, not sure about GNOME.

Oversampling, or scaling down

Posted Nov 13, 2014 10:01 UTC (Thu) by arekm (guest, #4846) [Link] (1 responses)

To "downgrade" 3200x1800 to 1920x1080 on dell xps 15 I need to use:

$ cat /etc/X11/kdm/Xsetup
#! /bin/sh
# Xsetup - run as root before the login dialog appears

/usr/bin/xrandr --dpi 141
/usr/bin/xrandr --output eDP1 --scale 0.6x0.6

Unfortunately:
- Tk doesn't support HiDPI at all (http://core.tcl.tk/tk/tktview/0917069c05e9c354a27c8105a7a...)
- same for GRUB2 (http://savannah.gnu.org/bugs/?42525)
- same for google-chrome/chromium-browser (https://code.google.com/p/chromium/issues/detail?id=143619)

so the only option now is to "downgrade" to 1920x1080

Oversampling, or scaling down

Posted Nov 13, 2014 16:20 UTC (Thu) by xbobx (subscriber, #51363) [Link]

> Tk doesn't support HiDPI at all

I think Tk will correctly honor the DPI by default, but ignore DPI if a font size is a negative value. I know this because years ago after a distro update tkdiff started to render at different sizes on my different-DPI monitors, rather than scaling to approximately the same size. I tracked it down to a "bugfix" where somebody changed the default font size to a negative size in order to work around situations where the DPI was set incorrectly (sigh).

Oversampling, or scaling down

Posted Nov 12, 2014 16:42 UTC (Wed) by ken (subscriber, #625) [Link] (10 responses)

As a user of a MacBook Pro 15 Retina I seriously doubt that what you describe is actually happening. It renders directly to 2880x1800 with no additional scaling. That display resolution was chosen so that you simply double 1440x900, exactly so you can avoid any fractional scaling values.

Perhaps if you insist on running at 1920x1080 it renders at double that and then scales back into 2880x1800, but that mode is not something I think anybody would actually want to use.

Oversampling, or scaling down

Posted Nov 12, 2014 17:10 UTC (Wed) by epa (subscriber, #39769) [Link] (9 responses)

If you set the resolution in system preferences to 1920x1200 then it renders to 3840x2400 and scales by 0.75 - see for example http://forums.macrumors.com/showthread.php?t=1665672

I had assumed this was the default setup but that may not be the case, sorry.

Oversampling, or scaling down

Posted Nov 12, 2014 21:07 UTC (Wed) by luto (guest, #39314) [Link] (8 responses)

How does scaling non-high-dpi-aware applications up by a factor of 2 and then down by a factor of 0.75 produce better, let alone different, results from just scaling it up by a factor of 1.5 in the first place?

Oversampling, or scaling down

Posted Nov 13, 2014 2:14 UTC (Thu) by danielkza (subscriber, #66161) [Link] (5 responses)

Many applications can't render at fractional pixel sizes; they need integer multipliers, which forces you to render at 2x and then scale down later.

Oversampling, or scaling down

Posted Nov 13, 2014 2:22 UTC (Thu) by luto (guest, #39314) [Link] (4 responses)

Huh?

If an application thinks it's rendering a 100px x 100px window, then presumably all the drawing APIs need to act as though it really is rendering a 100px x 100px window. I assume that this means that applications render into an intermediate buffer, which is then scaled.

Now if an application wants to draw text, and that text will render at higher resolution, and the drawing API won't freak out about the pixels having subpixels, then I can imagine that rendering into a buffer with an integer multiple higher resolution could work, and that downscaling makes sense.

If not, then I still don't see why scaling 100x100 to 200x200 and then to 150x150 makes sense.

Oversampling, or scaling down

Posted Nov 13, 2014 2:35 UTC (Thu) by danielkza (subscriber, #66161) [Link] (3 responses)

What happens with a 101x101px window that gets scaled to 151.5x151.5 pixels? X11 does not handle fractional window sizes. The same will apply to subwindows and controls, to window decorations, and maybe even other things. Fractional scaling in itself is definitely not impossible, but it is mostly impractical due to all the corner cases it might produce.

Oversampling, or scaling down

Posted Nov 13, 2014 2:38 UTC (Thu) by luto (guest, #39314) [Link] (1 responses)

But the alleged Apple approach of scaling up to 2x and then down to 0.75x has exactly the same problem, right?

Oversampling, or scaling down

Posted Nov 13, 2014 15:41 UTC (Thu) by zyga (subscriber, #81533) [Link]

No, because you scale up by 2x and there are no fractional parts to worry about, so you get a good picture. Then you scale the entire full-screen buffer by 0.75 to get another picture.

This is distinctly different from scaling individual drawing operations by 1.5.

Oversampling, or scaling down

Posted Nov 15, 2014 20:11 UTC (Sat) by javispedro (guest, #83660) [Link]

But no one uses X11 subwindows any longer, or so we heard from Wayland developers! </ironic>

On a more serious matter: With Wayland, Gtk+ becomes the owner of the surface, and thus the problem of how you actually render all of your non-integer coordinate widgets becomes Gtk's problem (or OpenGL's).

So this excuse no longer works in a Wayland scenario. Qt has non-integer scaling ratios and they work quite OK.

Oversampling, or scaling down

Posted Nov 13, 2014 11:46 UTC (Thu) by epa (subscriber, #39769) [Link]

If you ask the application to scale by 1.5 then it may produce nasty jaggies in resized icons because they're being crudely scaled up with nearest-neighbour (not anti-aliased). Or it may end up with off-by-one drawing mistakes: if one box is 101 pixels wide and it gets scaled to 151.5, do you round up or down? The drawing code doesn't get much testing under odd scaling factors so it is unlikely to just work.

An integer scaling factor of 2 is much easier to handle: an icon can be exactly pixel-doubled and still look clean. It won't be high res, of course, but at least a one-pixel line in the original will always be exactly two pixels wide in the scaled version. (By contrast, if you scale by 1.5 using a not-very-intelligent algorithm, a one-pixel line could end up either one or two pixels wide depending on its position in the image.)

Then once the application has rendered at 2x scaling, which most are able to manage at least passably, the GPU scales everything by 0.75 using a reasonably good scaling method which anti-aliases and doesn't suffer artefacts from rounding to exact pixel boundaries.

At least, the above is my conjecture based on the fact that Apple chose to do it this way (at least when a 'virtual res' of 1920x1200 is selected). It matches my experience using high-dpi displays in Windows, where a font scaling of 150% looks nasty in many apps but 200% works well enough.

Oversampling, or scaling down

Posted Nov 14, 2014 6:44 UTC (Fri) by ploxiln (subscriber, #58395) [Link]

On OS X, Hi-DPI capable applications render at twice the quality when rendering 2x (in each dimension of course). They're not really "resolution independent", they don't support arbitrary dpi, they just do 1x or 2x. Virtually all applications now at least support 2x text through OS X's CoreText or whatever, and most render fully at 2x quality.

So when they're scaled from 2x to 1.5x, they have extra detail in the 2x pixels, which is then combined/sampled for a much smoother 1.5x scale from the "reference pixel size".

It's a pretty clever way to get more applications to more easily support reasonably Hi-DPI display, IMHO. Just support 2x, and with all that detail the OS can scale that a bit and still have it look good.

Oversampling, or scaling down

Posted Nov 13, 2014 3:22 UTC (Thu) by roc (subscriber, #30627) [Link] (8 responses)

As the developer responsible for high-DPI layout and rendering in Firefox for several years, I can say that, contrary to my expectations, scaling by fractional amounts such as 1.5x has turned out not to be significantly worse or harder than scaling by an integer.

Oversampling, or scaling down

Posted Nov 13, 2014 11:46 UTC (Thu) by alankila (guest, #47141) [Link]

Yeah, I'd say that if you have a regularity in the expansion then the problem is far simpler. E.g. 1.5x scaling implies that each 2 pixels turn into 3 pixels, so you will have a fairly uniform result without too-prominent moiré-like artifacts over it. But as the period grows longer, I predict that the associated problems become more severe as well.

Oversampling, or scaling down

Posted Nov 13, 2014 11:51 UTC (Thu) by epa (subscriber, #39769) [Link] (6 responses)

It's not much harder to scale by 1.5: you have thought about it, written code which handles it, and done at least some basic testing. As you say, it's not that big a deal.

However there are many legacy applications whose developers never thought about the issue of scaling (let alone by fractional amounts) and which were never tested under such a setup. It would be a mistake to think that you can set a factor of 1.5 and have all of them just work. Perhaps most would, but those that don't handle it properly will turn out fugly or totally unusable.

Therefore, the approach that maximizes compatibility with older software is always to scale by integer multiples. Even then, some older programs misrender (perhaps rendering one part of the window at scaled size while another part is unscaled), but most of the time you get a usable result.

Oversampling, or scaling down

Posted Nov 14, 2014 10:04 UTC (Fri) by roc (subscriber, #30627) [Link] (5 responses)

I mean Firefox scales arbitrary Web pages by fractional scales and they almost always look fine. Those Web pages use complex layouts and generally were not tested over a range of scale factors.

Oversampling, or scaling down

Posted Nov 18, 2014 5:32 UTC (Tue) by quotemstr (subscriber, #45331) [Link] (4 responses)

Thank you. It's frustrating how some people say that "$X is impossible!", and then when confronted with an existence proof of $X, just reiterate the claim that "$X is impossible!".

Fractional scaling in practice works fine; thanks for implementing it in Firefox. I hope the Wayland people come around: for some hardware (like mine), fractional scaling is precisely the right thing. Not everyone owns a Macbook.

If the Wayland developers persist in supporting only integer scaling factors, I suspect desktop environments will just hardcode 1x scaling and make toolkits do fractional scaling, which will be a mess all around.

Oversampling, or scaling down

Posted Nov 19, 2014 9:28 UTC (Wed) by epa (subscriber, #39769) [Link] (3 responses)

I think the point is that there's an existence proof of many legacy programs from the 2000s and 1990s which produce a nasty-looking mess when asked to scale by 3/2 (let alone, say, 4/3). That does not negate the existence of a large body of carefully written programs (of which Firefox is one) which handle arbitrary scaling perfectly. But it is (I presume) the reason why Apple chose to scale by an integer factor and then resize in the GPU if necessary. I quite agree, if starting from scratch then arbitrary dpi need to be supported.

On Linux, the legacy programs may often be ones using X11 server-side fonts. Currently X ships with 75dpi and 100dpi bitmap fonts. If it included 150dpi and 200dpi sets, programs like xterm could use those and so scale up nicely.

Oversampling, or scaling down

Posted Nov 20, 2014 21:48 UTC (Thu) by quotemstr (subscriber, #45331) [Link] (2 responses)

Firefox manages to render plenty of legacy websites that were not designed to be resolution-independent and scale them by fractional values. *That's* the point.

Oversampling, or scaling down

Posted Nov 20, 2014 22:16 UTC (Thu) by dlang (guest, #313) [Link]

HTML is designed to be resolution-independent.

A lot of web designers have been working hard ever since to undermine this design, but they have had limited success.

Oversampling, or scaling down

Posted Nov 21, 2014 11:26 UTC (Fri) by epa (subscriber, #39769) [Link]

An HTML document is not a computer program and does not contain code for scaling glyphs or icons to a particular size. Firefox is, and contains well-written code for those tasks. You may as well say that plenty of legacy Microsoft Word documents can be scaled by fractional values when viewed in Microsoft Word.

Grab a PC running Windows 7 and set it to 150% font scaling, then try a random assortment of third party software written before 2010. You will see the mess I am talking about - and I assume the situation with older Mac OS X programs is the same.

It is great that Firefox scales well and I quite agree it proves that scaling by an arbitrary factor is *not difficult*. That has very little bearing on whether existing programs, which exist in binary only form and cannot be modified, implement non-integer scaling reasonably. Even if 90% do so, ten per cent do not.

High-DPI displays and Linux

Posted Nov 12, 2014 15:50 UTC (Wed) by tsmithe (guest, #57598) [Link]

Would have been nice to see a little more assessment of other environments. I believe a lot of progress in Ubuntu and Unity has been made in the HiDPI direction.

High-DPI displays and Linux

Posted Nov 12, 2014 16:23 UTC (Wed) by proski (subscriber, #104) [Link] (5 responses)

Projectors were mentioned once in the story, which brings up an interesting question: what is the DPI of a projector? Technically, it is very high at the lens and very low on the screen. In practice, it should be something in between, and it should vary depending on the audience and the quality of the projector, the screen, and the lighting.

Also, people with vision problems would probably want to see bigger fonts and pictures.

DPI is great for applications like on-screen rulers, but in most cases, scaling should be variable based on the user needs. DPI could provide a good default setting for newly connected hardware.

High-DPI displays and Linux

Posted Nov 12, 2014 17:54 UTC (Wed) by raven667 (subscriber, #5198) [Link]

> What is the DPI of a projector?

You can't get an exact answer on current equipment AFAIK; it is entirely dependent on how far you are projecting, which determines the size of the resulting image. You should be able to guess an approximate range, though, based on how you expect a particular model to be used. To get a more specific answer you'd need a laser rangefinder or autofocus in the projector and send the calculated display geometry back to the computer; maybe you could even get away with instrumenting a manual focus.

High-DPI displays and Linux

Posted Nov 13, 2014 3:55 UTC (Thu) by flussence (guest, #85566) [Link] (3 responses)

Very good point, and a good reason why we shouldn't call it "DPI" at all, IMO. What we ought to be describing to the software is some number for perceived image clarity - the size of an individual pixel matters to that just as much as the user's eyesight or viewing conditions.

FWIW, I like E17's approach to this: it just presents a grid of scaling levels on first run, normalised around 1.0, and lets the user pick whatever's most comfortable. You have to go out of your way to ask for the DPI-based mode.

High-DPI displays and Linux

Posted Nov 13, 2014 13:19 UTC (Thu) by SLi (subscriber, #53131) [Link] (2 responses)

I think it should be called DPI, but it should be recognized that DPI is not the quantity we're ultimately interested in. Instead what is more interesting is the portion of the human field of view occupied.

For example, a comfortable human field of view is maybe around 155° horizontally and 135° vertically. If you fill this view with a line of 155 characters, each character has horizontal size of about 1°, and this is independent of whether you are talking about a monitor, a phone or a projector.

High-DPI displays and Linux

Posted Nov 13, 2014 18:13 UTC (Thu) by zev (subscriber, #88455) [Link] (1 responses)

Yes, dots-per-degree does seem like the more relevant unit -- though this also has the complication that most of the display devices we deal with are flat, not spheres centered on the viewer, so the number of field-of-view degrees per display-surface inch is different at the center of the display than at the edges. (Though I guess in hypothetical practice it probably wouldn't be enough difference to matter much, at least in "normal" cases.)

High-DPI displays and Linux

Posted Nov 13, 2014 19:28 UTC (Thu) by alexl (subscriber, #19068) [Link]

Nobody wants to specify actual measurements in their apps in units like "dots per degree", so instead what you use is something of the same kind of measure, but with a different scale. It's called a "reference pixel":

http://www.w3.org/TR/css3-values/#reference-pixel

> The reference pixel is the visual angle of one pixel on a device with a pixel density of 96dpi and a distance from the reader of an arm's length. For a nominal arm's length of 28 inches, the visual angle is therefore about 0.0213 degrees. For reading at arm's length, 1px thus corresponds to about 0.26 mm (1/96 inch).

This is essentially what GNOME uses, following the recommendation at that link:

> For lower-resolution devices, and devices with unusual viewing distances, it is recommended instead that the anchor unit be the pixel unit. For such devices it is recommended that the pixel unit refer to the whole number of device pixels that best approximates the reference pixel.

High-DPI displays and Linux

Posted Nov 12, 2014 16:24 UTC (Wed) by drago01 (subscriber, #50715) [Link]

> That real solution, it seems, almost has to involve divorcing applications from the individual pixels on the screen. Once applications are working in pixel-independent units almost all of the time (there may need to be exceptions for image editing, video playback, etc.), the underlying environment can work on rendering in a visually similar way on all possible screens.

That isn't really any different from "pixel scaling" ... whether the unit exposed to applications is cm, inches or frogs does not matter. We could as well just call it "pixels" and leave the actual rendering to the compositor which draws things on screen. On X11 the compositor only controls the output, i.e. what gets drawn on screen; input events get delivered directly to the applications, which means the scaling cannot be hidden from the applications. On Wayland the compositor controls both input and output and therefore can hide the scaling completely from the applications.

High-DPI displays and Linux

Posted Nov 12, 2014 16:31 UTC (Wed) by daniels (subscriber, #16193) [Link] (12 responses)

For what it's worth, here's what Wayland does when high-DPI is active:
- applies downscaling to 'pixels', such that one pixel as seen by applications is really a n x n grid of pixels on-screen
- reports to each client that a particular output has a scaling factor
- upscales surfaces by the scaling factor by default, allowing naïve/simple/old clients to work by keeping them the same size as the screen would be ordinarily
- allows clients to indicate that their surface is pre-scaled, so more complex clients (anything using a real toolkit) can render at the full native resolution, and not require a scaling pass during composition
- has sub-pixel precision on input events, so no loss for either scaled or un-scaled clients

This is handled per-output, even having surfaces split across low- and high-DPI displays will result in them being shown at (roughly) the same physical size, rather than doubled or whatever.

High-DPI displays and Linux

Posted Nov 12, 2014 17:01 UTC (Wed) by smurf (subscriber, #17840) [Link] (9 responses)

Can Wayland do n x m grids?

It's a bit unfortunate that this still limits us to integer scaling factors between pixels and what-should-be-points, but I suppose that can't be helped as long as we still use low-DPI-only applications.

High-DPI displays and Linux

Posted Nov 12, 2014 17:17 UTC (Wed) by mpr22 (subscriber, #60784) [Link]

Oblong pixels are an abomination unto Nuggan :)

High-DPI displays and Linux

Posted Nov 12, 2014 18:43 UTC (Wed) by daniels (subscriber, #16193) [Link] (7 responses)

No, only integer factors which apply equally to both horizontal and vertical scaling. We had a long and tortured discussion about greater precision for scaling factors, and eventually decided to limit it to only integer scales, due to the substantial pain it would introduce otherwise. Others, e.g. OS X, have done the exact same thing, so it's good to know we've got company.

High-DPI displays and Linux

Posted Nov 12, 2014 19:45 UTC (Wed) by juliank (guest, #45896) [Link]

Needless to say, Chrome OS does fractional scaling (by default on some Chromebooks, I heard), although it is a bit broken right now. This makes a 13-inch laptop with a 1920x1080 display much more usable.

It does not handle arbitrary fractional factors well, though; only 1.25 scaling is done right (crisp fonts). Other fractional scaling factors do not use any special mode and just scale up the rendered image, so fonts get blurry.

High-DPI displays and Linux

Posted Nov 13, 2014 3:23 UTC (Thu) by roc (subscriber, #30627) [Link] (2 responses)

That's a shame. Firefox, at least, is perfectly capable of rendering with an arbitrary scale.

High-DPI displays and Linux

Posted Nov 13, 2014 10:30 UTC (Thu) by daniels (subscriber, #16193) [Link] (1 responses)

In a slightly roundabout way, nothing actually precludes that.

Clients are fully in control of their surface size (x1 x y1), to which a scaling factor (n) is attached. So, if you want to render at full native resolution, then you attach a buffer of (nx1 x ny1).

Nothing is to say that you have to render at exactly scale n. If you wish to use your own scaling factor, then you can do just that: you have that full buffer size to paint _whatever you want_ into. So you're free to internally use a scaling factor of, say, 1.5 instead of 2.

As long as you apply the same scale to your input events (which have 8 bits of sub-pixel precision), then you can properly map that to whatever you've rendered.

High-DPI displays and Linux

Posted Nov 13, 2014 10:55 UTC (Thu) by roc (subscriber, #30627) [Link]

I don't see any reason to render other than "full native resolution".

But it does seem like there are screens where the ideal scaling is somewhere between 1 and 2.

High-DPI displays and Linux

Posted Nov 15, 2014 19:57 UTC (Sat) by javispedro (guest, #83660) [Link]

Windows does fractional scaling. To be honest it has the best scaling approach of any current OS implementation, as long as you ignore the baggage of Win32 applications....

High-DPI displays and Linux

Posted Nov 18, 2014 5:19 UTC (Tue) by quotemstr (subscriber, #45331) [Link] (1 responses)

> No, only integer factors which apply equally to both horizontal and vertical scaling.

Sorry, but that decision makes Wayland useless for me. Then again, I'll probably end up using Mir instead; I hope Mir's authors are more cognizant of actual user needs.

On my laptop, a Lenovo Carbon X1 2014 model, 1x is too small and 2x is too big. A scaling factor of 1.5 is perfect.

I read the mailing list thread. The argument about imprecise scaled graphics is bogus: Firefox manages to scale by non-integer factors *just fine*.

High-DPI displays and Linux

Posted Nov 18, 2014 9:08 UTC (Tue) by daniels (subscriber, #16193) [Link]

Presumably not with any GTK+ apps, as it only supports integer factors.

Anyway, like I said, application awareness can give you arbitrary factors anyway, as the whole point of the interface is to be able to opt out of scaling. At which point you have a full-size buffer.

High-DPI displays and Linux

Posted Nov 12, 2014 19:32 UTC (Wed) by dlang (guest, #313) [Link] (1 responses)

so this just throws away the extra resolution of the screen rather than scaling the text/icons to take advantage of them.

This is a useful stopgap, but not the right long-term solution.

High-DPI displays and Linux

Posted Nov 12, 2014 19:37 UTC (Wed) by daniels (subscriber, #16193) [Link]

Er, huh?

Let's say you have a 2560x1440 display at some ludicrously high DPI. Yes, we lie to apps and tell them that it's 1280x720. But we _also_ tell them that it's DPI-doubled, so _if they want_, they can render at the full resolution (2560x1440), and have that displayed, pixel-for-pixel, on screen. It's only the naïve apps that get scaled, so they don't have to explicitly have code to double every single part of their UI.

So it's not the perfect literally-resolution-independent utopia, but given that's never existed in practical form, I think I'll settle for the current model of allowing smart clients to not waste a single pixel, but not breaking others in the process.

High-DPI displays and Linux

Posted Nov 12, 2014 16:56 UTC (Wed) by xbobx (subscriber, #51363) [Link] (6 responses)

> The word is that this behavior is the result of limitations built deeply into the X Window System which, after all, was developed when 75dpi was a high-DPI screen.

I don't think that's quite accurate. The core X Windows protocol is actually much better about being DPI-agnostic than many other window systems, and many applications and toolkits do a decent job at scaling elements. Each X screen has an independent DPI, so you can simultaneously run two instances of the same application on separate X screens at different DPIs.
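A quick sketch of what that looks like in practice, assuming an X server configured with two screens:

    # Each instance picks up the DPI of the screen it is displayed on.
    DISPLAY=:0.0 xterm &
    DISPLAY=:0.1 xterm &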

However, the years since the core protocol was created haven't been so favorable in this regard [1]. The biggest problem is that very few people use multiple X screens for multiple monitors. The most obvious reason for that is, admittedly, a protocol limitation: the number of X screens cannot change for the life of the display connection. So dynamically adding and removing monitors has generally been implemented within the confines of a single X screen; this means that DPI cannot differ between those monitors. This _could_ be solved by restarting your X server with a different configuration that exposes an X screen for each different monitor; this may be acceptable for the desktop where monitor configuration is fairly static, but isn't really feasible when plugging an external monitor or projector into a laptop. Windows also cannot be dragged from one X screen to another, which may annoy some users. However, dragging a window from a screen with one DPI to a screen with another DPI wouldn't really work in practice anyway; when the window is straddling the multiple screens, how should it render? Besides, there is no "DPI change notification event" to tell the app to re-render or resize itself.

[1] One possible reason is that the popularity of MS Windows largely drove available hardware features starting in the 1990s, and because MS Windows didn't deal with different DPIs very well, most hardware converged on a single DPI, or close to it.

High-DPI displays and Linux

Posted Nov 12, 2014 17:15 UTC (Wed) by epa (subscriber, #39769) [Link]

Apparently the new iMac (with a 200dpi-ish display) does manage to work with lower-dpi external monitors and somehow redraw application windows at the appropriate size. I don't know how it does so; hardware scaling them by 0.5 would be the easiest way.

High-DPI displays and Linux

Posted Nov 13, 2014 3:24 UTC (Thu) by roc (subscriber, #30627) [Link] (4 responses)

Macs support moving windows across screens with different DPI. And Firefox supports that.

High-DPI displays and Linux

Posted Nov 13, 2014 6:16 UTC (Thu) by glandium (guest, #46059) [Link] (3 responses)

Windows supports that as well, but it's funky when a window is not entirely on one of the screens, in which case it uses the DPI of the screen where the window has the most surface area. So while moving, the window size changes on both screens at some point. What do Macs do?

High-DPI displays and Linux

Posted Nov 14, 2014 10:06 UTC (Fri) by roc (subscriber, #30627) [Link] (2 responses)

Mac window coordinates use a DPI-independent unit, so windows don't change size as you move them across screens with different DPI. The backing store for a window is resized when the majority of the window moves to a different screen.

High-DPI displays and Linux

Posted Nov 15, 2014 20:01 UTC (Sat) by javispedro (guest, #83660) [Link] (1 responses)

The same as Windows, except that OS X will refuse to render a window that is half-way between two monitors (unless it's currently being moved, in which case it just uses raster scaling). It will hide the window in every monitor except the one where most of its area is.

High-DPI displays and Linux

Posted Nov 16, 2014 12:35 UTC (Sun) by bronson (subscriber, #4806) [Link]

Not necessarily... Turn off "Displays have separate spaces" in the Mission Control preferences to have the traditional window-everywhere behavior.

High-DPI displays and Linux

Posted Nov 12, 2014 18:02 UTC (Wed) by b7j0c (guest, #27559) [Link]

jonathan -

1. what distro did you use?

2. as a subscriber to lwn, i am fine with you buying yourself a spiffy new laptop. no need to justify it!

High-DPI displays and Linux

Posted Nov 12, 2014 19:16 UTC (Wed) by josh (subscriber, #17465) [Link] (8 responses)

I have the same X1 Carbon. I found it much more satisfying to disable GNOME's autodetected 2x scaling (via the dconf setting mentioned in the article), and instead set *font* size scaling to 1.5x, as well as setting Firefox's devPixelsPerPx to 1.5. Doing so makes the 2560x1440 screen show the same amount of content on screen as a 1920x1080 screen would at 1x, rather than the 1280x720-equivalent that 2x would show.
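A sketch of that configuration from the command line, assuming the relevant keys are scaling-factor and text-scaling-factor:

    # Disable the autodetected 2x pixel scaling...
    gsettings set org.gnome.desktop.interface scaling-factor 1
    # ...and scale fonts alone by 1.5 instead.
    gsettings set org.gnome.desktop.interface text-scaling-factor 1.5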

High-DPI displays and Linux

Posted Nov 13, 2014 4:36 UTC (Thu) by ncm (guest, #165) [Link]

See, this is why I always take time to read the comments on LWN.

High-DPI displays and Linux

Posted Nov 13, 2014 11:52 UTC (Thu) by epa (subscriber, #39769) [Link] (1 responses)

If you set font size scaling to 1.5, how well is that handled by older X clients such as xterm?

High-DPI displays and Linux

Posted Nov 13, 2014 17:17 UTC (Thu) by josh (subscriber, #17465) [Link]

100% of what I run respects that setting, other than Firefox, which has its own setting.

xterm does not, but I don't run xterm. If you want apps like xterm to automatically scale, you'd have to set the X server DPI.
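A sketch of that combination, with numbers picked for the X1 Carbon's roughly 210dpi panel; this assumes nothing else forces the DPI back to 96, and that xterm is using a scalable Xft face so the size follows the DPI:

    # Tell the server the real pixel density, then start xterm with a
    # scalable face whose on-screen size is derived from that DPI.
    xrandr --dpi 210
    xterm -fa 'DejaVu Sans Mono' -fs 10 &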

High-DPI displays and Linux

Posted Nov 15, 2014 20:04 UTC (Sat) by javispedro (guest, #83660) [Link] (3 responses)

I agree. It's absurd that Gtk+3/Gnome3 has decided on integer scaling ratios. Certainly OS X influence...

In fact, the above font DPI-only trick also works with Gtk+2 programs. The icons look smaller, though. It looks much more reasonable on my Surface Pro than GNOME's/OS X's idea of scaling everything to 2x and then back to 1.5x, which makes text look very fuzzy.

High-DPI displays and Linux

Posted Nov 15, 2014 20:15 UTC (Sat) by josh (subscriber, #17465) [Link] (2 responses)

> In fact, the above font DPI-only trick also works with Gtk+2 programs.

Works fine here with GTK+ 3 programs.

High-DPI displays and Linux

Posted Nov 18, 2014 0:09 UTC (Tue) by javispedro (guest, #83660) [Link] (1 responses)

Ah yes, I meant it works with Gtk+2 programs too.

High-DPI displays and Linux

Posted Nov 18, 2014 0:19 UTC (Tue) by josh (subscriber, #17465) [Link]

Sorry, misread your comment. :)

High-DPI displays and Linux

Posted Nov 17, 2014 9:25 UTC (Mon) by bernat (subscriber, #51658) [Link]

I am also doing that. Most applications render correctly when Xft.dpi is set to a sensible value (Chromium being an exception). This can be done either the old way (echo Xft.dpi: 144 | xrdb -merge) or the new way, through XSETTINGS.

The old way works without any additional daemon but is not dynamic. The application will read the settings from xrdb at start (or sometimes when told to create a new window, as is done for Emacs).

The new way allows applications to be notified when a change happens. It requires an XSETTINGS-compatible daemon, like xsettingsd (or gnome-preference-daemon). The correct DPI setting should be multiplied by 1024. XSETTINGS can also be per screen (with xsettingsd, this requires running two instances).
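To make the multiply-by-1024 rule concrete, a minimal xsettingsd configuration for 144dpi (assuming the default ~/.xsettingsd location) would contain just:

    # 144 dpi * 1024 = 147456
    Xft/DPI 147456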

I am using xrandr to set DPI settings correctly and propagate the result to xsettingsd with this ugly little script:
https://github.com/vincentbernat/awesome-configuration/bl...

Except for Chromium, everything scales automatically: Emacs (the GTK version; otherwise it's not dynamic), GTK2 and GTK3 apps, libvte-based terminals, Qt apps.

For Chromium, I just set the zoom setting to 150%. Work is in progress in Chromium to fix that correctly. Currently, there is a flag to compile in HiDPI support, but DPI is computed from the screen size instead of using the DPI set through XSETTINGS like other apps do. Usually, this makes the Chromium interface too big. And many widgets are broken. This is known and currently being fixed.

High-DPI displays and Linux

Posted Nov 13, 2014 4:19 UTC (Thu) by ldo (guest, #40946) [Link]

It is surprising to hear of Google having trouble with this, considering they developed a Linux-based platform, Android, that has such a sophisticated system built-in for coping with different DPI settings.

Android layouts are measured using a unit called “dp” or “dip” (device-independent pixels). This is nominally 1/160 inch, and fractional parts are allowed. Text should be measured with a separate unit, “sp”, which is equivalent to “dp” scaled by a user-specifiable text-size factor. This allows users to choose larger text for easier readability on a systemwide basis, rather than every app having to implement this option.

High-DPI displays and Linux

Posted Nov 13, 2014 10:39 UTC (Thu) by roskegg (subscriber, #105) [Link] (5 responses)

This is horrible. This was solved almost 30 years ago with "Display Postscript".

After all this systemd fuss, I've been seriously considering the following:

A new OS using the Linux kernel, because of its driver support. If I get too pissed off I'll try the OpenBSD kernel instead. There is precedent; Plan 9 was based on 2.9BSD, rather than SVR4.

However, the base system on top is based on

a) Plan 9 and Inferno

For the system layer, plumbing, init, basic OS interactions, and foundational UI via Rio

b) BeOS

Filesystem metadata/filesystem as database
Plus the architectural things that make for a speedy, smooth UI with glitch-free audio

c) OpenStep (aka NeXTSTEP)

Display PostScript (perhaps modernized to Display SVG)

NFS and X11 to go right in the compost heap.

High-DPI displays and Linux

Posted Nov 13, 2014 14:22 UTC (Thu) by giggls (subscriber, #48434) [Link] (3 responses)

What's your problem with NFS?

NFS4 is a decent remote filesystem, not "No File Security" anymore.

The only thing I would like to have in the Linux implementation would be a shared-key setup for environments with a couple of hosts, where a full-fledged Kerberos setup would be overkill.

Sven

High-DPI displays and Linux

Posted Nov 13, 2014 22:21 UTC (Thu) by roskegg (subscriber, #105) [Link] (2 responses)

9P2000 supersedes NFS.

High-DPI displays and Linux

Posted Nov 14, 2014 9:11 UTC (Fri) by giggls (subscriber, #48434) [Link] (1 responses)

Up till now, I have been using 9p for sharing drives on kvm hosts only.

Will 9p provide a decent solution for centralized home directories without the security nightmare of NFS3?

High-DPI displays and Linux

Posted Nov 15, 2014 16:25 UTC (Sat) by lsl (subscriber, #86508) [Link]

It could. I don't think the Linux kernel implementation of 9P (v9fs) really supports it, though.

One can certainly build nice things on top of 9P auth. See this paper on what is done on Plan 9:
http://plan9.bell-labs.com/sys/doc/auth.pdf

Think kerberized Unix services, but a thousand times simpler and actually unified on the system level.

High-DPI displays and Linux

Posted Nov 14, 2014 1:53 UTC (Fri) by lsl (subscriber, #86508) [Link]

> Plan 9 was based on 2.9BSD, rather than SVR4.

What? Plan 9 isn't based on any version of Unix. Some of its user space programs originated in late Research Unix, like the shell, rc(1), and the build tool mk(1). So did some other ideas, most likely.

I don't think there's a basis for any statement such as the above, though. Where did you get this from?

High-DPI displays and Linux

Posted Nov 14, 2014 9:20 UTC (Fri) by jreznik (guest, #61949) [Link] (1 responses)

I got an X1 Carbon this week from my colleague, who was very unhappy about it: HiDPI, the Esc placement, the missing Insert key. I don't mind Esc, but Insert... On the other hand, I replaced my old T520 with the X1 as my use case changed a lot, from the T520 being a powerful workstation for a developer to a meeting laptop. I ordered a OneLink dock, so when docked I'll be using an external keyboard (and thus Esc and Insert), and for meetings I don't mind too much. Etherpad is my only friend there :).

So far I have had the best results with Plasma 5; still, it needs quite a lot of tweaking, and not everything there is DPI-independent. The GNOME approach of scaling by a factor of 2 is a no-go. It would work for really high-resolution displays, but the X1's resolution is not that high, and it gives you a very small screen with everything being so huge. A single head is solvable for a lot of apps. Somehow. Not perfect, but it works.

The bigger issue is connecting a second, non-HiDPI LCD. In the end, I was able to find a compromise font DPI setting that makes it somewhat usable on both displays simultaneously, but... On the Carbon it's a bit too small, on the external LCD it's a bit too big. I can live with it, for now. Xrandr scaling is unusable - too blurry. Firefox can be solved by https://addons.mozilla.org/en-US/firefox/addon/autohidpi/

High-DPI displays and Linux

Posted Nov 21, 2014 17:57 UTC (Fri) by josh (subscriber, #17465) [Link]

While it isn't overly convenient, you can actually press Insert on the X1 Carbon keyboard: Fn+I. You can also press Fn+S for SysRq, and Fn+T for Print Screen. (Documented in the user manual, along with several more.)

Personally, though, I only have one application that actually uses the Insert key: GCCG.

High-DPI displays and Linux

Posted Nov 21, 2014 17:38 UTC (Fri) by hamasaki (guest, #99927) [Link] (2 responses)

"Making applications work properly in an environment where displays can have widely varying densities — even on the same system — is not an easy problem to solve."

It's really not. The only challenge is agreeing on a system, which unfortunately is perhaps the area the free software community has the most trouble with.

"A fast rate of progress is arguably not surprising; after all, desktop developers hardly seem like a crowd that would be resistant to the allure of a beautiful new screen. So we can probably count on those developers to fix up the remaining problems in relatively short order."

Desktop developers are also the type who often stick to one system, and apparently we have at least 4 different approaches to solving this problem so far. The easy fixes have already been done. I'm not optimistic that the situation is going to significantly improve any time soon. It's not like the KDE developers are going to wake up and switch to Gnome's scheme when they have some free time next week. This isn't a technical problem that can be solved by a couple hours of debugging.

Apple has solved this problem because they can just declare the entire system by fiat. Microsoft is close behind, because they control most of the software and a little of the hardware. Free software does great when there's a BDFL like Linus, Guido, Matz, or Larry, but the desktop GUI has no such person. The closest we have is Ubuntu, and they're the only ones I think have any shot at fixing this in the next 5-10 years -- and they've chosen a 5th option, "Invent a new display server which is neither X11 or Wayland".

Opera for Linux has HiDPI support on Linux

Posted Dec 17, 2014 19:15 UTC (Wed) by Velmont (guest, #46433) [Link] (1 responses)

Since I saw you mention Firefox and Chromium, I thought it'd be good to mention that Chromium-engine based Opera has HiDPI support on Linux.

Opera for Linux just came out as stable (not only beta and developer stream): http://opera.com/

[I work at Opera]

Opera for Linux has HiDPI support on Linux

Posted Dec 17, 2014 19:19 UTC (Wed) by Velmont (guest, #46433) [Link]

Oh, that's embarrassing. It's been so long since I commented on LWN that I managed to reply to a comment instead of the article as I thought I was doing. *shrug*

"native resolution" of photos from digital cameras ?

Posted Dec 22, 2014 20:48 UTC (Mon) by lmartelli (subscriber, #11755) [Link] (4 responses)

"Photos from digital cameras can be displayed at something resembling their native resolution"

Pardon me, but what is the "native resolution" of photos from digital cameras supposed to be? Resolution, by definition, measures pixels or dots per unit of distance. But photos from a digital camera have no physical dimensions, so they can't have a native resolution. Unless you are thinking of the resolution of the sensor, but since sensors are usually no more than a few centimeters across, their resolution is much higher than even the highest-DPI display that I know of.

"native resolution" of photos from digital cameras ?

Posted Dec 23, 2014 19:16 UTC (Tue) by bfields (subscriber, #19510) [Link] (3 responses)

"Resolution, by definition, measures pixels or dots by a unit of distance."

It's also commonly used for total size in pixels. (E.g., "the resolution of my laptop's monitor is 1280x800").

So "native resolution" here means "full size in pixels".

Rail against loose use of language if you want, but I think that sort of usage is too common to exclude it as a definition of "resolution". And in this case there's no ambiguity (since as you point out a digital photo has no inherent physical dimensions).

"native resolution" of photos from digital cameras ?

Posted Dec 23, 2014 19:29 UTC (Tue) by dlang (guest, #313) [Link] (1 responses)

Given that the most pixels you can get on a screen is ~8MP (for a 4K screen), you still aren't going to be showing your digital pictures at 1:1 zoom for most modern cameras.

"native resolution" of photos from digital cameras ?

Posted May 19, 2015 20:36 UTC (Tue) by sethml (guest, #8471) [Link]

My 5K Retina iMac shows 14.7MP. My Nikon D300 shoots 12.3MP. It's pretty cool seeing *all* the pixels at once. My more modern Nikon D750 shoots 24.3MP, so I don't get to see all of them, but seeing more than half is still pretty awesome.

The world really needs >4K display support better standardized/supported so you don't have to buy Apple's hardware to get it...

"native resolution" of photos from digital cameras ?

Posted Dec 23, 2014 21:11 UTC (Tue) by rleigh (guest, #14622) [Link]

This is, strictly speaking, not correct. Like many terms (SI unit prefixes...), resolution is largely used incorrectly in the domain of computing, without an understanding of what it really means. It's a term which has been taken and used out of context. Resolution is a physical measure of the smallest object an imaging system can *resolve*. If two small objects next to each other are not distinguishable as separate objects (they are one blob), then the system has not *resolved* them. In the case of microscopy and cameras, the resolution (resolving power) is defined by the optics of the system (lens numerical aperture, plus any further diffraction and aberration). This is *independent* of the detector (CCD/PMT/film), but the detector will have to sample at at least twice the bandwidth to satisfy the Nyquist-Shannon sampling criteria, so for a correctly set up system your detector will be sampling at twice the optical resolution in x and y. [For cheap cameras with massive CCD sizes, the optics are so poor you end up sampling pointlessly at many times the Nyquist limit; turn on 2x2 or higher binning to get smaller, less noisy, and higher quality images. Compare with an SLR with better optics and a smaller [pixel] size but higher quality CCD!]

http://en.wikipedia.org/wiki/Angular_resolution
http://www.svi.nl/NyquistRate

Resolution is not, and never has been, a *size* measure as used by CCDs and monitors. I know it's common practice in computing, but it's wrong nonetheless. You can measure the well/dot pitch (i.e. distance between pixels), which would be better, but strictly speaking that's not really a measure of resolution either (in this context) since it's a property of an optical system and not of the detector/emitter of a light signal such as a CCD or monitor.

Regards,
Roger


Copyright © 2014, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds