High-DPI displays and Linux
There are certainly a lot of advantages to high-DPI displays (generally considered to be those with at least 200 pixels per linear inch). Text is rendered in an especially clear and readable fashion. Photos from digital cameras can be displayed at something resembling their native resolution. And, for those of us with sufficiently sharp eyes, a high-DPI display allows more work to be crammed onto a screen, increasing productivity. But, if the desktop environment and associated applications are not aware that a high-DPI screen is in use, the result can be unreadable text and unusable, microscopic screen elements.
High-DPI display support
So how have the Linux desktop environments responded to high-DPI displays? There appear to be two different basic approaches. The one used by GNOME and some web browsers is to hide the display resolution behind pixel scaling; in essence, the environment lies to applications, then turns each application pixel into a 2x2 array of screen pixels. It's a relatively quick solution, but, as we'll see, it leads to some unfortunate behaviors.
The alternative is to have applications deal with display-independent units (points, centimeters, inches, whatever) and hide the scaling required. KDE appears to have taken this approach. It seems to work well enough as long as applications play along, but here, too, the problem is not entirely solved.
Your editor's experiments have mostly been with GNOME, so this article will have an unavoidable bias in that direction. In GNOME, pixel scaling is managed with the inevitable gsettings registry value; to turn on 2-to-1 scaling, a command like this can be used:
gsettings set org.gnome.desktop.interface scaling-factor 2
Setting the scaling factor back to one turns off scaling. Scaling should be enabled by default if GNOME detects a high-DPI display; otherwise a command like the above can be used. This value can also be adjusted in gnome-tweak-tool. Should you run into bugs or weirdness (like only a quarter of the screen being used), setting the scaling factor to one then back to two will often straighten them out; your editor has ended up creating a fixscaling script for just this eventuality.
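The script itself does not appear in the article, but a minimal sketch — assuming the 2-to-1 factor discussed above — need only toggle the same knob:

#! /bin/sh
# fixscaling: work around scaling glitches by turning pixel
# scaling off and then back on again.
gsettings set org.gnome.desktop.interface scaling-factor 1
sleep 1
gsettings set org.gnome.desktop.interface scaling-factor 2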
Firefox implements a similar sort of pixel scaling, but it doesn't seem to figure out when to use it on its own. The solution is to go into about:config and set the layout.css.devPixelsPerPx knob to a value like two. Alternatively, one can make things work by configuring the use of 30-point fonts, but that will make a mess of a lot of web sites.
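The same preference can also be made persistent in the profile's user.js file; note that, somewhat oddly, it takes a string value. The "2" here is just the value suggested above:

user_pref("layout.css.devPixelsPerPx", "2");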
In the KDE world, there seem to be a couple of parameters to set if the desktop doesn't figure things out on its own. The "application appearance" dialog has an option to set the dots-per-inch resolution used for font rendering; setting that to a value close to your screen's pixel density (or lower; 150 seems to be a recommended value) will make text readable. The size of icons is also nicely configurable in KDE; its use of SVG icons helps to make this option work well. Obviously, one will want to make icons larger on high-DPI displays.
Difficulties
So it seems that some support for high-DPI displays is in place for Linux. But there are still a lot of nagging little details — some perhaps not so little — that pop up when using a system with such a display.
For example, what happens when one plugs an external monitor (or projector) into such a system? That display almost certainly is not also high-DPI. The techniques used to make the high-DPI screen usable will have the effect of turning the lower-DPI screen into an extra-low-DPI screen. In your editor's case, this problem has made the use of external monitors on the new laptop nearly impossible. The word is that this behavior is the result of limitations built deeply into the X Window System which, after all, was developed when 75dpi counted as a high-DPI screen. Things will be much better under Wayland, we are told, but there doesn't appear to be a lot of relief in store before then.
Pixel scaling can also cause a lot of confusion with applications that are not prepared for it. Here is an old version of gthumb (the version shipped with Fedora 20) with scaling in effect:
With scaling turned off, the result is rather more satisfying:
(It should be noted that gthumb 3.14.x handles high-DPI displays in a much better manner. Your editor will reserve comment on some of the other user interface changes made during that time, though.)
Over time your editor has found a few applications that do better with pixel scaling turned off. Happily, there is an easy way to do that on a per-application basis: simply set the GDK_SCALE environment variable to the desired scaling value.
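For instance, to run the gthumb version shown above without scaling (the right value naturally depends on the application):

GDK_SCALE=1 gthumb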
Another problem with pixel scaling is that it can cause even the high-DPI screen to appear to have a low pixel density. As a result, for example, gnome-terminal believes it's running with 7-point fonts, something that would normally be illegibly small. The sizing is confusing, and it also makes it hard to get the size exactly right: seven points is too large, but a 6-point font, being quite a bit smaller, is illegible. All of this is somewhat unfortunate, since points are supposed to be real-world physical dimensions, but that connection has been lost for now.
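To make the arithmetic concrete (the panel density here is an assumed figure): if the environment claims the conventional 96dpi, a 7-point font works out to 7/72 × 96 ≈ 9.3 application pixels; pixel scaling doubles that to about 18.7 physical pixels which, on a panel of roughly 200 pixels per inch, spans about 6.7 real-world points rather than seven.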
There is no shortage of web sites that come through badly on high-DPI displays — including this one in many cases. User-interface elements can be so small as to be unusable, and images are reduced down to postage stamps. Xkcd becomes unreadable in such a setting. Turning on pixel scaling in Firefox makes a lot of these problems go away at the cost of losing a fair amount of screen space to the expanded icon bars at the top — and to larger advertisements. The Chromium browser, surprisingly, appears to have no high-DPI awareness at all at the moment; the sad result is that even Google sites can look terrible there.
In general, any web site that is trying to manage its page layout using pixel sizes is going to run into trouble on high-DPI displays. Fixing that problem will take a long time, and involves some interesting tradeoffs. The tiny-image problem can be fixed by shipping higher-resolution images and getting the browser to scale them to the desired size, but that can increase bandwidth consumption considerably. The alternative — sensing the screen resolution and shipping appropriately sized images — is a lot more work to support.
Some closing thoughts
Naturally, the new machine came with Windows preinstalled, so your editor had the chance to see how that system copes with the high-DPI display. The situation there is perhaps a bit better, but Windows has not fully solved the problem either. The basic system user interface works well, but web pages can run into trouble there, and some applications, even popular, high-profile applications, have not yet made the adjustment.
Making applications work properly in an environment where displays can have widely varying densities — even on the same system — is not an easy problem to solve. The good news is that we are much of the way there; the bad news is that there is still a lot to be done, and, in some cases, a change of approach may be needed. In particular, approaches based on pixel scaling have all the appearance of being a short-term kludge that gets things working while a real solution is being worked on.
That real solution, it seems, almost has to involve divorcing applications from the individual pixels on the screen. Once applications are working in pixel-independent units almost all of the time (there may need to be exceptions for image editing, video playback, etc.), the underlying environment can work on rendering in a visually similar way on all possible screens.
Until then, it is good to see that developers are working on making high-DPI screens work better with Linux. A fast rate of progress is arguably not surprising; after all, desktop developers hardly seem like a crowd that would be resistant to the allure of a beautiful new screen. So we can probably count on those developers to fix up the remaining problems in relatively short order.
Posted Nov 12, 2014 15:31 UTC (Wed)
by epa (subscriber, #39769)
[Link] (27 responses)
You can also set this up on Windows with some Nvidia cards; again you render to a higher resolution and then the card scales it down for the display. This can be used by obsessive gamers to get the highest possible video quality (a bigger image scaled down will have, in effect, the best possible anti-aliasing) but could also be useful when your monitors have a mixture of dots-per-inch and you want things to appear roughly the same size on each.
Is this kind of oversampling supported on Linux? Could the editor's laptop 2560x1440 display be driven as a virtual 3840x2160 framebuffer and then scaled by 2/3 in hardware?
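Something along these lines can at least be approximated from user space with xrandr; a sketch, assuming the panel shows up as eDP1 and leaving aside how nicely the hardware filters the result:

# drive a 2560x1440 panel through a 3840x2160 framebuffer, scaled back down by 2/3
xrandr --output eDP1 --scale 1.5x1.5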
Posted Nov 12, 2014 15:47 UTC (Wed)
by juhah (subscriber, #32930)
[Link] (6 responses)
Posted Nov 12, 2014 17:41 UTC (Wed)
by grunch (subscriber, #16603)
[Link]
I had previously used the GNOME tweak tool to change the scaling factor to 2, but gave up too quickly when I encountered the huge magnification issue. And I hadn't even explored the use of xrandr to set the screen scale for the laptop display. Thanks for the enlightening article and comments: life's getting back to "normal" for me with this device!
Posted Nov 13, 2014 8:50 UTC (Thu)
by jengelh (guest, #33263)
[Link] (2 responses)
Posted Nov 13, 2014 12:41 UTC (Thu)
by tcourbon (guest, #60669)
[Link] (1 responses)
Posted Nov 15, 2014 18:03 UTC (Sat)
by jospoortvliet (guest, #33164)
[Link]
Posted Nov 13, 2014 10:01 UTC (Thu)
by arekm (guest, #4846)
[Link] (1 responses)
To "downgrade" 3200x1800 to 1920x1080 on a Dell XPS 15 I need to use:
$ cat /etc/X11/kdm/Xsetup
#! /bin/sh
# Xsetup - run as root before the login dialog appears
/usr/bin/xrandr --output eDP1 --scale 0.6x0.6
/usr/bin/xrandr --dpi 141
Unfortunately:
- tk doesn't support HiDPI at all (http://core.tcl.tk/tk/tktview/0917069c05e9c354a27c8105a7a...)
- same for GRUB2 (http://savannah.gnu.org/bugs/?42525)
- same for google-chrome/chromium-browser (https://code.google.com/p/chromium/issues/detail?id=143619)
so the only option now is to "downgrade" to 1920x1080.
Posted Nov 13, 2014 16:20 UTC (Thu)
by xbobx (subscriber, #51363)
[Link]
I think Tk will correctly honor the DPI by default, but ignore DPI if a font size is a negative value. I know this because years ago after a distro update tkdiff started to render at different sizes on my different-DPI monitors, rather than scaling to approximately the same size. I tracked it down to a "bugfix" where somebody changed the default font size to a negative size in order to work around situations where the DPI was set incorrectly (sigh).
Posted Nov 12, 2014 16:42 UTC (Wed)
by ken (subscriber, #625)
[Link] (10 responses)
Perhaps if you insist on running at 1920x1080 it renders at double that and then scales back down to 2880x1800. But that mode is not something I think anybody would actually want to use.
Posted Nov 12, 2014 17:10 UTC (Wed)
by epa (subscriber, #39769)
[Link] (9 responses)
I had assumed this was the default setup but that may not be the case, sorry.
Posted Nov 12, 2014 21:07 UTC (Wed)
by luto (guest, #39314)
[Link] (8 responses)
Posted Nov 13, 2014 2:14 UTC (Thu)
by danielkza (subscriber, #66161)
[Link] (5 responses)
Posted Nov 13, 2014 2:22 UTC (Thu)
by luto (guest, #39314)
[Link] (4 responses)
If an application thinks it's rendering a 100px x 100px window, then presumably all the drawing APIs need to act as though it really is rendering a 100px x 100px window. I assume that this means that applications render into an intermediate buffer, which is then scaled.
Now if an application wants to draw text, and that text will render at higher resolution, and the drawing API won't freak out about the pixels having subpixels, then I can imagine that rendering into a buffer with an integer multiple higher resolution could work, and that downscaling makes sense.
If not, then I still don't see why scaling 100x100 to 200x200 and then to 150x150 makes sense.
Posted Nov 13, 2014 2:35 UTC (Thu)
by danielkza (subscriber, #66161)
[Link] (3 responses)
Posted Nov 13, 2014 2:38 UTC (Thu)
by luto (guest, #39314)
[Link] (1 responses)
Posted Nov 13, 2014 15:41 UTC (Thu)
by zyga (subscriber, #81533)
[Link]
This is distinctly different from scaling individual drawing operations by 1.5.
Posted Nov 15, 2014 20:11 UTC (Sat)
by javispedro (guest, #83660)
[Link]
On a more serious matter: with Wayland, Gtk+ becomes the owner of the surface, and thus the problem of how you actually render all of your non-integer coordinate widgets becomes Gtk's problem (or OpenGL's).
So this excuse no longer works in a Wayland scenario. Qt has non-integer scaling ratios and they work quite OK.
Posted Nov 13, 2014 11:46 UTC (Thu)
by epa (subscriber, #39769)
[Link]
An integer scaling factor of 2 is much easier to handle: an icon can be exactly pixel-doubled and still look clean. It won't be high res, of course, but at least a one-pixel line in the original will always be exactly two pixels wide in the scaled version. (By contrast, if you scale by 1.5 using a not-very-intelligent algorithm, a one-pixel line could end up either one or two pixels wide depending on its position in the image.)
Then once the application has rendered at 2x scaling, which most are able to manage at least passably, the GPU scales everything by 0.75 using a reasonably good scaling method which anti-aliases and doesn't suffer artefacts from rounding to exact pixel boundaries.
At least, the above is my conjecture based on the fact that Apple chose to do it this way (at least when a 'virtual res' of 1920x1200 is selected). It matches my experience using high-dpi displays in Windows, where a font scaling of 150% looks nasty in many apps but 200% works well enough.
Posted Nov 14, 2014 6:44 UTC (Fri)
by ploxiln (subscriber, #58395)
[Link]
So when they're scaled from 2x to 1.5x, they have extra detail in the 2x pixels, which is then combined/sampled for a much smoother 1.5x scale from the "reference pixel size".
It's a pretty clever way to get more applications to support a reasonably high-DPI display more easily, IMHO. Just support 2x, and with all that detail the OS can scale that a bit and still have it look good.
Posted Nov 13, 2014 3:22 UTC (Thu)
by roc (subscriber, #30627)
[Link] (8 responses)
Posted Nov 13, 2014 11:46 UTC (Thu)
by alankila (guest, #47141)
[Link]
Posted Nov 13, 2014 11:51 UTC (Thu)
by epa (subscriber, #39769)
[Link] (6 responses)
However there are many legacy applications whose developers never thought about the issue of scaling (let alone by fractional amounts) and which were never tested under such a setup. It would be a mistake to think that you can set a factor of 1.5 and have all of them just work. Perhaps most would, but those that don't handle it properly will turn out fugly or totally unusable.
Therefore, the approach that maximizes compatibility with older software is always to scale by integer multiples. Even then, some older programs misrender (perhaps rendering one part of the window at scaled size while another part is unscaled), but most of the time you get a usable result.
Posted Nov 14, 2014 10:04 UTC (Fri)
by roc (subscriber, #30627)
[Link] (5 responses)
Posted Nov 18, 2014 5:32 UTC (Tue)
by quotemstr (subscriber, #45331)
[Link] (4 responses)
Fractional scaling in practice works fine; thanks for implementing it in Firefox. I hope the Wayland people come around: for some hardware (like mine), fractional scaling is precisely the right thing. Not everyone owns a Macbook.
If the Wayland developers persist in supporting only integer scaling factors, I suspect desktop environments will just hardcode 1x scaling and make toolkits do fractional scaling, which will be a mess all around.
Posted Nov 19, 2014 9:28 UTC (Wed)
by epa (subscriber, #39769)
[Link] (3 responses)
I think the point is that there's an existence proof of many legacy programs from the 2000s and 1990s which produce a nasty-looking mess when asked to scale by 3/2 (let alone, say, 4/3). That does not negate the existence of a large body of carefully written programs (of which Firefox is one) which handle arbitrary scaling perfectly. But it is (I presume) the reason why Apple chose to scale by an integer factor and then resize in the GPU if necessary. I quite agree, if starting from scratch then arbitrary dpi need to be supported.

On Linux, the legacy programs may often be ones using X11 server-side fonts. Currently X ships with 75dpi and 100dpi bitmap fonts. If it included 150dpi and 200dpi sets, programs like xterm could use those and so scale up nicely.
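Those bitmap fonts are selected through the resolution fields of the XLFD font name; a hypothetical request for a 14-point font from the 100dpi set would look something like:

xterm -fn '-adobe-courier-medium-r-normal--*-140-100-100-m-*-iso8859-1'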
Posted Nov 20, 2014 21:48 UTC (Thu)
by quotemstr (subscriber, #45331)
[Link] (2 responses)

Thank you. It's frustrating how some people say that "$X is impossible!", and then when confronted with an existence proof of $X, just reiterate the claim that "$X is impossible!".
Posted Nov 20, 2014 22:16 UTC (Thu)
by dlang (guest, #313)
[Link]
A lot of web designers have been working hard ever since to undermine this design, but they have had limited success
Posted Nov 21, 2014 11:26 UTC (Fri)
by epa (subscriber, #39769)
[Link]
Grab a PC running Windows 7, set it to 150% font scaling, and then try a random assortment of third-party software written before 2010. You will see the mess I am talking about - and I assume the situation with older Mac OS X programs is the same.

It is great that Firefox scales well and I quite agree it proves that scaling by an arbitrary factor is *not difficult*. That has very little bearing on whether existing programs, which exist in binary-only form and cannot be modified, implement non-integer scaling reasonably. Even if 90% do so, 10% do not.
Posted Nov 12, 2014 15:50 UTC (Wed)
by tsmithe (guest, #57598)
[Link]
Posted Nov 12, 2014 16:23 UTC (Wed)
by proski (subscriber, #104)
[Link] (5 responses)
Projectors were mentioned once in the story, which brings up an interesting question: what is the DPI of a projector? Technically, it is very high at the lens and very low on the screen. In practice, it should be something in between, and it should vary depending on the audience and the quality of the projector, the screen, and the lighting.

Also, people with vision problems would probably want to see bigger fonts and pictures.
DPI is great for applications like on-screen rulers, but in most cases, scaling should be variable based on the user needs. DPI could provide a good default setting for newly connected hardware.
Posted Nov 12, 2014 17:54 UTC (Wed)
by raven667 (subscriber, #5198)
[Link]
You can't get an exact answer on current equipment AFAIK; it depends entirely on how far you are projecting, which determines the size of the resulting image. You should be able to guess an approximate range, though, based on how you expect a particular model to be used. To get a more specific answer you'd need a laser rangefinder or autofocus in the projector and to send the calculated display geometry back to the computer; maybe you could even get away with instrumenting a manual focus.
Posted Nov 13, 2014 3:55 UTC (Thu)
by flussence (guest, #85566)
[Link] (3 responses)
FWIW, I like E17's approach to this: it just presents a grid of scaling levels on first run, normalised around 1.0, and lets the user pick whatever's most comfortable. You have to go out of your way to ask for the DPI-based mode.
Posted Nov 13, 2014 13:19 UTC (Thu)
by SLi (subscriber, #53131)
[Link] (2 responses)
For example, a comfortable human field of view is maybe around 155° horizontally and 135° vertically. If you fill this view with a line of 155 characters, each character has a horizontal size of about 1°, and this is independent of whether you are talking about a monitor, a phone, or a projector.
Posted Nov 13, 2014 18:13 UTC (Thu)
by zev (subscriber, #88455)
[Link] (1 responses)
Posted Nov 13, 2014 19:28 UTC (Thu)
by alexl (subscriber, #19068)
[Link]
http://www.w3.org/TR/css3-values/#reference-pixel
The reference pixel is the visual angle of one pixel on a device with a pixel density of 96dpi and a distance from the reader of an arm's length. For a nominal arm's length of 28 inches, the visual angle is therefore about 0.0213 degrees. For reading at arm's length, 1px thus corresponds to about 0.26 mm (1/96 inch).

This is essentially what gnome uses. Following the recommendation above that link:

For lower-resolution devices, and devices with unusual viewing distances, it is recommended instead that the anchor unit be the pixel unit. For such devices it is recommended that the pixel unit refer to the whole number of device pixels that best approximates the reference pixel.
Posted Nov 12, 2014 16:24 UTC (Wed)
by drago01 (subscriber, #50715)
[Link]

"That real solution, it seems, almost has to involve divorcing applications from the individual pixels on the screen. Once applications are working in pixel-independent units almost all of the time (there may need to be exceptions for image editing, video playback, etc.), the underlying environment can work on rendering in a visually similar way on all possible screens."

That isn't really any different from "pixel scaling" ... whether the unit exposed to applications is cm, inches or frogs does not matter. We could as well just call it "pixels" and leave the actual rendering to the compositor which draws things on screen. On X11 the compositor only controls the output, i.e. what gets drawn on screen; input events get delivered directly to the applications, which means the scaling cannot be hidden from them. On Wayland the compositor controls both input and output and therefore can hide the scaling completely from the applications.
Posted Nov 12, 2014 16:31 UTC (Wed)
by daniels (subscriber, #16193)
[Link] (12 responses)
Under Wayland, the compositor:

- applies downscaling to 'pixels', such that one pixel as seen by applications is really an n x n grid of pixels on-screen
- reports to each client that a particular output has a scaling factor
- upscales surfaces by the scaling factor by default, allowing naïve/simple/old clients to work by keeping them the same size as the screen would be ordinarily
- allows clients to indicate that their surface is pre-scaled, so more complex clients (anything using a real toolkit) can render at the full native resolution, and not require a scaling pass during composition
- has sub-pixel precision on input events, so no loss for either scaled or un-scaled clients

This is handled per-output; even surfaces split across low- and high-DPI displays will be shown at (roughly) the same physical size, rather than doubled or whatever.
Posted Nov 12, 2014 17:01 UTC (Wed)
by smurf (subscriber, #17840)
[Link] (9 responses)
It's a bit unfortunate that this still limits us to integer scaling factors between pixels and what-should-be-points, but I suppose that can't be helped as long as we still use low-DPI-only applications.
Posted Nov 12, 2014 17:17 UTC (Wed)
by mpr22 (subscriber, #60784)
[Link]

Oblong pixels are an abomination unto Nuggan :)
Posted Nov 12, 2014 18:43 UTC (Wed)
by daniels (subscriber, #16193)
[Link] (7 responses)
Posted Nov 12, 2014 19:45 UTC (Wed)
by juliank (guest, #45896)
[Link]
It does not allow arbitrary fractional scaling, though; only 1.25 scaling is done right (crisp fonts). Other fractional scaling factors do not use any special mode and just scale up the rendered image, so fonts get blurry.
Posted Nov 13, 2014 3:23 UTC (Thu)
by roc (subscriber, #30627)
[Link] (2 responses)
Posted Nov 13, 2014 10:30 UTC (Thu)
by daniels (subscriber, #16193)
[Link] (1 responses)
Clients are fully in control of their surface size (x1 x y1), to which a scaling factor (n) is attached. So, if you want to render at full native resolution, then you attach a buffer of (nx1 x ny1).
Nothing says that you have to render at exactly scale n. If you wish to use your own scaling factor, then you can do just that: you have that full buffer size to paint _whatever you want_ into. So you're free to internally use a scaling factor of, say, 1.5 instead of 2.
As long as you apply the same scale to your input events (which have 8 bits of sub-pixel precision), then you can properly map that to whatever you've rendered.
Posted Nov 13, 2014 10:55 UTC (Thu)
by roc (subscriber, #30627)
[Link]
But it does seem like there are screens where the ideal scaling is somewhere between 1 and 2.
Posted Nov 15, 2014 19:57 UTC (Sat)
by javispedro (guest, #83660)
[Link]
Posted Nov 18, 2014 5:19 UTC (Tue)
by quotemstr (subscriber, #45331)
[Link] (1 responses)
Sorry, but that decision makes Wayland useless for me. Then again, I'll probably end up using Mir instead; I hope Mir's authors are more cognizant of actual user needs.
On my laptop, a Lenovo Carbon X1 2014 model, 1x is too small and 2x is too big. A scaling factor of 1.5 is perfect.
I read the mailing list thread. The argument about imprecise scaled graphics is bogus: Firefox manages to scale by non-integer factors *just fine*.
Posted Nov 18, 2014 9:08 UTC (Tue)
by daniels (subscriber, #16193)
[Link]
Anyway, like I said, application awareness can give you arbitrary factors anyway, as the whole point of the interface is to be able to opt out of scaling. At which point you have a full-size buffer.
Posted Nov 12, 2014 19:32 UTC (Wed)
by dlang (guest, #313)
[Link] (1 responses)
This is a useful stopgap, but not the right long-term solution.
Posted Nov 12, 2014 19:37 UTC (Wed)
by daniels (subscriber, #16193)
[Link]
Let's say you have a 2560x1440 display at some ludicrously high DPI. Yes, we lie to apps and tell them that it's 1280x720. But we _also_ tell them that it's DPI-doubled, so _if they want_, they can render at the full resolution (2560x1440), and have that displayed, pixel-for-pixel, on screen. It's only the naïve apps that get scaled, so they don't have to explicitly have code to double every single part of their UI.
So it's not the perfect literally-resolution-independent utopia, but given that's never existed in practical form, I think I'll settle for the current model of allowing smart clients to not waste a single pixel, but not breaking others in the process.
Posted Nov 12, 2014 16:56 UTC (Wed)
by xbobx (subscriber, #51363)
[Link] (6 responses)
I don't think that's quite accurate. The core X Windows protocol is actually much better about being DPI-agnostic than many other window systems, and many applications and toolkits do a decent job at scaling elements. Each X screen has an independent DPI, so you can simultaneously run two instances of the same application on separate X screens at different DPIs.
However, the years since the core protocol was created haven't been so favorable in this regard [1]. The biggest problem is that very few people use multiple X screens for multiple monitors. The most obvious reason for that is, admittedly, a protocol limitation: the number of X screens cannot change for the life of the display connection. So dynamically adding and removing monitors has generally been implemented within the confines of a single X screen; this means that DPI cannot differ between those monitors.

This _could_ be solved by restarting your X server with a different configuration that exposes an X screen for each different monitor; this may be acceptable for the desktop where monitor configuration is fairly static, but isn't really feasible when plugging an external monitor or projector into a laptop.

Windows also cannot be dragged from one X screen to another, which may annoy some users. However, dragging a window from a screen with one DPI to a screen with another DPI wouldn't really work in practice anyway; when the window is straddling the multiple screens, how should it render? Besides, there is no "DPI change notification event" to tell the app to re-render or resize itself.
[1] One possible reason is that the popularity of MS Windows largely drove available hardware features starting in the 1990s, and because MS Windows didn't deal with different DPIs very well, most hardware converged on a single DPI, or close to it.
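To illustrate the per-screen model, assuming a server configured with two screens at different DPIs:

DISPLAY=:0.0 xterm &   # rendered with the first screen's DPI
DISPLAY=:0.1 xterm &   # rendered with the second screen's DPI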
Posted Nov 12, 2014 17:15 UTC (Wed)
by epa (subscriber, #39769)
[Link]
Posted Nov 13, 2014 3:24 UTC (Thu)
by roc (subscriber, #30627)
[Link] (4 responses)
Posted Nov 13, 2014 6:16 UTC (Thu)
by glandium (guest, #46059)
[Link] (3 responses)
Posted Nov 14, 2014 10:06 UTC (Fri)
by roc (subscriber, #30627)
[Link] (2 responses)
Posted Nov 15, 2014 20:01 UTC (Sat)
by javispedro (guest, #83660)
[Link] (1 responses)
Posted Nov 16, 2014 12:35 UTC (Sun)
by bronson (subscriber, #4806)
[Link]
Posted Nov 12, 2014 18:02 UTC (Wed)
by b7j0c (guest, #27559)
[Link]
1. what distro did you use?
2. as a subscriber to lwn, i am fine with you buying yourself a spiffy new laptop. no need to justify it!
Posted Nov 12, 2014 19:16 UTC (Wed)
by josh (subscriber, #17465)
[Link] (8 responses)
Posted Nov 13, 2014 4:36 UTC (Thu)
by ncm (guest, #165)
[Link]
Posted Nov 13, 2014 11:52 UTC (Thu)
by epa (subscriber, #39769)
[Link] (1 responses)
Posted Nov 13, 2014 17:17 UTC (Thu)
by josh (subscriber, #17465)
[Link]
xterm does not, but I don't run xterm. If you want apps like xterm to automatically scale, you'd have to set the X server DPI.
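For illustration (192 being an arbitrary value), that can be done either when the server starts or afterward:

X -dpi 192          # at server startup
xrandr --dpi 192    # on a running server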
Posted Nov 15, 2014 20:04 UTC (Sat)
by javispedro (guest, #83660)
[Link] (3 responses)
In fact, the above font DPI-only trick also works with Gtk+2 programs. The icons look smaller though. It looks much more reasonable in my Surface Pro than Gnome's/OSX idea of scaling everything to 2x and then back to 1.5x, which makes text look very fuzzy.
Posted Nov 15, 2014 20:15 UTC (Sat)
by josh (subscriber, #17465)
[Link] (2 responses)
Works fine here with GTK+ 3 programs.
Posted Nov 17, 2014 9:25 UTC (Mon)
by bernat (subscriber, #51658)
[Link]
The old way works without any additional daemon but is not dynamic: the application reads the settings from xrdb at startup (or sometimes when told to create a new window, as is done for Emacs).

The new way allows applications to be notified when a change happens. It requires an XSETTINGS-compatible daemon, like xsettingsd (or gnome-settings-daemon). The correct DPI setting should be multiplied by 1024. XSETTINGS can also be set per screen (with xsettingsd, this requires running two instances).
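As a sketch of both mechanisms, using an illustrative 144dpi:

# old way: merge an Xft.dpi resource, read by clients at startup
echo "Xft.dpi: 144" | xrdb -merge

# new way: a line in the xsettingsd configuration file (144 * 1024 = 147456)
Xft/DPI 147456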
I am using xrandr to set DPI settings correctly and propagate the result to xsettingsd with this ugly little script:
https://github.com/vincentbernat/awesome-configuration/bl...
Except for Chromium, everything scales automatically: Emacs (the GTK version; otherwise it's not dynamic), GTK2 and GTK3 apps, libvte-based terminals, Qt apps.

For Chromium, I just set the zoom setting to 150%. Work is in progress in Chromium to fix that correctly. Currently, there is a flag to compile in HiDPI support, but DPI is computed from the screen size instead of using the DPI set through XSETTINGS like other apps do. Usually, this makes the Chromium interface too big. And many widgets are broken. This is known and currently being fixed.
Posted Nov 13, 2014 4:19 UTC (Thu)
by ldo (guest, #40946)
[Link]
Android layouts are measured using a unit called “dp” or “dip” (device-independent pixels). This is nominally 1/160 inch, and fractional parts are allowed. Text should be measured with a separate unit, “sp”, which is equivalent to “dp” scaled by a user-specifiable text-size factor. This allows users to choose larger text for easier readability on a systemwide basis, rather than every app having to implement this option.
Posted Nov 13, 2014 10:39 UTC (Thu)
by roskegg (subscriber, #105)
[Link] (5 responses)
After all this systemd fuss, I've been seriously considering the following:
A new OS using the Linux kernel, because of its driver support. If I get too pissed off I'll try the OpenBSD kernel instead. There is precedent; Plan 9 was based on 2.9BSD, rather than SVR4.
However, the base system on top is based on
a) Plan 9 and Inferno
For the system layer, plumbing, init, basic OS interactions, and foundational UI via Rio
b) BeOS
Filesystem metadata/filesystem as database
Plus the architectural things that make for a speedy, smooth UI with glitch-free audio
c) OpenStep (aka NeXTSTEP)
Display PostScript (perhaps modernized to Display SVG)
NFS and X11 to go right in the compost heap.
Posted Nov 13, 2014 14:22 UTC (Thu)
by giggls (subscriber, #48434)
[Link] (3 responses)
NFS4 is a decent remote filesystem, not "No File Security" anymore.
The only thing I would like to have in the Linux implementation would be a shared-key setup for environments with a couple of hosts, where a full-fledged Kerberos setup would be overkill.
Sven
Posted Nov 13, 2014 22:21 UTC (Thu)
by roskegg (subscriber, #105)
[Link] (2 responses)
Posted Nov 14, 2014 9:11 UTC (Fri)
by giggls (subscriber, #48434)
[Link] (1 responses)
Will 9p provide a decent solution for centralized home directories without the security nightmare of NFS3?
Posted Nov 15, 2014 16:25 UTC (Sat)
by lsl (subscriber, #86508)
[Link]
One can certainly build nice things on top of 9P auth. See this paper on what is done on Plan 9:
http://plan9.bell-labs.com/sys/doc/auth.pdf
Think kerberized Unix services, but a thousand times simpler and actually unified on the system level.
Posted Nov 14, 2014 1:53 UTC (Fri)
by lsl (subscriber, #86508)
[Link]
What? Plan 9 isn't based on any version of Unix. Some of its user space programs originated in late Research Unix, like the shell, rc(1), and the build tool mk(1). So did some other ideas, most likely.
I don't think there's any basis for a statement such as the above, though. Where did you get this from?
Posted Nov 14, 2014 9:20 UTC (Fri)
by jreznik (guest, #61949)
[Link] (1 responses)
So far I've gotten the best results with Plasma 5; still, it needs quite a lot of tweaking, and not everything there is DPI-independent. GNOME's approach of scaling by a factor of 2 is a no-go: it would work for higher-resolution displays, but the X1's resolution is not that high, so you end up with a very small effective screen where everything is huge. A single head is solvable for a lot of apps. Somehow. Not perfect, but it works.

The bigger issue is connecting a second, non-HiDPI LCD. In the end, I was able to find a compromise font-DPI setting that makes it somewhat usable on both displays simultaneously, but... on the Carbon it's a bit too small, on the external LCD it's a bit too big. I can live with it for now. Xrandr scaling is unusable - too blurry. Firefox can be solved by https://addons.mozilla.org/en-US/firefox/addon/autohidpi/
Posted Nov 21, 2014 17:57 UTC (Fri)
by josh (subscriber, #17465)
[Link]
Personally, though, I only have one application that actually uses the Insert key: GCCG.
Posted Nov 21, 2014 17:38 UTC (Fri)
by hamasaki (guest, #99927)
[Link] (2 responses)
"Making applications work properly in an environment where displays can have widely varying densities — even on the same system — is not an easy problem to solve."

It's really not. The only challenge is agreeing on a system, which unfortunately is perhaps the area the free software community has the most trouble with.
"A fast rate of progress is arguably not surprising; after all, desktop developers hardly seem like a crowd that would be resistant to the allure of a beautiful new screen. So we can probably count on those developers to fix up the remaining problems in relatively short order."
Desktop developers are also the type who often stick to one system, and apparently we have at least 4 different approaches to solving this problem so far. The easy fixes have already been done. I'm not optimistic that the situation is going to significantly improve any time soon. It's not like the KDE developers are going to wake up and switch to Gnome's scheme when they have some free time next week. This isn't a technical problem that can be solved by a couple hours of debugging.
Apple has solved this problem because they can just declare the entire system by fiat. Microsoft is close behind, because they control most of the software and a little of the hardware. Free software does great when there's a BDFL like Linus, Guido, Matz, or Larry, but the desktop GUI has no such person. The closest we have is Ubuntu, and they're the only ones I think have any shot at fixing this in the next 5-10 years -- and they've chosen a 5th option: "invent a new display server which is neither X11 nor Wayland".
Posted Dec 17, 2014 19:15 UTC (Wed)
by Velmont (guest, #46433)
[Link] (1 responses)
Opera for Linux just came out as stable (not only beta and developer stream): http://opera.com/
[I work at Opera]
Posted Dec 17, 2014 19:19 UTC (Wed)
by Velmont (guest, #46433)
[Link]
Posted Dec 22, 2014 20:48 UTC (Mon)
by lmartelli (subscriber, #11755)
[Link] (4 responses)
Pardon me, but what is the "native resolution" of photos from digital cameras supposed to be? Resolution, by definition, measures pixels or dots per unit of distance. But photos from a digital camera have no physical dimensions, so they can't have a native resolution. Unless you are thinking of the resolution of the sensor; but since sensors usually measure less than a centimeter across, their resolution is much higher than that of even the highest-DPI display I know of.
Posted Dec 23, 2014 19:16 UTC (Tue)
by bfields (subscriber, #19510)
[Link] (3 responses)
It's also commonly used for total size in pixels. (E.g., "the resolution of my laptop's monitor is 1280x800").
So "native resolution" here means "full size in pixels".
Rail against loose use of language if you want, but I think that sort of usage is too common to exclude it as a definition of "resolution". And in this case there's no ambiguity (since as you point out a digital photo has no inherent physical dimensions).
Posted Dec 23, 2014 19:29 UTC (Tue)
by dlang (guest, #313)
[Link] (1 responses)
Posted May 19, 2015 20:36 UTC (Tue)
by sethml (guest, #8471)
[Link]
The world really needs >4K display support better standardized/supported so you don't have to buy Apple's hardware to get it...
Posted Dec 23, 2014 21:11 UTC (Tue)
by rleigh (guest, #14622)
[Link]
http://en.wikipedia.org/wiki/Angular_resolution
Resolution is not, and never has been, a *size* measure as used by CCDs and monitors. I know it's common practice in computing, but it's wrong nonetheless. You can measure the well/dot pitch (i.e. distance between pixels), which would be better, but strictly speaking that's not really a measure of resolution either (in this context) since it's a property of an optical system and not of the detector/emitter of a light signal such as a CCD or monitor.
http://www.svi.nl/NyquistRate

Regards,
Roger