LWN: Comments on "COSMIC desktop makes its debut" https://lwn.net/Articles/984638/ This is a special feed containing comments posted to the individual LWN article titled "COSMIC desktop makes its debut". en-us Fri, 29 Aug 2025 09:09:28 +0000 Fri, 29 Aug 2025 09:09:28 +0000 https://www.rssboard.org/rss-specification lwn@lwn.net I can't understand the logic for this https://lwn.net/Articles/986916/ https://lwn.net/Articles/986916/ mwilck <div class="FormattedComment"> <span class="QuotedText">&gt; The choice was to allow extensions full power to change the desktop in all sorts of ways,</span><br> <p> Was this done out of kindness for extension authors? I doubt it. I believe the reason was that plain GNOME was a pain, and extensions came to the rescue. If there'd been no extensions, or if the set of things that extensions could do had been severely limited, GNOME 3 would have been much less of a success.<br> <p> <span class="QuotedText">&gt; My understanding is that the community, including the extension authors, greatly preferred the first approach—which is what Gnome does. </span><br> <p> Did extension writers actually have a say in this discussion? Well, a few of them, who were around 13y ago, perhaps. Many of which have probably given up by now. I wonder if you've been trying to maintain any GNOME extensions through the last decade. I did, and still do, and I can tell that it's no fun.<br> </div> Thu, 22 Aug 2024 21:19:18 +0000 I can't understand the logic for this https://lwn.net/Articles/986763/ https://lwn.net/Articles/986763/ DOT <div class="FormattedComment"> It's not all that bad, really. Yes, scripting support is mandatory for the proper viewing of web pages, but there is still the DOM lingua franca of basic display primitives, which does give tremendous accessibility benefits over a black-box OpenGL surface rendered at 60 FPS.<br> <p> Any new desktop GUI library has to do the hard work of making their widgets available to accessibility tooling. On the web, this is much easier: you get most of it for free by using HTML tags, and you can easily improve on it by using ARIA attributes.<br> <p> Of course, we can imagine whole websites being created as a single canvas element drawn by a WASM binary. That's not very popular though, because the DOM already gives you the primitives you need.<br> </div> Thu, 22 Aug 2024 10:04:51 +0000 Display Scaling https://lwn.net/Articles/986311/ https://lwn.net/Articles/986311/ AdamW <div class="FormattedComment"> You can still set a custom DPI if you like. In GNOME you run Tweak Tool, go to Fonts, and set a "Scaling Factor", which really just changes the DPI (it's multiplied by 96). I don't know where it is in KDE, but you can do it. It still "works" like it always did, which is to say, it *kinda* works. 
See other comments for all the things it doesn't do.<br> </div> Tue, 20 Aug 2024 01:14:30 +0000 I can't understand the logic for this https://lwn.net/Articles/986272/ https://lwn.net/Articles/986272/ excors <div class="FormattedComment"> <span class="QuotedText">&gt; Firefox was not a total rewrite, it was more of a refactor where they extracted only the browser-specific parts from the Netscape suite.</span><br> <p> I meant specifically that the Mozilla Application Suite, which was the basis of Netscape 6 (when Spolsky wrote the article) and which Firefox was later extracted from, is the project that chose (in 1998) to do the rewrite he's referring to.<br> <p> In particular the Gecko layout engine was new, which "constituted an almost-total rewrite of the browser ... Now we had to rewrite the entire user interface from scratch before anyone could even browse the web, or add a bookmark" (<a href="https://www.jwz.org/gruntle/nomo.html">https://www.jwz.org/gruntle/nomo.html</a>), so Firefox is directly derived from that completely rewritten browser.<br> </div> Mon, 19 Aug 2024 18:46:03 +0000 I can't understand the logic for this https://lwn.net/Articles/986268/ https://lwn.net/Articles/986268/ Cyberax <div class="FormattedComment"> <span class="QuotedText">&gt; The rewrite that he's complaining about became Firefox</span><br> <p> Firefox was not a total rewrite, it was more of a refactor where they extracted only the browser-specific parts from the Netscape suite.<br> </div> Mon, 19 Aug 2024 17:02:18 +0000 I can't understand the logic for this https://lwn.net/Articles/986265/ https://lwn.net/Articles/986265/ excors <div class="FormattedComment"> I'm not sure that's a particularly convincing article. He says it's harder to read code than to write it - but then he goes on to say the old code has had years of testing and bug fixing. He's arguing it was surprisingly hard to write, and it'll be just as hard to write a new version that's just as good, so it's always better to refactor the old version than to throw it away. People mistakenly embark on a rewrite because they can easily read the old code and recognise that it's a mess (and they're often correct it's a mess, as with the architectural problems he mentions), while underestimating the effort involved in (re)writing it.<br> <p> In short, he says writing code is really hard, which is the exact opposite of the "fundamental law" he just proposed.<br> <p> He also says rewriting code from scratch is the single worst strategic mistake you can make - and then in the very next paragraph he says Microsoft tried to rewrite Word from scratch, but it explicitly was not a strategic mistake because they kept developing the old version in parallel. Really he should say the strategic mistake is "don't completely stop developing your publicly-available product for three years in a rapidly-advancing competitive market" (which seems rather obvious).<br> <p> Also his specific example is the rewrite of Netscape. If I'm not getting the versions mixed up, the original Netscape was developed from 1994 to about 1997, by which point "The consensus seems to be that the old Netscape code base was _really_ bad" (according to Spolsky). The rewrite that he's complaining about became Firefox, which achieved substantial market share and has remained technically competitive for two decades, with huge advances in standards compliance and features and performance and security etc. 
Could they have had that long-term success if they'd stuck with the Netscape 4 codebase and hadn't taken the short-term hit of paying off technical debt? Obviously it didn't work out for Netscape as a company, but the software looks like a success story.<br> <p> (To be clear, I'm not saying rewriting software is always (or often) a good idea. But Spolsky phrased it in absolute terms as "Things You Should Never Do" with no nuance, which I think is simply wrong, and his arguments contradict his claims.)<br> </div> Mon, 19 Aug 2024 16:58:29 +0000 I can't understand the logic for this https://lwn.net/Articles/986238/ https://lwn.net/Articles/986238/ wazoox <div class="FormattedComment"> If you prefer a more sophisticated take on this (but just as old as jwz's one):<br> <p> <a href="https://www.joelonsoftware.com/2000/04/06/things-you-should-never-do-part-i/">https://www.joelonsoftware.com/2000/04/06/things-you-shou...</a><br> <p> <span class="QuotedText">&gt; There’s a subtle reason that programmers always want to throw away the code and start over. The reason is that they think the old code is a mess. And here is the interesting observation: they are probably wrong. The reason that they think the old code is a mess is because of a cardinal, fundamental law of programming:</span><br> <span class="QuotedText">&gt;</span><br> <span class="QuotedText">&gt; It’s harder to read code than to write it.</span><br> <span class="QuotedText">&gt;</span><br> <span class="QuotedText">&gt; This is why code reuse is so hard. This is why everybody on your team has a different function they like to use for splitting strings into arrays of strings. They write their own function because it’s easier and more fun than figuring out how the old function works.</span><br> </div> Mon, 19 Aug 2024 15:03:37 +0000 I can't understand the logic for this https://lwn.net/Articles/986182/ https://lwn.net/Articles/986182/ ssmith32 <div class="FormattedComment"> I know you said not "purely".. but.. <br> <p> React is not a declarative model, even moderately. Maybe slightly if you squint really hard after too many drinks? But it's mostly *not* declarative.<br> <p> It's very much an imperative language that lets you build components.. similar to how most object-oriented UI frameworks work. If you stretch the definition of declarative enough, maybe? But then GTK would be considered declarative...<br> <p> But you very much tell it to<br> <p> Build component A, <br> then connect it to component B, <br> when you get signal X, then update Y, etc.<br> <p> Step by step, in an imperative fashion.<br> <p> Having abstractions and components != declarative.<br> <p> SQL is probably the closest thing to declarative that most folks are familiar, and it's nothing like building a React app, and is absolutely *atrocious* for building large apps out of. Large SQL queries are usually much less maintainable than code of any sort of a similar size.<br> <p> Prolog is also much closer to declarative than React. And is also absolutely horrible for large programs.<br> <p> A declarative language for UIs would let you declare what you want to accomplish, literally, something like: "A UI to let me update the time and timezone on COSMIC"... and then the optimizer goes off and builds that UI.<br> <p> React, you have to tell it exactly what you want, and, most importantly, how to handle input, step by step. 
That's imperative.<br> </div> Mon, 19 Aug 2024 05:04:04 +0000 I can't understand the logic for this https://lwn.net/Articles/986180/ https://lwn.net/Articles/986180/ ssmith32 <div class="FormattedComment"> No statement was made that Option #1 was ruled out because of the amount of work.<br> <p> It was that building on an unstable base meant that they'd be faced with:<br> <p> a) breaking their desktop<br> b) holding back an update of GNOME, until they can migrate... which with GNOME, may essentially involve a complete rewrite.<br> <p> And re-writing from scratch can certainly be easier than maintaining a fork. Particularly if upstream has no reason to keep maintaining something(s) your patches depend on, leaving you back to the dilemma of Option #1.<br> <p> Some of the hardest things I've had to work on involve large, undocumented legacy codebases. And I've helped build systems from scratch more complex than those legacy codebases.<br> </div> Mon, 19 Aug 2024 04:45:59 +0000 I can't understand the logic for this https://lwn.net/Articles/986020/ https://lwn.net/Articles/986020/ bbbush <div class="FormattedComment"> Source code is a liability. By inheriting technical debt, the revolving debt would have a lower credit limit.<br> </div> Fri, 16 Aug 2024 15:06:36 +0000 I can't understand the logic for this https://lwn.net/Articles/985874/ https://lwn.net/Articles/985874/ yeltsin <div class="FormattedComment"> And then, on top of said imperative foundation, people invent and widely use libraries/frameworks like React that provide a (not purely) declarative model for describing the user interface (or more precisely, for deriving the user interface from the underlying data). Because the imperative model is so difficult to work with in large applications. We've come full circle.<br> </div> Thu, 15 Aug 2024 19:01:26 +0000 Display Scaling https://lwn.net/Articles/985868/ https://lwn.net/Articles/985868/ atnot <div class="FormattedComment"> <span class="QuotedText">&gt; Would you mind to elaborate? I'm really asking for my education, not to imply that anyone working on this back then or today isn't doing the best that's possible.</span><br> <p> Oh, there's an unbelievable list of problems once you start even thinking about doing it properly. The sibling comment goes into some of the compositor problems, but it's just the tip of the iceberg.<br> <p> So here are some things to just think about:<br> - Window moves from one display to another<br> - User has a window span multiple resolutions<br> - Two monitors of the same resolution, but different sizes<br> - Two monitors of the same size, but different resolutions<br> - The larger monitor is actually lower resolution.<br> - You drag a window from one monitor to another. Where does it appear and what size will it be. The obvious answer is physical size, right. But...<br> - ...it's a projector, TV or something else comparatively huge. What happens if you drag windows between them.<br> - Laptop gets suspended, gets plugged into a dock attached to a display, and wakes up with the lid shut i.e. primary display suddenly changes resolution. Where do the windows go now and how big should they be.<br> - Applications often clamp line widths to be exactly 1 pixel (and align them to pixels) to prevent weird rendering of thin lines and varying brightness caused by antialiasing. What do you do if there's two resolutions.<br> - The user has used the preferences to specify exactly how thick the lines should be and will be annoyed if it's ever wrong.<br> - Subpixel Rendering.
Just everything to do with it.<br> - Respecting the system text size preferences well across applications, OSes and Toolkits.<br> - What you do and don't scale with the text size.<br> - The application might want to use different layouts and scaling depending on how much screen estate is available.<br> - You want to remember the size and position of windows across restarts, but the resolution has changed.<br> - Multi-window apps having to do all of the above at the same time across multiple monitors.<br> - One monitor gets unplugged or goes into standby. You better remember where and what size those used to be when it comes back. But those windows must also remain visible on the workspace. Bonus: What if the user moved the window in between. What if it's a different monitor. Do you pop up their sensitive browser window they forgot about on the office projector when it gets plugged in, because that's where they had it at home?<br> - Games, especially simulators, that want to work across multiple displays<br> - VR, just in general<br> - You want to implement the power saving feature that many phones have where they dynamically change the screen's render resolution (and hence dpi) depending on battery saving settings and the content of the display. But everything should stay where it is. This can not break any of the solutions to previously mentioned problems.<br> - Let's not forget various accessibility features like magnified views (they better not be pixelated, and it better not have to render the entire window at 32k to do it), people who need extra large text, people who need big buttons but want small text, etc.<br> <p> Many of these can be solved if you're willing to dedicate most of a lifetime to them, for sure. But they're not the kind of problems you can just solve once and for all with a good design. It takes a lot of work to get a good coverage of most of the edge cases and a lot of tuning and heuristics at all levels of the stack that must play well together and nobody notices, until they're wrong.<br> <p> There are also some bonus unsolveable problems though:<br> - Users generally want the contents of their displays to be big enough that every element has enough pixels, as small as possible to maximize screen space, but big enough physically to still be readable. What points those are depends on the person, their eyesight, device size, form factor, environment, etc.<br> - Applications all have their own, varying ways of adapting to various DPIs which were built to work exactly with the behavior that the developers assumed at the time and there's not really any way for compositor to figure out what those assumptions are.<br> - The perceived size changes depending on your distance to the object.<br> <p> Have fun :)<br> </div> Thu, 15 Aug 2024 18:03:22 +0000 I can't understand the logic for this https://lwn.net/Articles/985858/ https://lwn.net/Articles/985858/ DanilaBerezin <div class="FormattedComment"> <span class="QuotedText">&gt; Just look at the web: Do younreally think it would be as successful as it is if you had to actually program pages in rust, C++ or any other "proper" programming language?</span><br> <p> Ironically, if you actually do take a look at the *modern* web, you'll find that nearly everyone programs almost the entirety of their web pages in the "proper programming language" called Javascript. The original purpose of the web in providing an internationally accessible repository of documents has been relegated to the dustbin of history. 
The reality is that the web has been turned into its own application platform by its users, a transformation so egregiously abusive of its original design that Google poured billions of dollars into creating an optimizing javascript engine, and when that proved to be insufficient and still painful, people decided to invent WASM. And now, people do (and in fact quite commonly) write entire web pages in Rust and C++ thanks to WASM.<br> </div> Thu, 15 Aug 2024 16:00:21 +0000 I can't understand the logic for this https://lwn.net/Articles/985779/ https://lwn.net/Articles/985779/ jengelh <div class="FormattedComment"> <span class="QuotedText">&gt;Having a simple language to describe UI has a host of advantages</span><br> <span class="QuotedText">&gt;</span><br> <span class="QuotedText">&gt;Just look at the web: Do you really think it would be as successful as it is if you had to actually program pages in rust, C++ or any other "proper" programming language?</span><br> <p> And here we are, people programming web pages with Javascript (which is "proper" in the sense that it is imperative rather than declaring the UI).<br> </div> Thu, 15 Aug 2024 13:05:00 +0000 I can't understand the logic for this https://lwn.net/Articles/985774/ https://lwn.net/Articles/985774/ khim <p>How exactly is that “glib and irrelevant”?</p> <p>Now, in addition to the roughly half-dozen existing desktops that are half-done and understaffed, we have got yet another one which is even more rough and unpolished and which will be even more understaffed unless they convince some other group to abandon what they are doing.</p> <p>Which is highly unlikely to happen.</p> Thu, 15 Aug 2024 11:04:11 +0000 I can't understand the logic for this https://lwn.net/Articles/985772/ https://lwn.net/Articles/985772/ khim <font class="QuotedText">&gt; Just look at the web: Do you really think it would be as successful as it is if you had to actually program pages in rust, C++ or any other "proper" programming language?</font> <p>That question is, essentially, impossible to answer before you give criteria for “success”.</p> <p>Would it have been successful at attracting a bazillion “designers” who created a bazillion barely-usable sites? Probably not.</p> <p>Would it have been successful at facilitating the ability of people to communicate? Most likely yes, and even more than the web achieves that today.</p> Thu, 15 Aug 2024 10:56:41 +0000 Display Scaling and multiple monitors https://lwn.net/Articles/985762/ https://lwn.net/Articles/985762/ farnz <blockquote> Excluding the case of using multiple monitors with different resolutions for now </blockquote> <p>Even the case of two monitors with the same resolution and different sizes is non-trivial to get right. In a perfect world, you want windows to maintain their apparent size as they move from screen to screen, and for the top and bottom to line up as the window moves across monitors. But that has two big problems to get right: <ol> <li>The pixel density on each monitor can be different; for example, I have two outputs here: a big 32" screen at 3840x2160, and a smaller 17" screen at 3200x2400 (different aspect ratios, too, in this case); I previously had a big 32" screen at 3840x2160, and a 15" screen also at 3840x2160. Finding the scale factor such that they both have the same scaled density is beyond the X11 model, since the X11 model is based around a pixmap per window - but a pixmap cannot change pixel density as it moves from output to output.
Thus, you need the compositor involved in scaling to the viewable area. <li>It's rare for users to carefully align their monitors such that the top and bottom of the viewable areas are precisely aligned; that means that you not only need to be aware of different scale factors, but you also need to allow for the non-alignment of monitors. X11 can do this. </ol> <p>The simple solution is to have a virtual screen setup where the application renders to a pixmap sized such that its pixel density is an integer factor of all output's pixel densities; you can then trivially scale down subsets of the pixmap to fit the real screens. This is, however, exceptionally computationally intensive in many cases - you can (for example) find yourself needing to scale down by 11 for one output, and 13 for the other, and a good-quality downscale filter is approximating a sinc response which means that it needs to read all pixels of the pixmap for every output. Worse, if you're doing this to fit the X11 model, you have to do it for all windows, even ones that don't move across screens. <p>The medium difficulty solution is to determine a scale factor that's empirically good enough, and a scaling algorithm like Lanczos resampling that's computationally less intensive, but still produces decent results. This is the best you can do in the X11 protocol, but it's still computationally intensive, because you're always scaling down the pixmap the application produces. Again, though, you have to do this for all windows, even ones that don't move across screens, in the X11 model. <p>The hard approach, and the one Wayland takes, is to determine the scale factor as the window moves; when a window is on a single output, you render at the correct scale factor for that output alone, and the compositor can just blit your output pixmap to the display (or use hardware overlays). When it's across two outputs, you fall back to the medium difficulty solution. The rendering application knows the scale factor from output pixel to real world units, and can tell the compositor how to scale window pixmap pixels to output pixels (which means that you can let the compositor work out how to scale your 1280x720 pixmap from video decoding to the 1600x900 window the user's given you). <p>The close to perfect approach, which Wayland could adopt in the future, is to ask windows spread across two outputs to render twice, with a region given to tell the window where to render for each scale factor. This then gives you the potential for minimum rendering and perfect appearance as a window moves across outputs, at the expense of more complexity, and applications having to work out how to minimize their rendering (a trivial implementation renders the whole window twice with a scissor set up to remove the unwanted bit, but that's problematic because it results in wasted effort setting up to render (e.g.) parts of the PDF that are off-screen, only for the resulting rasterization to be discarded by the scissor. Thu, 15 Aug 2024 09:24:29 +0000 Display Scaling https://lwn.net/Articles/985753/ https://lwn.net/Articles/985753/ smurf <div class="FormattedComment"> <span class="QuotedText">&gt;&gt; Well, if you exclude the actual hard problems, the rest is very easy indeed.</span><br> <p> <span class="QuotedText">&gt; Would you mind to elaborate?</span><br> <p> Well. 
Did you ever try to place two monitors with disparate (and possibly fractional) resolutions next to each other and have a window seamlessly stretch from one screen to the other?<br> <p> Using X that's basically impossible. There simply is no way to shoehorn the ability to do this into X11. Each window is backed by a single bitmap with a predetermined resolution, period, end of discussion.<br> <p> With Wayland it's a no-brainer. It. Just. Works.<br> <p> The flip side is that Wayland has to re-implement some of the *other* features that were a no-brainer under X. Different display paradigms and all that. A lot of work went into X over its lifetime and you can't replicate all of it instantly.<br> <p> Not to mention: on top of that, somebody needs to write the code to support the new cool stuff that's now possible. Just one example: say you temporarily need to add another screen to your computer but you don't have a free display output? Fine, let's dynamically add a headless display to the running compositor and show it on your laptop using VNC. Done.<br> <p> Granted, writing code for new cool stuff is a lot sexier than writing code for all the old boring use cases that we still need and like to keep working, and granted we could do without the Gnome vs. KDE vs. wlroots nonsense that accumulated during the transition. But we're getting there IMHO.<br> </div> Thu, 15 Aug 2024 07:25:57 +0000 I can't understand the logic for this https://lwn.net/Articles/985748/ https://lwn.net/Articles/985748/ motk <div class="FormattedComment"> Can we please stop quoting this? It's glib and irrelevant.<br> </div> Thu, 15 Aug 2024 01:20:33 +0000 Display Scaling https://lwn.net/Articles/985716/ https://lwn.net/Articles/985716/ pizza <div class="FormattedComment"> Fonts are easy, as they were largely designed to be scalable from the outset. Which leaves... everything else.<br> <p> The problem can be summed up with this question: Is the intent for "high resolution" to "fit more stuff on screen" or "fit the same stuff, only crisper"? <br> <p> In the past, it was nearly always the former, but now it's increasingly the latter.<br> <p> Those two intents require very different approaches to how you go about things, and which applies depends on both the application and the user preferences.<br> <p> Meanwhile: do all elements of your application have awareness of the pixels-per-inch resolution of the screen? Which ones are handled by the application, and which are provided by external toolkits? Which should be rendered at a higher resolution and which ones should be linearly scaled? Is your application layout specified with pixel-based positioning, or something else? Etc etc.<br> <p> (BTW, from what I recall, back in the "good old X days" all sorts of stuff went quite wonky if you told X that you were not using the default of 96dpi. Because the _font_ dpi wasn't the same as the _server_ dpi. So no, this was never a properly solved problem.)<br> </div> Wed, 14 Aug 2024 21:06:11 +0000 I can't understand the logic for this https://lwn.net/Articles/985705/ https://lwn.net/Articles/985705/ Agrippa <div class="FormattedComment"> Two points: <br> <p> (1) The last major user-interface change for users in Gnome occurred with 3.0. That change was controversial, for sure. But Gnome 3 was released in 2011, 13 years ago. So, from a user’s perspective, Gnome is not making major interface changes all that often.
Mate and Cinnamon originate from the old change.<br> <p> (2) Also controversial and more frequent are the changes to the JavaScript code that runs Gnome shell and can affect extensions. Changing the base code happens; this is not unusual or “arbitrary,” necessarily. The choice was to allow extensions full power to change the desktop in all sorts of ways, but any changes to the base code can affect the extensions, which are essentially hot patches. The alternative was to present API stable extension libraries to allow extension authors to change the desktop in more future-proof, but limited, ways. My understanding is that the community, including the extension authors, greatly preferred the first approach—which is what Gnome does. The downside , of course, is that extensions might break from release to release. The Gnome folks try to mitigate this problem by releasing an extension-port guide with new releases. <br> <p> So, I would dispute that Gnome has a history of making frequent “arbitrary changes.” That said, System76 is free to go its own way and reinvent the desktop in a way that is under the company’s control. The upside is that the company is not beholden to the Gnome community or legacy code—it’s a fresh start. The downside is that starting mostly from scratch is very difficult. If System76 can pull it off, kudos to it.<br> </div> Wed, 14 Aug 2024 19:26:19 +0000 Display Scaling https://lwn.net/Articles/985704/ https://lwn.net/Articles/985704/ Nikratio <div class="FormattedComment"> <span class="QuotedText">&gt;&gt; Excluding the case of using multiple monitors with different resolutions for now, does someone know where we went wrong?</span><br> <span class="QuotedText">&gt;</span><br> <span class="QuotedText">&gt; Well, if you exclude the actual hard problems, the rest is very easy indeed.</span><br> <p> Would you mind to elaborate? I'm really asking for my education, not to imply that anyone working on this back then or today isn't doing the best that's possible.<br> </div> Wed, 14 Aug 2024 19:02:26 +0000 Display Scaling https://lwn.net/Articles/985700/ https://lwn.net/Articles/985700/ intelfx <div class="FormattedComment"> <span class="QuotedText">&gt; Excluding the case of using multiple monitors with different resolutions for now, does someone know where we went wrong?</span><br> <p> Well, if you exclude the actual hard problems, the rest is very easy indeed.<br> </div> Wed, 14 Aug 2024 18:15:32 +0000 Display Scaling https://lwn.net/Articles/985696/ https://lwn.net/Articles/985696/ Nikratio <div class="FormattedComment"> There is one thing that I'd really like to understand: where did we go wrong such that display scaling (especially fractional) is nowadays so hard and worth highlighting?<br> <p> For as long as I can remember, the X server had a concept of the monitor's dpi, and font sizes could be specified in inches rather than pixels. 
I distinctly remember entering my exact monitor dimensions into the XFree86 config file, so that xrandr would show the physically correct DPI, and ghostscript would render a PDF of an A4 page in such a way that I could put a physical A4 page on my monitor and they would perfectly overlap.<br> <p> Yet, nowadays showing everything twice as big was a major accomplishment that unlocked the use of 4K displays, and fractional scaling (as it was available in the 90's) has seemingly become a northstar goal that we're unlikely to ever reach completely.<br> <p> Excluding the case of using multiple monitors with different resolutions for now, does someone know where we went wrong? What design decisions locked us into the mess that we now can't get rid of?<br> </div> Wed, 14 Aug 2024 18:08:42 +0000 False dichotomies, anyone? https://lwn.net/Articles/985588/ https://lwn.net/Articles/985588/ rsidd <div class="FormattedComment"> There isn't a large list of wlroots-based compositors: Sway, and some niche projects like wayfire (hyprland recently migrated away from wlroots). And none of those would be suitable as something to build a DE on top of. <br> <p> They started with smithay, which is like wlroots for rust. And their efforts should enable a large rust ecosystem on wayland. <br> </div> Wed, 14 Aug 2024 10:40:32 +0000 I can't understand the logic for this https://lwn.net/Articles/985575/ https://lwn.net/Articles/985575/ mbunkus <div class="FormattedComment"> From a practical point of view: Qt's philosophy is different from the STL's wrt. usability. If it makes usage easier, Qt is very eager to add utility functions to its classes, be it strings, containers or date/time types. The C++ standards committee on the other hand prefers that users use the generic STL algorithms for anything they can be used for. In reality this means that Qt-based code is often a lot easier to remember, type, research, read &amp; understand than equivalent STL-based code.<br> <p> One thing that Qt simply doesn't have is differences in implementation quality. As the STL is only a spec, there are multiple implementations of that spec by multiple vendors (gcc's libstdc++, clang's libc++, Apple clang's libc++, Microsoft's, the mingw implementation which is gcc's but adjusted for Windows or something like that, most likely others). Not only do they implement various differing parts of the STL specs, their quality of implementation is also often lacking. Here are two examples:<br> <p> The standard's std::filesystem::path library is based on Boost's library of the same name, boost::filesystem::path. Paths are a tricky thing. Boost's implementation has had a LOT of years to mature, whereas the STL implementations haven't. One result is that the implementation in mingw doesn't support UNC paths for Windows properly (things such as \\?\C:\Test\Foo.txt). Therefore I cannot implement a simple algorithm such as "look through the directory structure going upwards until you find a directory containing a certain file" properly as certain test functions ("is this path empty?" or "is this the root path?") just don't work with it. The corresponding algorithm works just fine with boost::filesystem::path with exactly the same functions (as in, the STL's spec copied Boost's implementation almost verbatim). It also just works with Qt's path libraries. Of course this has been reported as a bug but hasn't been fixed yet.<br> <p> Another bug due to different quality in STL implementations is again mingw's, this time with random number generation.
For C++ with the STL you're supposed to use something like the following in order to generate random numbers:<br> <p> std::random_device r;<br> std::default_random_engine e1(r());<br> std::uniform_int_distribution&lt;int&gt; uniform_dist(1, 6);<br> <p> Unfortunately std::random_device on mingw doesn't see the RNG (which the STL specs allow!). Therefore each run of the program produces the same sequence of numbers. Meaning I cannot use std::random_device in cross-platform compatible code.<br> <p> And don't get me started on the quality of the regular expression engine in the STL… Pretty much the worst one out there.<br> <p> For me using Qt's own classes is much more satisfying &amp; much less annoying. It's easier to use, easier to reason about, quicker to implement, produces faster code, more correct code.<br> </div> Wed, 14 Aug 2024 08:31:15 +0000 I can't understand the logic for this https://lwn.net/Articles/985543/ https://lwn.net/Articles/985543/ Heretic_Blacksheep <div class="FormattedComment"> From what I understand of the reasoning, and I don't really have any internal insight mind you I don't work for System76, they don't want arbitrary Gnome changes to screw up the desktop they're basing a commercial product upon. It's a bad user experience if an upgrade wipes out the customization a user spent a product's lifetime accumulating only to have the Gnome team decide to hare off in another direction entirely and wipe everything out both System76 and their users have done. It's about control over their product's customer experience.<br> <p> I can see System76's side, and I can understand Gnome's side in not wanting to be tied down. But as a user, I don't like having my workspace disrupted by arbitrary changes that may not benefit me, either. Gnome has a historical track record of doing this very thing (and whataboutisms doesn't change the fact that it occurs and has very recently). Gnome fans may not care, but a lot of people use Gnome simply because it's the default on their distribution. For these people those changes are unnecessary pain points.<br> <p> I have other beefs with Pop_OS in other areas, but I can certainly understand why they're building a UI stack they control.<br> </div> Tue, 13 Aug 2024 19:31:54 +0000 Fragmentation https://lwn.net/Articles/985529/ https://lwn.net/Articles/985529/ pizza <div class="FormattedComment"> <span class="QuotedText">&gt; There's definitely a lack of consistency. There are lots of client-side decorations on Windows and even Microsoft can't stick to a single style across its apps.</span><br> <p> Historically, Microsoft has been the worst offender in that respect.<br> <p> (Not unlike how Apple routinely violates its own UI guidelines. The same guidelines they doggedly require everyone else to follow)<br> </div> Tue, 13 Aug 2024 17:45:01 +0000 Fragmentation https://lwn.net/Articles/985528/ https://lwn.net/Articles/985528/ mathstuf <div class="FormattedComment"> There's definitely a lack of consistency. There are lots of client-side decorations on Windows and even Microsoft can't stick to a single style across its apps.<br> </div> Tue, 13 Aug 2024 17:33:07 +0000 I can't understand the logic for this https://lwn.net/Articles/985503/ https://lwn.net/Articles/985503/ mathstuf <div class="FormattedComment"> Qt predates the STL's release, so it made its own containers. In the process, it made decisions that the STL did not go with including CoW support for its containers. 
Now with move semantics that is less useful (e.g., forgetting a `const&amp;` on a `std::vector&lt;std::string&gt;` parameter was very costly before moves were a thing), but by then there was 15+ years of existing API patterns laid down and at that point it's just easier to add move constructors to the existing APIs than to uproot it all and go with the STL instead.<br> </div> Tue, 13 Aug 2024 16:52:58 +0000 False dichotomies, anyone? https://lwn.net/Articles/985489/ https://lwn.net/Articles/985489/ smurf <div class="FormattedComment"> <span class="QuotedText">&gt; which left building a custom desktop from the ground up.</span><br> <p> Well, no. They could have chosen some suitable wlroots-based compositor and added their configuration on top of that.<br> <p> </div> Tue, 13 Aug 2024 15:57:46 +0000 stable rust https://lwn.net/Articles/985494/ https://lwn.net/Articles/985494/ hunger <div class="FormattedComment"> You can build dynamic c-style libraries or binaries... what else is there? Both are leaf nodes from the perspective of the rust ecosystem.<br> </div> Tue, 13 Aug 2024 15:51:54 +0000 Fragmentation https://lwn.net/Articles/985487/ https://lwn.net/Articles/985487/ jzb <p>I don't <em>think</em> so, in general, but I would also admit I have a lot less current experience with Windows. As I understand it, things like title bar buttons (close, minimize, maximize) are pretty uniform for Windows apps even when running older programs.</p> <p>But it's possible my understanding is a bit flawed&mdash;I had Windows on a work laptop for about six months in 2022, and time may have blurred things a bit and/or my use might not have matched other peoples' experiences.</p> Tue, 13 Aug 2024 14:25:19 +0000 Fragmentation https://lwn.net/Articles/985438/ https://lwn.net/Articles/985438/ daenzer <div class="FormattedComment"> <span class="QuotedText">&gt; For users new to Linux, which would include much of System76's target audience, it is likely more confusing.</span><br> <p> Doesn't Windows have a similarly fragmented user experience? (Not saying it's a good thing per se, just that it might not be that confusing for most Linux newcomers)<br> </div> Tue, 13 Aug 2024 12:49:15 +0000 stable rust https://lwn.net/Articles/985437/ https://lwn.net/Articles/985437/ tzafrir <div class="FormattedComment"> That's a reasonable argument if rust is limited to leaf packages (specifically: executable binaries). What about any other type of shared code? Can it be a distribution package?<br> <p> A quick search in Debian shows many librust-$foo-dev packages.<br> </div> Tue, 13 Aug 2024 12:40:52 +0000 I can't understand the logic for this https://lwn.net/Articles/985435/ https://lwn.net/Articles/985435/ rrolls <div class="FormattedComment"> Hooray, I'm not alone :)<br> </div> Tue, 13 Aug 2024 12:10:27 +0000 stable rust https://lwn.net/Articles/985431/ https://lwn.net/Articles/985431/ mbukatov <div class="FormattedComment"> <span class="QuotedText">&gt; You do not insist on someone packaging header-only libraries in C or C++ either, do you?</span><br> <p> Why not? 
Fedora does that: <a href="https://docs.fedoraproject.org/en-US/packaging-guidelines/#_packaging_header_only_libraries">https://docs.fedoraproject.org/en-US/packaging-guidelines...</a> <br> </div> Tue, 13 Aug 2024 10:50:37 +0000 stable rust https://lwn.net/Articles/985430/ https://lwn.net/Articles/985430/ hunger <div class="FormattedComment"> Why do you care how the sources of dependencies are shipped?<br> <p> In the end cargo will download the same SHA from crates.io... either on your system, on the system of somebody vendoring all dependencies into one tarball to ship as part of the package build sources of cosmic-randr or as part of a package for each of the dependencies when those are packaged by the distribution<br> <p> In all cases the same code will end up in one self-contained, statically linked binary. The user of that binary will not need any of the sources of the dependencies. It really does not matter whether those are packaged or not.<br> <p> I really see no point whatsoever to package any library crate... they are just source code, nothing more. You do not insist on someone packaging header-only libraries in C or C++ either, do you?<br> </div> Tue, 13 Aug 2024 10:24:31 +0000 I can't understand the logic for this https://lwn.net/Articles/985429/ https://lwn.net/Articles/985429/ paulj <div class="FormattedComment"> No, MATE is the answer! GNOME 2 forever - the perfect desktop.<br> <p> Splitter!<br> </div> Tue, 13 Aug 2024 10:02:25 +0000 I can't understand the logic for this https://lwn.net/Articles/985426/ https://lwn.net/Articles/985426/ anselm <blockquote><em>Having a simple language to describe UI has a host of advantages</em></blockquote> <p> Indeed? We knew that in the early 1990s, with Tcl/Tk. </p> <p> (Tcl/Tk may not be everyone's cup of tea, but at the time it ran in circles around everything else in that space.) </p> Tue, 13 Aug 2024 09:58:29 +0000