
Maybe a hint?

Posted Jan 14, 2026 16:59 UTC (Wed) by Wol (subscriber, #4433)
In reply to: Maybe a hint? by wtarreau
Parent article: Debian discusses removing GTK 2 for forky

> So actually the problem might be the newer versions themselves if programs don't adopt them for that long.

Or there's an imbalance between resources available to GTK and other projects? It wouldn't surprise me if Gnome is throwing a lot of resource at GTK. I get the impression that there are only a couple of people actually working on gimp?

That's not to say upgrading your toolkit version isn't a good idea. It's just that projects may have better things to do with their limited time, than to climb on board the upgrade treadmill.

Cheers,
Wol



Maybe a hint?

Posted Jan 14, 2026 17:29 UTC (Wed) by pizza (subscriber, #46) [Link] (58 responses)

> Or there's an imbalance between resources available to GTK and other projects?

It's basically "just" this.

[barely-]unmaintained applications written against GTK2 (or heck, GTK1) don't have the resources to port to something more recent.

Maybe a hint?

Posted Jan 14, 2026 22:10 UTC (Wed) by fraetor (subscriber, #161147) [Link] (57 responses)

In retrospect, it's not entirely surprising that GTK2 apps lack the resources to be upgraded. GTK2 was around for what might have been the heyday of desktop GUI programming, but by the time GTK3 was mainstream the web had improved to such an extent that newer applications were written with web interfaces instead.

At least at my place of work, GTK2 is used for a lot of internal applications that nowadays would be written with web interfaces. Many of these applications are "finished", so there is little appetite to spend effort upgrading toolkits. When there are resources to spend on an upgrade, the result is most typically a web interface, or sometimes even a TUI.

Maybe a hint?

Posted Jan 15, 2026 1:04 UTC (Thu) by rgmoore (✭ supporter ✭, #75) [Link] (53 responses)

Many of these applications are "finished", so there is little appetite to spend effort upgrading toolkits.

I think this gets at a very deep and fundamental divide that creates a lot of problems like this. Some people, especially people who develop languages and programming tools, see development as a never ending process: there is always more work to be done, new versions to be developed, etc. Other people are trying to build a tool to solve one specific problem, and once that problem is solved they see the program as finished with the exception of fixing any bugs that are discovered along the way. The people who see development as a never ending process will eventually get to the point they want to make backwards incompatible changes, which strands the people who depended on their language or tools and who considered their projects to be finished.

It's a tough problem, because either way you're trying to force somebody into doing programming work they don't want to do and didn't sign up for. Either tool and language makers are forced to continue supporting old versions they think are obsolete, or programmers are forced to constantly update programs for no real benefit to their program's functionality.

Maybe a hint?

Posted Jan 15, 2026 1:40 UTC (Thu) by pizza (subscriber, #46) [Link]

> The people who see development as a never ending process [....]

...are probably being paid (or otherwise getting some tangible benefits) for their efforts.

Whereas the ones that see their applications as "finished" (ie "solves the problem it was written for") do not.

Maybe a hint?

Posted Jan 15, 2026 3:49 UTC (Thu) by pabs (subscriber, #43278) [Link] (33 responses)

I wonder why people without lots of resources for maintenance add dependencies on projects with a reputation for fast development and backwards incompatibility in the first place.

There are better options for them like using slower-moving/finished toolkits, writing minimal custom toolkits, TUIs, CLIs or just making libraries and leaving UIs to separate projects.

Maybe a hint?

Posted Jan 15, 2026 4:16 UTC (Thu) by dskoll (subscriber, #1630) [Link] (2 responses)

Sometimes people pick what they know, or what they want to learn, or what all the cool kids are using. And they don't think beyond that.

I have some GUI programs written in Tcl/Tk which is used nowadays by almost nobody and people generally scoff about... but my programs written in 2002 still work perfectly with modern Tcl/Tk.

Maybe a hint?

Posted Jan 15, 2026 7:02 UTC (Thu) by pabs (subscriber, #43278) [Link]

I love that `git gui` and `gitk` are written in Tcl/Tk because it means they are very unlikely to go away.

Tcl/Tk

Posted Jan 25, 2026 21:30 UTC (Sun) by pschneider1968 (guest, #178654) [Link]

FWIW, Tcl/Tk is still used heavily in the Scid Chess DB software (see https://scid.sourceforge.net/ ) which I use on an almost daily basis to enter, analyze and maintain my chess games. It's a great piece of software, and the most viable FOSS alternative to the proprietary and costly Chessbase.

Maybe a hint?

Posted Jan 15, 2026 12:20 UTC (Thu) by pizza (subscriber, #46) [Link] (3 responses)

>I wonder why people without lots of resources for maintenance add dependencies on projects with a reputation for fast development and backwards incompatibility in the first place.

Uh, GTK2's (directly supported) lifecycle was just shy of two decades. Actively-supported GTK3 is well over a decade into its lifecycle, and GTK4 is about five years in.

If that is "fast development" then I'd hate to see what you'd call its more "modern" contemporaries.

Maybe a hint?

Posted Jan 16, 2026 13:20 UTC (Fri) by wtarreau (subscriber, #51152) [Link] (2 responses)

Maybe what's missing is a "gtk2-on-3" wrapper that presents gtk2 API and semantics from gtk3 ?

GTK-2-on-3 wrapper

Posted Jan 17, 2026 8:33 UTC (Sat) by gioele (subscriber, #61675) [Link] (1 responses)

> Maybe what's missing is a "gtk2-on-3" wrapper that presents gtk2 API and semantics from gtk3 ?

That's kind of hard to do because GTK 2 and GTK 3 are paradigmatically different. It is not just a matter of updating a couple of functions from a deprecated API to a new one. For example, the base "class" GtkObject is gone, and pretty much all APIs related to custom widgets or surface drawing have been removed and replaced by Cairo.

And even if a GTK 2-to-3 wrapper were possible, a GTK 3-to-4 one is categorically impossible (or possible only in very limited cases) because the API for the main event loop has changed. (And all the styling is fundamentally different, so an application would look seriously broken, with widgets all over the place.)

A GTK 2-to-3 wrapper would only buy us a few years before GTK 3 is fully abandoned upstream and we are back to the same discussion.

Compatibility layers

Posted Jan 18, 2026 9:39 UTC (Sun) by swilmet (subscriber, #98424) [Link]

The GTK project removes *all* deprecated API in the next major version. So to ease the port, compatibility layers could be created for some features.

For example GtkTreeView (tree and list widget, quite a complex beast) is deprecated in GTK 4, so it'll go away in GTK 5. GtkTreeView could be maintained in a separate module to still provide that API for GTK 5.

The same for GtkUIManager (deprecated in GTK 3 and so removed from GTK 4). It could be maintained for GTK 4 as a compatibility layer.

Perhaps it's time to create a gtk-legacy module. Classic desktop environments (Cinnamon and its X-apps, or MATE) would benefit from it.

Another, complementary way would be a gtk-future module, back-porting some new APIs to a previous major version.

Maybe a hint?

Posted Jan 15, 2026 14:03 UTC (Thu) by epa (subscriber, #39769) [Link]

Back in the day GTK 2 was arguably the standard user interface toolkit in many Linux environments (broadly speaking, those aligned with GNOME rather than KDE). It would have seemed like a reasonable choice if you wanted some kind of "enterprise" support, since Red Hat's own installer (Anaconda) used it, as did many of the bundled desktop applications, even if the big names like Firefox and LibreOffice went their own way.

As I recall it the GTK 3 release was accompanied by a big announcement that it would guarantee backwards compatibility, though only in the future, of course.

Maybe a hint?

Posted Jan 15, 2026 18:38 UTC (Thu) by rgmoore (✭ supporter ✭, #75) [Link] (24 responses)

I think the same attitude that encourages people to think about projects as finished means they don't think very hard about long-term maintainability. Their goal is to write a program that achieves their goal, and then they'll move on to the next project. They're going to pick whatever language or tool makes the job of writing their program easy, and they never think about where they're going to be in a year, much less 10.

One could just as easily ask why someone developing an environment for other people to use is so careless with those people's time. Every time they deprecate an API because it's too hard to maintain, they're pushing the work of adapting to the new API onto their downstream. The effort required by all their downstream users adapting to the new API is probably many times greater than the effort required to maintain it, so it's a huge net loss. There may be some times when breaking an API is still necessary- I think the change in strings between Python 2 and 3 is probably a good example- but language and library writers need to think long and hard before making breaking changes.

Maybe a hint?

Posted Jan 15, 2026 18:58 UTC (Thu) by pizza (subscriber, #46) [Link] (6 responses)

> I think the same attitude that encourages people to think about projects as finished means they don't think very hard about long-term maintainability.

Or perhaps one should see this from a different perspective -- "I wrote this software to solve an immediate problem that I had, and I'm providing it AS-IS WITH NO WARRANTY WHATSOEVER. Anyone expecting or demanding that I continually update or otherwise maintain this software until the day I die can perform an anatomically improbable act with themselves."

> They're going to pick whatever language or tool makes the job of writing their program easy, and they never think about where they're going to be in a year, much less 10.

....welcome to human nature?

Maybe a hint?

Posted Jan 15, 2026 23:18 UTC (Thu) by rgmoore (✭ supporter ✭, #75) [Link] (5 responses)

I don't mean to sound overly critical! I've written software that is still in use long, long after I ever expected it to remain in service, and it's 100% luck that I wrote it in a language that hasn't required endless maintenance since then. I'm definitely a member of the crew who wants to write it and forget it. I wish that we could stop the constant churn of underlying technologies that drive the development churn. I understand that we can't stop this stuff completely- we do learn things that need to be incorporated into language and tool design- but there seems to be an awful lot that is driven by software companies' desire to sell new licenses rather than anything that genuinely needs updating.

Maybe a hint?

Posted Jan 18, 2026 9:50 UTC (Sun) by swilmet (subscriber, #98424) [Link] (4 responses)

In GTK's case, the deprecations/removals are because there are just a handful of GTK core developers/maintainers. They cannot reasonably maintain forever all APIs, which is understandable.

However as noted in an earlier comment, a "gtk-legacy" and "gtk-future" modules could be created to provide partial compatibility layers.

Maybe a hint?

Posted Jan 18, 2026 12:00 UTC (Sun) by pizza (subscriber, #46) [Link] (3 responses)

> In GTK's case, the deprecations/removals are because there are just a handful of GTK core developers/maintainers. They cannot reasonably maintain forever all APIs, which is understandable.

Absolutely.

>However as noted in an earlier comment, a "gtk-legacy" and "gtk-future" modules could be created to provide partial compatibility layers.

Maintained by whom, exactly? The same folks that "understandably" removed all of the old APIs because they don't have the resources to "reasonably maintain them forever"?

If that maintenance is by some other mythical entity, what stops them from simply maintaining GTK2 in its current state?

Maybe a hint?

Posted Jan 19, 2026 3:11 UTC (Mon) by swilmet (subscriber, #98424) [Link] (2 responses)

What I had in mind is that big projects like Linux Mint, MATE, Ardour, GIMP and others would coordinate some efforts to ease GTK usage outside what GTK maintainers are able to provide. What I've achieved for text editor projects is a few libgedit-* modules. For example libgedit-amtk for extending GTK 3 with an alternative API to create menus and toolbars (because GtkUIManager is deprecated in GTK 3).

Maybe a hint?

Posted Jan 19, 2026 17:08 UTC (Mon) by rgmoore (✭ supporter ✭, #75) [Link] (1 responses)

What I had in mind is that big projects like Linux Mint, MATE, Ardour, GIMP and others would coordinate some efforts to ease GTK usage outside what GTK maintainers are able to provide.

I don't know that those projects have the resources to do what you're suggesting. For example, GIMP is still in the process of fully transitioning from their old libraries to GEGL. They don't have spare developers to devote to maintaining old GTK; if they had more resources they'd want to devote them to speeding up development of their graphics stack rather than maintaining underlying libraries. I agree that maintaining those old libraries would be a worthwhile task, but I don't think projects that are already operating on a shoestring are the place to find the resources.

Maybe a hint?

Posted Jan 20, 2026 9:23 UTC (Tue) by swilmet (subscriber, #98424) [Link]

At this point we can say that probably almost all FLOSS desktop components are under-resourced, with rare exceptions (thinking of Qt, Blender, maybe LibreOffice?).

Maybe a hint?

Posted Jan 15, 2026 23:15 UTC (Thu) by Wol (subscriber, #4433) [Link] (15 responses)

> and they never think about where they're going to be in a year, much less 10.

Or maybe they HAVE thought about where they're going to be in a year, and the spec says "this software is feature-complete"?

I bang on about truth tables, but if you've done your truth table, and addressed every option, what else is there to do?

Cheers,
Wol

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 12:31 UTC (Fri) by farnz (subscriber, #17727) [Link] (8 responses)

The problem with the whole idea there is that you never have a complete and fully explicit specification for any software. There are always parts that are implicit (and that you may not know about at all, because you meet them accidentally, like "needs to have an attractive colour scheme" when you have good taste in colours), and there are pieces that are time- or environment-dependent (like "doesn't cost too much in terms of low-end consumer connectivity", which in 1999 meant thinking about how to minimise the number of seconds spent online, and in 2025 means thinking about how to minimise the number of bytes used).

That makes it very hard to accurately declare yourself "feature-complete", because you don't have a complete and accurate spec, and that's a requirement to declare yourself complete.

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 13:59 UTC (Fri) by pizza (subscriber, #46) [Link] (7 responses)

> That makes it very hard to accurately declare yourself "feature-complete", because you don't have a complete and accurate spec, and that's a requirement to declare yourself complete.

You're over-thinking this, applying formal engineering principles to something that ... isn't.

The _actual_ spec involved: "It does what I need it to do"

Granted, there's usually "...without any problems too annoying for me to ignore" implicitly tacked on. [1]

With that in mind, yes, software is quite easily "finished" from the perspective of the person writing it.

...and as I so often point out here, if random other folks out there disagree, they can try to persuade the author to change their mind [2] [3] or fix the problem to their own satisfaction. Otherwise... <taps the sign>

[1] which in turn implies "..and/or worth the effort to fix"
[2] As opposed to "demand / berate / complain / threaten / ... "
[3] I've found that home-baked desserts with a side of cash do wonders.

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 15:49 UTC (Fri) by farnz (subscriber, #17727) [Link] (6 responses)

No, I'm saying that the formal engineering principle of "fully meets the specification, therefore it's complete and needs no further work" rarely, if ever, applies to software, because the specification is itself incomplete in the ways we're both describing.

Now, you can declare that you're not working on this any more (for any reason - not just because you think it's "good enough"), and that's separate. But it's extremely rare (I cannot think of software in this state, not even TeX meets it) for you to have a complete enough specification that you can declare the software "complete against the specification".

Separate to that is the problem with people neither being able (or, in some cases, willing) to fund maintainers doing "no change that I notice", nor to accept "this stops being maintained". At some point, something has to give - either people find ways to fund maintenance that meets their needs, or they have to accept that sometimes, good things come to an end. And in some ways, that's harder with Free software, since at least with proprietary software, the usual way for good things to come to an end was for the company to go bankrupt (with rare exceptions), and thus it's clear why you can't buy it any more, while with Free software, it's people walking away from the project.

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 16:20 UTC (Fri) by pizza (subscriber, #46) [Link] (5 responses)

> No, I'm saying that the formal engineering principle of "fully meets the specification, therefore it's complete and needs no further work" rarely, if ever, applies to software, because the specification is itself incomplete in the ways we're both describing.

Uh, the spec is, by definition, what determines if something has been "completed" or not.

...Whether or not the spec itself is "complete" (according to whom, exactly?) or can ever be considered truly "complete" is an entirely separate issue.

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 17:39 UTC (Fri) by farnz (subscriber, #17727) [Link] (4 responses)

No - if the specification is incomplete, then you cannot say if something is completed or not, since when the gaps in the specification are filled in, you may find out that something you thought was correct against the specification was, in fact, only correct against your assumptions.

A specification is complete if there are no gaps in the specification, where a relevant part of the system is not specified, but it's assumed that people's "best efforts" will result in the required outcome happening anyway. In other words, the specification is only complete if, for all things you care about, the specification says (either directly, or by reference) what's acceptable, rather than being silent on that point, and the only things that are unspecified are things where it does not matter to you.

It's very common for specifications to be incomplete - for example, the specification for rewiring a building I worked in did not say that if all switches are in the same position, power should either be consistently off, or consistently on, because there's a local norm that means that most electricians will do that anyway, as a side effect of it being considered part of doing a good job.

And this is a big deal with software, since we have relatively few reference documents that would provide most of a spec by reference; the rewire job brought in around 700 pages of electrical specification by reference to a standard, for example, and still missed something that the building owner cared about, because he didn't even realise that it was something that "should" be specified somewhere. As it happened, there was one set of switches where doing that was relatively challenging, and the contractor didn't bother - he chose to fill in the gap in the specification with his own preferences, and the building owner had to pay to have it redone, but this time with the gap in the specification filled in.

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 20:22 UTC (Fri) by pizza (subscriber, #46) [Link] (3 responses)

> No - if the specification is incomplete, then you cannot say if something is completed or not.

I'm sorry, but a "complete specification" dwells in the realm of spherical cows, and as such is utterly irrelevant for things built in the real world.

Problem with the concept of "feature-complete"

Posted Jan 18, 2026 11:57 UTC (Sun) by farnz (subscriber, #17727) [Link] (2 responses)

That's exactly my point - if you're saying that this thing is "feature complete and will never need further maintenance", you're also implying that the specification against which you claim it's "feature complete and will never need further maintenance" is also itself complete and will never need updating.

And I think we'd be a lot better off as a wider software community if we stopped trying to pretend that there is such a thing as a software project that's "completed, will never need further work", and accepted that software that's not maintained is a liability.

Of course, that doesn't force anyone to maintain it - you can abandon anything for any reason - but it does mean that if you're using unmaintained software, you need to be aware that you're taking on that liability.

Problem with the concept of "feature-complete"

Posted Jan 18, 2026 12:32 UTC (Sun) by pizza (subscriber, #46) [Link] (1 responses)

> And I think we'd be a lot better off as a wider software community if we stopped trying to pretend that there is such a thing as a software project that's "completed, will never need further work", and accepted that software that's not maintained is a liability.

Ok, congratulations... by your definition nearly all software is "unmaintained" and is therefore a "liability". What's step two?

> Of course, that doesn't force anyone to maintain it - you can abandon anything for any reason - but it does mean that if you're using unmaintained software, you need to be aware that you're taking on that liability.

In other words... the "AS-IS, NO WARRANTY WHATSOEVER" status quo of nearly every software package ever?

Maintenance costs time and money. Taking on liability for something also comes with a price premium. Who's going to pay for those costs? Hint: It's going to be end-users that benefit from said software, not "the wider software community" (which will at best just pass any costs through)

Problem with the concept of "feature-complete"

Posted Jan 19, 2026 15:15 UTC (Mon) by farnz (subscriber, #17727) [Link]

Step 2 is either to arrange maintenance for the software you care about - whether you do it yourself, or pay someone to do it for you doesn't matter - or to accept that the software is at risk of having problems in the future, and you're going to have to address those problems when they come up one way or another. Addressing those problems, in turn, could be "I'll fix them when I know about them", but it could also be "if they hit, I'll accept my system being wiped - my fault for not maintaining things properly".

The key is to stop imagining that non-trivial software can practically be "finished and safe to use as-is forever". Either it's being maintained, and therefore the maintenance work will keep it in "safe to use" condition, or it's not being maintained, and you're at risk of finding a critical bug that means you can no longer safely use it.

It also changes how product liability law in the EU sees offers of free stuff (and is part of why early CRA drafts got things so badly wrong for Free and open source software). Gifting unfinished things is something that incurs liability in part to stop manufacturers having the bright idea of selling you just enough of the product that the free bit is useless without the paid-for bit, while gifting you the rest so it's not part of the paid-for bits. In contrast, it's understood by product liability law that gifting you something that's in what I considered working order at the time I gifted it does not put me on the hook for providing free maintenance into the future.

Maybe a hint?

Posted Jan 16, 2026 13:43 UTC (Fri) by koflerdavid (subscriber, #176408) [Link] (5 responses)

If the tool will be needed for longer than about five years (just ballparking that number) then you really need to consider the software lifecycle of your dependencies as well.

Maybe a hint?

Posted Jan 16, 2026 14:08 UTC (Fri) by pizza (subscriber, #46) [Link] (2 responses)

> If the tool will be needed for longer than about five years (just ballparking that number) then you really need to consider the software lifecycle of your dependencies as well.

The mistake you (and far too many others) keep making is by treating "some random person wrote some software and released it on the internet" as a binding commitment to perpetually support and maintain said software. Worse yet, do so in accordance with <formal software engineering process>. Even worse yet, do so for free.

Maybe a hint?

Posted Jan 22, 2026 10:31 UTC (Thu) by davidgerard (guest, #100304) [Link] (1 responses)

The case at hand is Red Hat, not a single volunteer FOSS developer.

Maybe a hint?

Posted Jan 22, 2026 11:02 UTC (Thu) by farnz (subscriber, #17727) [Link]

If it's Red Hat, then you should be able to negotiate longer support via your support contract with Red Hat.

Else, Red Hat is just another volunteer FOSS developer. They might be a big one, thanks to the money they make selling support contracts for FOSS, but unless you're paying them to fix things for you, they're a volunteer.

Maybe a hint?

Posted Jan 16, 2026 14:25 UTC (Fri) by pizza (subscriber, #46) [Link]

> you really need to consider the software lifecycle of your dependencies as well.

In the case of GTK2, that was *eighteen years*.

Goes to show you that no matter how long the support period for something is, it will never be long enough.

Maybe a hint?

Posted Jan 23, 2026 19:32 UTC (Fri) by rgmoore (✭ supporter ✭, #75) [Link]

The design lifespan and actual lifespan frequently don't line up very well. As the saying goes, there is nothing more permanent than a temporary solution. The quick and dirty solution turns out to be good enough and is never replaced with the well-defined, perfectly executed follow-up. We've all seen it happen, so we shouldn't be surprised when it happens again.

I think what this actually shows is that our intuition- or my intuition, at least- about the lifecycle of dependencies is backward. My intuition is that big important projects need to focus on stable dependencies, while quick and dirty ones can use whatever is handy. In reality, big, well-maintained projects can afford to use less stable dependencies because they have the resources to deal with things changing under them. It's the quick and dirty projects that need to be built on the most stable base, because they're going to have to keep working without assistance.

Maybe a hint?

Posted Jan 16, 2026 17:23 UTC (Fri) by fraetor (subscriber, #161147) [Link]

I think it often relates to how things are funded.

In research software (my area) the majority of funding is allocated to time bounded projects. This often leads to a new bit of software being written for a specific purpose, which will then be minimally maintained for the next several years. If it is still being used but becomes difficult to run, then a new project usually ends up forming to implement a new version, which often will change several things to reflect newer requirements.

This probably isn't helped by research organisations often being used to running very old code, as it is often needed to reproduce results from a previous study.

Maybe a hint?

Posted Jan 15, 2026 5:45 UTC (Thu) by felixfix (subscriber, #242) [Link]

> once that problem is solved they see the program as finished

Maybe it's a basic human instinct. I get annoyed by repetitive tasks like having to take time to eat, even though I do like eating. Every time I have to pee, it annoys me to waste the minute or so; didn't I do this just a few hours ago? Taking out the garbage, cleaning kitchen counters, vacuuming, dusting, buying groceries, all annoy me more for their repetitiveness than anything else. Even checking LWN once a day is annoying for a split second; I did this yesterday!

Maybe a hint?

Posted Jan 15, 2026 10:43 UTC (Thu) by lunaryorn (subscriber, #111088) [Link] (16 responses)

I don't think Gtk did breaking releases just for the sake of it.

Gtk 2 was released in 2002, and it's reasonable to assume that its entire architecture modeled contemporary hardware. Let's remember how computers looked back then: no touch input, no multi-DPI setups, no HiDPI, few plug-and-play setups, no compositing, etc. I don't think we even had hardware-accelerated 2D drawing back then. That was a time when you had to reboot your computer to plug in a new mouse.

From this perspective I find it amazing that Gtk 2 actually managed to hold out for twenty years.

It's perhaps also worth noting that between Gtk 2 and Gtk 4, Qt, which has orders of magnitude more resources to maintain backwards compatibility, did three breaking major releases. Qt 3 was released at the end of 2001, and now we're at Qt 6. And that doesn't even consider the vast amount of changes C++ has seen since 2002.

Maybe a hint?

Posted Jan 15, 2026 14:07 UTC (Thu) by geert (subscriber, #98403) [Link] (7 responses)

Sure Linux had hardware accelerated 2D drawing in 2002! FWIW, Precision Insight demoed hardware accelerated 3D in 1999.

I didn't dig too deep, but even in 1994, XFree86 supported hardware acceleration on graphics chipsets from eight vendors:
https://www.nic.funet.fi/pub/linux/doc/html/install-guide...

Maybe a hint?

Posted Jan 15, 2026 14:43 UTC (Thu) by paulj (subscriber, #341) [Link]

There was hardware 3D before that, I think. John Carmack was working on Matrox G100 / G200 drivers for Linux in the late 90s (was that the start of Linux DRI? Or GLX? I don't remember the project(s)). 2D acceleration was definitely well established. The Matrox Mystique, Millennium and S3 ViRGE and some others had 2D acceleration.

Maybe a hint?

Posted Jan 15, 2026 14:51 UTC (Thu) by LionsPhil (subscriber, #121073) [Link] (5 responses)

Additionally, by 2002, USB was pretty mature for keyboards and mice.

A helpful snarky way to remember how old USB is is that the "plugging in a USB printer causing a BSOD on stage" Windows gaffe was showcasing new '98 features. :)

Maybe a hint?

Posted Jan 16, 2026 7:54 UTC (Fri) by lunaryorn (subscriber, #111088) [Link] (4 responses)

I believe the story on Linux is a bit different. According to Wikipedia Linux only got proper USB support with kernel 2.4 in 2001, and X.org didn't support hotplugging until much later (Xorg 1.4 in 2007). So presumably, when Gtk 2.0 was released in 2002, USB input was the hot new thing on Linux, and hotplugging was not a thing for years to come.

Maybe a hint?

Posted Jan 16, 2026 8:24 UTC (Fri) by mjg59 (subscriber, #23239) [Link] (3 responses)

That's a little misleading - support for USB input devices had landed fairly early in 2.2 (my recollection is that Linus got irritated with the existing rudimentary USB support and replaced it with something he wrote, but this is my recollection from over 25 years ago - I certainly had Linux running on USB-based Macs before 2.4 came out). And while hotplug wasn't directly supported by X, all mouse input was multiplexed into /dev/input/mice and all keyboard input went to the current terminal, so hotplugged devices would just about work as long as X was configured appropriately.

Maybe a hint?

Posted Jan 16, 2026 17:53 UTC (Fri) by rgmoore (✭ supporter ✭, #75) [Link] (2 responses)

I think the big thing that came with 2.4 was devfs, which made hot plug practical. In 2.2, /dev was static and had to be populated with every conceivable device when it was created, which wasn't really compatible with hotplug. With devfs, /dev became dynamic and could add new devices as they were connected. I certainly remember following 2.4 development very carefully because USB support was insufficient for ordinary desktop use. For example, I had to use my fancy new optical mouse through a USB-to-PS2 converter because USB support wasn't good enough.

Maybe a hint?

Posted Jan 19, 2026 10:27 UTC (Mon) by taladar (subscriber, #68407) [Link] (1 responses)

Devfs didn't last very long though before it was replaced by udev.

Maybe a hint?

Posted Jan 19, 2026 19:15 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

And eventually got replaced back by the new devfs :)

I actually used devfs well into 2010 on my own device (a RouterBoard) by porting it forward. With a minimal userspace it was easier than udev.

Maybe a hint?

Posted Jan 15, 2026 20:32 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link]

The low level of GTK3 was mostly fine. It was the overall UI/UX experience of GNOME3 that went off the rails.

They threw away literally EVERYTHING and had this "dynamic activities" model: https://youtu.be/S0lIpCntwv8?t=309

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 17, 2026 19:42 UTC (Sat) by anton (subscriber, #25547) [Link] (6 responses)

Touch input: None of my devices have touch input. And why would adding touch input cause a backwards-incompatible change?

Multi-DPI setups: That certainly existed in 2002. In any case, why would it cause a backwards-incompatible change?

HiDPI: Why would that cause a backwards-incompatible change?

Plug and play was a buzzword in the 1990s, with PCI clearly having it and ISA gaining PnP features. By 2002 PnP was universally available. I don't see that this has any relevance to GTK. Maybe you are thinking of hotplugging, which has been available since at least USB (introduced 1996, and also widely available in 2002). Why would that cause a backwards-incompatible change?

Compositing: I don't know what that's about, but why would it cause a backwards-incompatible change?

2D acceleration became important with Windows in the 1990s, and in 1994 or so the 2D-accelerated S3 928 was the thing to have, and in 1995 the Matrox Millenium was introduced and was the next thing to have (and I bought it). By the end of the 1990s, 3D acceleration became available, and I bought a Voodoo 3 3000 in 1998. By 2002, 3D acceleration was widely available. In any case, why would 2D or 3D acceleration cause a backwards-incompatible change to GTK?

Plugging in a mouse with a serial interface does not require rebooting. Plugging or unplugging a PS/2 mouse is supposedly safer if you power down the computer (we did some live unplugging and plugging and never had a hardware failure from that, though). Plugging or unplugging a USB mouse does not require rebooting. USB has been available since 1996.

And 32-bit colours, which you did not mention, were also available in 2002 (already the Matrox Millennium supported that).

What are reasons for backwards-incompatible changes?

  • You need to support something that does not fit in the existing interface, and where you cannot provide the existing interface in addition to the new one. Did this happen with GTK3 and GTK4? One would hope that they learn from that and make the interfaces sufficiently flexible that this does not happen again. One would hope that they have learned that lesson by GTK2.

    Contrasting example: The Linux kernel was released in 1991, when many of the hardware features you mentioned really were not there yet, yet there is no backwards-incompatible Linux2, Linux3, or Linux4 (admittedly, the removal of a.out in Linux 5.1 eliminated backwards compatibility with binaries created up to 1998).

  • You have a cool new idea for the interface to your clients, that you want to pursue, and you don't want to do the footwork to integrate it in a backwards-compatible way. From the descriptions I have read here, that's what happened with GTK3, and again with GTK4. That's ok, but then you need to be aware that you have now created an additional maintenance cost to someone: Either someone maintains the old and all the cool-new-idea interfaces, or the clients of your library will need to be rewritten for one of your cool new interfaces at some point, or these clients will become unusable (a cost to the users of these applications).
That being said, what is broken with GTK2 that some people want to eradicate it and all the applications that depend on it?

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 17, 2026 20:08 UTC (Sat) by pizza (subscriber, #46) [Link] (5 responses)

> Touch input: None of my devices have touch input. And why would adding touch input cause a backwards-incompatible change?

You are in a tiny, tiny, tiny minority.

But it's not that "touch input" is backwards-incompatible in and of itself, but rather that touch-oriented UIs need to be designed differently to be effective.

> Multi-DPI setups: That certainly existed in 2002. In any case, why would it cause a backwards-incompatible change?

No, the entire system ran at a single DPI. And $deity help you if you tried to change it from 96dpi.

> HiDPI: Why would that cause a backwards-incompatible change?

Because traditionally, the goal of higher resolution was to cram more onto the screen (making things smaller for a given screen size), whereas "HiDPI" is about keeping visual elements the same perceptual size on different resolution (but identically sized) screens.

The reason it's not "backwards compatible" is that this requires your UI layouts to be specified in resolution-independent units (as opposed to, say, "pixels").

> That being said, what is broken with GTK2 that some people want to eradicate it and all the applications that depend on it?

Other than the minor detail that many (most?) GTK2 applications directly rely on X11-isms, like having a fixed dpi for all UI elements, there's the more fundamental problem that GTK2 has been unmaintained for over five years. It turns out that the folks complaining aren't stepping up to maintain anything.

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 17, 2026 20:38 UTC (Sat) by Wol (subscriber, #4433) [Link]

> But it's not that "touch input" is backwards-incompatible in and of itself, but rather that touch-oriented UIs need to be designed differently to be effective.

And that many devices are now "touch only". Personally, I think that's awful, and my Windows computers are configured to disable touch if a mouse is present (which it almost always is). I wish I knew how to do it on Linux.

But at the end of the day, we now live in a world where "touch only" is the norm :-(

Cheers,
Wol

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 17, 2026 23:06 UTC (Sat) by anton (subscriber, #25547) [Link] (3 responses)

> You are in a tiny, tiny, tiny minority.

As demonstrated by the resounding success of Windows 8 (pre 8.1), designed for touch, right?

> But it's not that "touch input" is backwards-incompatible in and of itself, but rather that touch-oriented UIs need to be designed differently to be effective.

We have a lot of applications that are designed for being effective with a mouse, and are not going to be redesigned. They work for those who use a mouse. Why is there the drive to get rid of them?

> No, the entire system ran at a single DPI. And $deity help you if you tried to change it from 96dpi.

X11 fonts (bitmap fonts, not faces) come in 75dpi and 100dpi sizes (and that has already been so since around 1990, i.e., long before GTK2), and I can tell every application instance which font it should use. And I certainly had 107 dpi on my laptop screen (1024x768 on a 12" screen) and ~10 dpi on the 120" screen that the projector displayed on. I did not need $deity; it all worked fine.

> "HiDPI" is about keeping visual elements the same perceptual size on different resolution (but identically sized) screens.
>
> The reason it's not "backwards compatible" is that this requires your UI layouts to be specified in resolution-independent units (as opposed to, say, "pixels").

Such units may be a good idea if you have a clean slate, but if you have a legacy of applications to support, then the idea of applying a scale factor to UI elements looks better to me (and I have seen options for setting such scale factors in various GUIs).

Of course the clean slate looks very desirable to programmers compared to the mess of dealing with backwards compatibility, but for a library it means abandoning your client base, and making it clear to all prospective clients that they had better steer clear of your library.

Experience tells us that GTK4 and Wayland will be abandoned when the next shiny cool idea comes around.

> there's the more fundamental problem about how GTK2 has been unmaintained for over five years.

And the problem with that is what? I use lots of software that has been unmaintained for over five years.

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 18, 2026 4:20 UTC (Sun) by pizza (subscriber, #46) [Link] (2 responses)

> As demonstrated by the resounding success of Windows 8 (pre 8.1) designed for touch, right?

We're up to Windows 11 now; the overwhelming majority of the devices sold with it have touch screens.

> We have a lot of applications that are designed for being effective with a mouse, and are not going to be redesigned. They work for those who use a mouse. Why is there the drive to get rid of them?

You mean besides the fact that the overwhelming majority of devices sold for the past decade lack any other input mechanism other than a touchscreen, and even if you plug a physical keyboard+mouse into one, they can't run those applications anyway?

> X11 fonts (bitmap fonts, not faces) come in 75dpi and 100dpi sizes (and that has already been so around 1990, i.e., long before GTK2), and I can tell every application instance which font it should use. And I certainly had 107 dpi on my laptop screen (1024x768 on a 12" screen) and ~10 dpi on the 120" screen that the projector displayed on. I did not need $deity, it all worked fine.

Your projector and laptop screen both claimed to be 96dpi as far as X and all of its applications are concerned. Change that at your own peril; nearly every X11-native application will break because non-font elements will not scale, resulting in text that either overflows or is truncated by its bounding box. (most notably in menu bars, single-line input forms and labels). Ironically the applications that don't horribly break bypass X11 font rendering (along with most other X11 primitives) entirely, relying instead on client-side rendering and just slinging the resultant pixmaps to the X server. (In other words, the same paradigm that Wayland is built around)

> Such units may be a good idead if you have a clean slate, but if you have a legacy of applications to support, then the idea of applying a scale factor to UI elements looks better to me (and I have seen options for setting such scale factors in various GUIs).

Congratulations, you just answered your own question about why toolkit APIs needed to be restructured. And again, there's not currently a way to have X11 apply a blanket scale factor on an application-by-application (much less element-by-element) basis, because dpi is a global, immutable attribute.

> Experience tells us that GTK4 and Wayland will be abandoned when the next shiny cool idea comes around.

Experience tells us that GTK4 will be directly supported for at least a decade, and closer to two. Meanwhile, while Wayland won't last forever, it will remain relevant for at least the next two decades due to numerous long-support-lifecycle industries (eg automotive) basing their software stacks around it.

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 19, 2026 4:37 UTC (Mon) by jcelerier (guest, #181931) [Link] (1 responses)

> Your projector and laptop screen both claimed to be 96dpi as far as X and all of its applications are concerned. Change that at your own peril; nearly every X11-native application will break because non-font elements will not scale, resulting in text that either overflows or is truncated by its bounding box. (most notably in menu bars, single-line input forms and labels). Ironically the applications that don't horribly break bypass X11 font rendering (along with most other X11 primitives) entirely, relying instead on client-side rendering and just slinging the resultant pixmaps to the X server. (In other words, the same paradigm that Wayland is built around)

I've been using Xft.dpi to adjust DPI since 2014 and it's always worked for me, even for apps that AFAIK end up calling raw X11 drawing primitives (for instance PureData, which uses Tcl/Tk). Do you have examples of broken apps?

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 19, 2026 12:47 UTC (Mon) by pizza (subscriber, #46) [Link]

> I've been using Xft.dpi to adjust DPI since 2014 and it's always worked for me,

Xft.dpi is purely for font scaling; it leaves the X server's core dpi setting unchanged. It also only affects TrueType (i.e., scalable) font rendering, not "classic" X11 fonts, which are fixed bitmaps.

See: https://unix.stackexchange.com/questions/596765/is-x-dpi-...
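For completeness, Xft.dpi is usually set through X resources; a minimal sketch (144 is an arbitrary example value):

```
! ~/.Xresources: scale client-side (Xft/fontconfig) font rendering to 144 dpi
Xft.dpi: 144
```

Load it with `xrdb -merge ~/.Xresources`; as noted above, core-protocol bitmap fonts and the server's reported dpi are unaffected.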

Maybe a hint?

Posted Jan 15, 2026 9:37 UTC (Thu) by kleptog (subscriber, #1183) [Link] (2 responses)

> At least at my place of work, GTK2 is used for a lot of internal applications that nowadays would be written with web interfaces. Many of these applications are "finished", so there is little appetite to spend effort upgrading toolkits.

Honestly, these kinds of projects are better served by creating a Dockerfile that builds the project in the last Debian release where it worked and shipping that. Then you have compatibility until the kernel interfaces change, which is a lot longer. For internal applications, security updates can only break things.

In theory, people could migrate all those old applications to something like Flatpak distribution, then the problem goes away.
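A minimal sketch of that approach, assuming a hypothetical autotools-based GTK2 application (the base image, package list, binary name, and build steps are placeholders; substitute the last Debian release where the app actually built):

```dockerfile
# Hypothetical example: freeze a GTK2 app against an old Debian userland.
FROM debian:bullseye
RUN apt-get update && \
    apt-get install -y --no-install-recommends build-essential libgtk2.0-dev && \
    rm -rf /var/lib/apt/lists/*
COPY . /src
WORKDIR /src
RUN ./configure && make && make install
# At run time, forward the host X11 socket, e.g.:
#   docker run --rm -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix myapp-image
CMD ["myapp"]
```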

Maybe a hint?

Posted Jan 15, 2026 16:05 UTC (Thu) by LtWorf (subscriber, #124958) [Link] (1 responses)

That only works as long as your Wayland compositor remains X11-compatible, which is probably a much shorter window.

Also, the kernel does sometimes break compatibility.

Maybe a hint?

Posted Jan 15, 2026 17:32 UTC (Thu) by Tarnyko (subscriber, #90061) [Link]

Fair statement, but I doubt Xwayland is leaving anytime soon.

As was already said, such an app would benefit from a proper .AppImage version; it should work everywhere if done correctly.

Maybe a hint?

Posted Jan 14, 2026 18:00 UTC (Wed) by LionsPhil (subscriber, #121073) [Link] (1 responses)

GTK3 and onward made some fairly significant design pivots in places, IIRC. To the project's credit, they have documentation covering both migrations, but you can see it is not a "bump the version number and swat at any deprecation warnings" kind of weekend task if your application did anything nontrivial:

https://docs.gtk.org/gtk3/migrating-2to3.html
https://docs.gtk.org/gtk4/migrating-3to4.html

...along with that, it's more or less impossible *not* to change the UX in the process. Not just from theme breakage (that ship's largely sailed, unfortunately), but from UI patterns (like how the modern GTK/GNOME world likes putting primary buttons in titlebars, and will start doing so if you use the stock file chooser).

Maybe a hint?

Posted Jan 17, 2026 1:20 UTC (Sat) by mirabilos (subscriber, #84359) [Link]

My window manager does not even *have* title bars…

… and Gtk+3 programs are so ugly, compared to even Gtk+2, which I already was no fan of.


Copyright © 2026, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds