
Debian discusses removing GTK 2 for forky

By Joe Brockmeier
January 14, 2026

The Debian GNOME team would like to remove the GTK 2 graphics toolkit, which has been unmaintained upstream for more than five years, and ship Debian 14 ("forky") without it. As one might expect, however, there are those who would like to find a way to keep it. Despite its age and declared obsolescence, quite a few Debian packages still depend on GTK 2. Many of those applications are unlikely to be updated, and users are not eager to give them up. Discussion about how to handle this is ongoing; it seems likely that Debian developers will find some way to continue supporting applications that require GTK 2, but users may have to look outside official Debian repositories.

GTK 2 was released in 2002 and was declared end of life with the release of GTK 4 on December 16, 2020; the final release, 2.24.33, was published a few days later. The GTK project currently maintains two stable branches—GTK 3.x ("oldstable") and GTK 4.x ("stable"). The GTK 3.x branch will be maintained until the project releases GTK 5, and the project has not yet announced any firm plans for such a release.

On January 7, Matthias Geiger announced that Debian's GNOME team has a goal of removing GTK 2 from forky before it is released in 2027; in addition to being unmaintained, he said, it lacks native Wayland support and features needed for HiDPI displays.

Geiger pointed out that Debian would not be alone in dropping GTK 2, as Arch Linux and Red Hat Enterprise Linux (RHEL) have already done so. Arch moved the gtk2 package and those that depend on it out of the official repositories and into the Arch User Repository (AUR) in October 2025, and RHEL dropped support for GTK 2 with the release of RHEL 10 in May 2025. It might be worth noting, however, that Red Hat will still be on the hook to support GTK 2 in RHEL 9 through 2032, and it is still packaged for current Fedora releases as well as EPEL 10.

Developers maintaining packages with GTK 2 dependencies have had ample time to consider options and alternatives; Simon McVittie reported bugs against packages that still depended on GTK 2 in April 2020, about eight months before it went end of life. At that time more than 640 packages still relied on it, so he also started a discussion about "minimizing the amount of GTK 2 in the archive", though he acknowledged the difficulty in getting rid of it entirely:

GTK 2 is used by some important productivity applications like GIMP, and has also historically been a popular UI toolkit for proprietary software that we can't change, so perhaps removing GTK 2 from Debian will never be feasible. However, it has definitely reached the point where a dependency on it is a bug - not a release-critical bug, and not a bug that can necessarily be fixed quickly, but a piece of technical debt that maintainers should be aware of.

Now, almost six years later, just over 150 packages still carry a dependency on GTK 2; far fewer than in 2020, but still a significant number. GIMP, for example, moved to GTK 3 with its 3.0 release in March 2025. And, as Geiger noted, one of the blockers to ridding Debian of GTK 2 entirely is the fact that Debian's graphical installer still depends on it.
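Counts like these come from scanning the archive's Packages metadata for Depends fields that name the GTK 2 runtime (libgtk2.0-0 on Debian). A minimal sketch of that kind of scan, using inline sample data rather than the real archive (the package stanzas below are illustrative, not a snapshot of Debian):

```python
# Sketch: find packages whose Depends field names a given library package.
# SAMPLE_PACKAGES is made-up sample data in Packages-file style.
SAMPLE_PACKAGES = """\
Package: hexchat
Depends: libc6 (>= 2.34), libgtk2.0-0 (>= 2.24.0), libssl3

Package: gimp
Depends: libc6 (>= 2.34), libgtk-3-0 (>= 3.24.0), libgegl-0.4-0
"""

def reverse_depends(packages_text, target):
    """Yield names of packages whose Depends field mentions `target`."""
    name = None
    for line in packages_text.splitlines():
        if line.startswith("Package:"):
            name = line.split(":", 1)[1].strip()
        elif line.startswith("Depends:") and name:
            # Strip version constraints like "(>= 2.24.0)" from each entry.
            deps = [d.strip().split()[0]
                    for d in line.split(":", 1)[1].split(",")]
            if target in deps:
                yield name

print(list(reverse_depends(SAMPLE_PACKAGES, "libgtk2.0-0")))  # ['hexchat']
```

On an actual Debian system, `apt-cache rdepends libgtk2.0-0` performs the equivalent query against the live archive metadata.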

Jonathan Dowland said that the Debian GNOME team should not have to maintain GTK 2 if it does not want to. But, he argued, the correct thing for the team to do is to orphan the package to see if others are willing to maintain it.

I respect your opinion that Debian would be better off without GTK2 in the archive. However, I don't agree with it. The two pillars of my position are: removing this forces the removal of useful dependent programs in the archive which have active users; it also makes it more difficult for users to run dependent programs *outside* the archive, including software of historical significance. IMHO this falls foul of [Debian Social Contract section 4].

The section in Debian's Social Contract that Dowland refers to is "our priorities are our users and free software". It states that Debian will place the needs of users first, and not object to non-free works intended to be used on Debian systems. Since there are non-free works that depend on GTK 2, and are unlikely to be ported to later versions, one could argue maintaining the toolkit is in users' best interest.

Some would disagree with that, though. Emilio Pozuelo Monfort said "with my Release Team hat on" that it would be a disservice to keep shipping GTK 2 in new releases, since it has been dead upstream so long:

We don't ship every old library just because someone could make use of it. There is a maintenance cost to that. See e.g. QT4, libsdl1.2 and so many others that could have been kept for similar reasons. Perhaps those old packages also need us to ship GCC 5 or an old cmake. That's a slippery slope.

He suggested that there was still time to port useful packages to GTK 3 or GTK 4. Dowland shot that idea down, though; some of the applications, such as Hexchat, were not likely to be ported to a new version, ever. He also noted that later versions of GTK were "simply not equivalent" and would require fundamental design changes. Some applications are still using GTK 2, he suspected, because the developers would rather not use later versions.

Dowland also said that it was important to recognize that Debian does not "have a cast-iron rule that we apply even-handedly to every library". There is no Debian policy that developers can simply point to as guidance for GTK 2 or other libraries that have reached the end of life. Gioele Barabucci put out some ideas about conditions for keeping legacy libraries, which sparked a brief discussion about the security concerns related to forks of unmaintained software.

Outside Debian

One possibility that Geiger raised would be to move GTK 2 packages to Debian's Debusine instance, which is open to all Debian developers and maintainers. Debusine is a project developed by Freexian; it provides tools for developing Debian derivatives, including package building and hosting APT repositories. In December, Colin Watson announced that Debian's instance would allow Debian developers and maintainers to create "APT-compatible add-on package repositories" similar to Ubuntu's personal package archives (PPAs). Geiger suggested that GTK 2 and any dependent packages could be moved to Debusine rather than keeping them in Debian's official archives.
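If such an add-on repository materializes, users would consume it like any other APT source, via a deb822-style file under /etc/apt/sources.list.d/. A sketch of what that might look like; the URL, suite, and keyring path are hypothetical placeholders, not actual Debusine endpoints:

```
# /etc/apt/sources.list.d/gtk2-addon.sources
# All values below are hypothetical placeholders.
Types: deb
URIs: https://debusine.example.org/gtk2-addon
Suites: forky
Components: main
Signed-By: /usr/share/keyrings/gtk2-addon-keyring.gpg
```

This is the same mechanism Ubuntu's PPAs ultimately rely on: an ordinary signed APT repository that sits outside the distribution's official archive.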

Another option would be to create an upstream fork of GTK 2 and package it in Debian. Adam Sampson observed that the Ardour digital-audio-workstation project has created its own fork of the toolkit. However, it is unclear whether the project has any interest in maintaining a generic fork of GTK 2 suitable for use beyond the needs of Ardour's user interface.

McVittie discouraged the idea of maintaining a fork of the toolkit. He argued that software that no longer had an active upstream "seems to have a tendency to soak up a disproportionate amount of time outside the immediate package". He also raised the option of Debusine as a way out of keeping GTK 2 in Debian, and noted that it would be similar to Arch moving the toolkit to the AUR.

Time marches on (and on)

The discussion is still ongoing. Dowland expressed optimism that the issue would be resolved in due course. What the solution looks like is still to be determined, but it seems likely forky users will have some way to obtain GTK 2 if necessary.

A broader solution that applies beyond GTK 2 might be in order, though. The scenario of "such-and-such is obsolete, unmaintained, and at the end of its life" has been popping up often over the past few years, and it will continue to do so with increasing frequency. As time goes on, the pile of "old" software (and hardware) that is still in use will continue to grow—and it will grow faster than the number of people interested in doing the work of porting to new libraries just because older libraries have been abandoned upstream.




Maybe a hint?

Posted Jan 14, 2026 16:44 UTC (Wed) by wtarreau (subscriber, #51152) [Link] (95 responses)

I don't follow such versions, but I don't remember ever having noticed anything newer than gtk2 in dependencies, even though I'm now seeing them on my system. But if gtk3 was released in 2011 and Gimp itself, where all of this originated from, only adopted it 14 years later, while gtk4 had been available for 5 years already, surely this is a hint that something there probably doesn't perfectly match users' expectations, whether it's related to massive code changes, a different architecture or anything else. So actually the problem might be the newer versions themselves if programs don't adopt them for that long.

Maybe a hint?

Posted Jan 14, 2026 16:59 UTC (Wed) by Wol (subscriber, #4433) [Link] (61 responses)

> So actually the problem might be the newer versions themselves if programs don't adopt them for that long.

Or there's an imbalance between resources available to GTK and other projects? It wouldn't surprise me if Gnome is throwing a lot of resource at GTK. I get the impression that there are only a couple of people actually working on gimp?

That's not to say upgrading your toolkit version isn't a good idea. It's just that projects may have better things to do with their limited time, than to climb on board the upgrade treadmill.

Cheers,
Wol

Maybe a hint?

Posted Jan 14, 2026 17:29 UTC (Wed) by pizza (subscriber, #46) [Link] (58 responses)

> Or there's an imbalance between resources available to GTK and other projects?

It's basically "just" this.

[barely-]unmaintained applications written against GTK2 (or heck, GTK1) don't have the resources to port to something more recent.

Maybe a hint?

Posted Jan 14, 2026 22:10 UTC (Wed) by fraetor (subscriber, #161147) [Link] (57 responses)

In retrospect, it's not entirely surprising that GTK2 apps lack the resources to be upgraded. GTK2 was around for what might have been the heyday of desktop GUI programming; however, by the time GTK3 was mainstream the web had improved to such an extent that newer applications were written with web interfaces instead.

At least at my place of work, GTK2 is used for a lot of internal applications that nowadays would be written with web interfaces. Many of these applications are "finished", so there is little appetite to spend effort upgrading toolkits. When there are resources to spend on an upgrade, the result is most typically a web interface, or sometimes even a TUI.

Maybe a hint?

Posted Jan 15, 2026 1:04 UTC (Thu) by rgmoore (✭ supporter ✭, #75) [Link] (53 responses)

Many of these applications are "finished", so there is little appetite to spend effort upgrading toolkits.

I think this gets at a very deep and fundamental divide that creates a lot of problems like this. Some people, especially people who develop languages and programming tools, see development as a never ending process: there is always more work to be done, new versions to be developed, etc. Other people are trying to build a tool to solve one specific problem, and once that problem is solved they see the program as finished with the exception of fixing any bugs that are discovered along the way. The people who see development as a never ending process will eventually get to the point they want to make backwards incompatible changes, which strands the people who depended on their language or tools and who considered their projects to be finished.

It's a tough problem, because either way you're trying to force somebody into doing programming work they don't want to do and didn't sign up for. Either tool and language makers are forced to continue supporting old versions they think are obsolete, or programmers are forced to constantly update programs for no real benefit to their program's functionality.

Maybe a hint?

Posted Jan 15, 2026 1:40 UTC (Thu) by pizza (subscriber, #46) [Link]

> The people who see development as a never ending process [....]

...are probably being paid (or otherwise getting some tangible benefits) for their efforts.

Whereas the ones that see their applications as "finished" (ie "solves the problem it was written for") do not.

Maybe a hint?

Posted Jan 15, 2026 3:49 UTC (Thu) by pabs (subscriber, #43278) [Link] (33 responses)

I wonder why people without lots of resources for maintenance add dependencies on projects with a reputation for fast development and backwards incompatibility in the first place.

There are better options for them like using slower-moving/finished toolkits, writing minimal custom toolkits, TUIs, CLIs or just making libraries and leaving UIs to separate projects.

Maybe a hint?

Posted Jan 15, 2026 4:16 UTC (Thu) by dskoll (subscriber, #1630) [Link] (2 responses)

Sometimes people pick what they know, or what they want to learn, or what all the cool kids are using. And they don't think beyond that.

I have some GUI programs written in Tcl/Tk, which almost nobody uses nowadays and which people generally scoff at... but my programs written in 2002 still work perfectly with modern Tcl/Tk.

Maybe a hint?

Posted Jan 15, 2026 7:02 UTC (Thu) by pabs (subscriber, #43278) [Link]

I love that `git gui` and `gitk` are written in Tcl/Tk because it means they are very unlikely to go away.

Tcl/Tk

Posted Jan 25, 2026 21:30 UTC (Sun) by pschneider1968 (guest, #178654) [Link]

FWIW, Tcl/Tk is still used heavily in the Scid Chess DB software (see https://scid.sourceforge.net/ ) which I use on an almost daily basis to enter, analyze and maintain my chess games. It's a great piece of software, and the most viable FOSS alternative to the proprietary and costly Chessbase.

Maybe a hint?

Posted Jan 15, 2026 12:20 UTC (Thu) by pizza (subscriber, #46) [Link] (3 responses)

>I wonder why people without lots of resources for maintenance add dependencies on projects with a reputation for fast development and backwards incompatibility in the first place.

Uh, GTK2's (directly supported) lifecycle was just shy of two decades. Actively-supported GTK3 is well over a decade into its lifecycle, and GTK4 is about five years in.

If that is "fast development" then I'd hate to see what you'd call its more "modern" contemporaries.

Maybe a hint?

Posted Jan 16, 2026 13:20 UTC (Fri) by wtarreau (subscriber, #51152) [Link] (2 responses)

Maybe what's missing is a "gtk2-on-3" wrapper that presents gtk2 API and semantics from gtk3 ?

GTK-2-on-3 wrapper

Posted Jan 17, 2026 8:33 UTC (Sat) by gioele (subscriber, #61675) [Link] (1 response)

> Maybe what's missing is a "gtk2-on-3" wrapper that presents gtk2 API and semantics from gtk3 ?

That's kind of hard to do because GTK 2 and GTK 3 are paradigmatically different. It is not just a matter of updating a couple of functions from a deprecated API to a new one. For example the base "class" GtkObject is gone and pretty much all APIs related to custom widgets or surface drawing have been removed and replaced by Cairo.

And even if a GTK 2 to GTK 3 wrapper were possible, a GTK 3 to GTK 4 one is categorically impossible (or possible only in very limited cases) because the API for the main event loop has changed. (And all the styling is fundamentally different, so an application would look seriously broken with widgets all over the place.)

A GTK 2-to-3 wrapper would only buy us a few years before GTK 3 is fully abandoned upstream and we are back to the same discussion.

Compatibility layers

Posted Jan 18, 2026 9:39 UTC (Sun) by swilmet (subscriber, #98424) [Link]

The GTK project removes *all* deprecated API in the next major version. So to ease the port, compatibility layers could be created for some features.

For example GtkTreeView (tree and list widget, quite a complex beast) is deprecated in GTK 4, so it'll go away in GTK 5. GtkTreeView could be maintained in a separate module to still provide that API for GTK 5.

The same for GtkUIManager (deprecated in GTK 3 and so removed from GTK 4). It could be maintained for GTK 4 as a compatibility layer.

Perhaps it's time to create a gtk-legacy module. Classic desktop environments (Cinnamon and its X-apps, or MATE) would benefit from it.

Another, complementary way would be a gtk-future module, back-porting some new APIs to a previous major version.

Maybe a hint?

Posted Jan 15, 2026 14:03 UTC (Thu) by epa (subscriber, #39769) [Link]

Back in the day GTK 2 was arguably the standard user interface toolkit in many Linux environments (broadly speaking those aligned with GNOME rather than KDE). It would have seemed like a reasonable choice if you wanted some kind of "enterprise" support, since Red Hat's own installer (Anaconda) used it, as well as many of the bundled desktop applications, even if the big names like Firefox and Libreoffice went their own way.

As I recall it the GTK 3 release was accompanied by a big announcement that it would guarantee backwards compatibility, though only in the future, of course.

Maybe a hint?

Posted Jan 15, 2026 18:38 UTC (Thu) by rgmoore (✭ supporter ✭, #75) [Link] (24 responses)

I think the same attitude that encourages people to think about projects as finished means they don't think very hard about long-term maintainability. Their goal is to write a program that achieves their goal, and then they'll move on to the next project. They're going to pick whatever language or tool makes the job of writing their program easy, and they never think about where they're going to be in a year, much less 10.

One could just as easily ask why someone developing an environment for other people to use is so careless with those people's time. Every time they deprecate an API because it's too hard to maintain, they're pushing the work of adapting to the new API onto their downstream. The effort required by all their downstream users adapting to the new API is probably many times greater than the effort required to maintain it, so it's a huge net loss. There may be some times when breaking an API is still necessary- I think the change in strings between Python 2 and 3 is probably a good example- but language and library writers need to think long and hard before making breaking changes.

Maybe a hint?

Posted Jan 15, 2026 18:58 UTC (Thu) by pizza (subscriber, #46) [Link] (6 responses)

> I think the same attitude that encourages people to think about projects as finished means they don't think very hard about long-term maintainability.

Or perhaps one should see this from a different perspective -- "I wrote this software to solve an immediate problem that I had, and I'm providing it AS-IS WITH NO WARRANTY WHATSOEVER. Anyone expecting or demanding that I continually update or otherwise maintain this software until the day I die can perform an anatomically improbable act with themselves."

> They're going to pick whatever language or tool makes the job of writing their program easy, and they never think about where they're going to be in a year, much less 10.

....welcome to human nature?

Maybe a hint?

Posted Jan 15, 2026 23:18 UTC (Thu) by rgmoore (✭ supporter ✭, #75) [Link] (5 responses)

I don't mean to sound overly critical! I've written software that is still in use long, long after I ever expected it to remain in service, and it's 100% luck that I wrote it in a language that hasn't required endless maintenance since then. I'm definitely a member of the crew who wants to write it and forget it. I wish that we could stop the constant churn of underlying technologies that drive the development churn. I understand that we can't stop this stuff completely- we do learn things that need to be incorporated into language and tool design- but there seems to be an awful lot that is driven by software companies' desire to sell new licenses rather than anything that genuinely needs updating.

Maybe a hint?

Posted Jan 18, 2026 9:50 UTC (Sun) by swilmet (subscriber, #98424) [Link] (4 responses)

In GTK's case, the deprecations/removals happen because there are just a handful of GTK core developers/maintainers. They cannot reasonably maintain all APIs forever, which is understandable.

However, as noted in an earlier comment, "gtk-legacy" and "gtk-future" modules could be created to provide partial compatibility layers.

Maybe a hint?

Posted Jan 18, 2026 12:00 UTC (Sun) by pizza (subscriber, #46) [Link] (3 responses)

> In GTK's case, the deprecations/removals are because there are just a handful of GTK core developers/maintainers. They cannot reasonably maintain forever all APIs, which is understandable.

Absolutely.

>However as noted in an earlier comment, a "gtk-legacy" and "gtk-future" modules could be created to provide partial compatibility layers.

Maintained by whom, exactly? The same folks that "understandably" removed all of the old APIs because they don't have the resources to "reasonably maintain them forever"?

If that maintenance is by some other mythical entity, what stops them from simply maintaining GTK2 in its current state?

Maybe a hint?

Posted Jan 19, 2026 3:11 UTC (Mon) by swilmet (subscriber, #98424) [Link] (2 responses)

What I had in mind is that big projects like Linux Mint, MATE, Ardour, GIMP and others would coordinate some efforts to ease GTK usage outside what GTK maintainers are able to provide. What I've achieved for text editor projects is a few libgedit-* modules. For example libgedit-amtk for extending GTK 3 with an alternative API to create menus and toolbars (because GtkUIManager is deprecated in GTK 3).

Maybe a hint?

Posted Jan 19, 2026 17:08 UTC (Mon) by rgmoore (✭ supporter ✭, #75) [Link] (1 responses)

What I had in mind is that big projects like Linux Mint, MATE, Ardour, GIMP and others would coordinate some efforts to ease GTK usage outside what GTK maintainers are able to provide.

I don't know that those projects have the resources to do what you're suggesting. For example, GIMP is still in the process of fully transitioning from their old libraries to GEGL. They don't have spare developers to devote to maintaining old GTK; if they had more resources they'd want to devote them to speeding up development of their graphics stack rather than maintaining underlying libraries. I agree that maintaining those old libraries would be a worthwhile task, but I don't think projects that are already operating on a shoestring are the place to find the resources.

Maybe a hint?

Posted Jan 20, 2026 9:23 UTC (Tue) by swilmet (subscriber, #98424) [Link]

At this point we can say that probably almost all FLOSS desktop components are under-resourced, with rare exceptions (thinking about Qt, Blender, LibreOffice maybe?).

Maybe a hint?

Posted Jan 15, 2026 23:15 UTC (Thu) by Wol (subscriber, #4433) [Link] (15 responses)

> and they never think about where they're going to be in a year, much less 10.

Or maybe they HAVE thought about where they're going to be in a year, and the spec says "this software is feature-complete"?

I bang on about truth tables, but if you've done your truth table, and addressed every option, what else is there to do?

Cheers,
Wol

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 12:31 UTC (Fri) by farnz (subscriber, #17727) [Link] (8 responses)

The problem with the whole idea there is that you never have a complete and fully explicit specification for any software; there's always parts that are either implicit (and that you may not know about at all, because you meet them accidentally, like "needs to have an attractive colour scheme" when you have good taste in colours), and there are pieces that are time or environment dependent (like "doesn't cost too much in terms of low-end consumer connectivity", which in 1999 meant thinking about how to minimise the number of seconds spent online, and in 2025 means thinking about how to minimise the number of bytes used).

That makes it very hard to accurately declare yourself "feature-complete", because you don't have a complete and accurate spec, and that's a requirement to declare yourself complete.

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 13:59 UTC (Fri) by pizza (subscriber, #46) [Link] (7 responses)

> That makes it very hard to accurately declare yourself "feature-complete", because you don't have a complete and accurate spec, and that's a requirement to declare yourself complete.

You're over-thinking this, applying formal engineering principles to something that ... isn't.

The _actual_ spec involved: "It does what I need it to do"

Granted, there's usually "...without any problems too annoying for me to ignore" implicitly tacked on. [1]

With that in mind, yes, software is quite easily "finished" from the perspective of the person writing it.

...and as I so often point out here, if random other folks out there disagree, they can try to persuade the author to change their mind [2] [3] or fix the problem to their own satisfaction. Otherwise... <taps the sign>

[1] which in turn implies "..and/or worth the effort to fix"
[2] As opposed to "demand / berate / complain / threaten / ... "
[3] I've found that home-baked desserts with a side of cash do wonders.

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 15:49 UTC (Fri) by farnz (subscriber, #17727) [Link] (6 responses)

No, I'm saying that the formal engineering principle of "fully meets the specification, therefore it's complete and needs no further work" rarely, if ever, applies to software, because the specification is itself incomplete in the ways we're both describing.

Now, you can declare that you're not working on this any more (for any reason - not just because you think it's "good enough"), and that's separate. But it's extremely rare (I cannot think of software in this state, not even TeX meets it) for you to have a complete enough specification that you can declare the software "complete against the specification".

Separate to that is the problem with people neither being able (or, in some cases, willing) to fund maintainers doing "no change that I notice", nor to accept "this stops being maintained". At some point, something has to give - either people find ways to fund maintenance that meets their needs, or they have to accept that sometimes, good things come to an end. And in some ways, that's harder with Free software, since at least with proprietary software, the usual way for good things to come to an end was for the company to go bankrupt (with rare exceptions), and thus it's clear why you can't buy it any more, while with Free software, it's people walking away from the project.

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 16:20 UTC (Fri) by pizza (subscriber, #46) [Link] (5 responses)

> No, I'm saying that the formal engineering principle of "fully meets the specification, therefore it's complete and needs no further work" rarely, if ever, applies to software, because the specification is itself incomplete in the ways we're both describing.

Uh, the spec is, by definition, what determines if something has been "completed" or not.

...Whether or not the spec itself is "complete" (according to whom, exactly?) or can ever be considered truly "complete" is an entirely separate issue.

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 17:39 UTC (Fri) by farnz (subscriber, #17727) [Link] (4 responses)

No - if the specification is incomplete, then you cannot say if something is completed or not, since when the gaps in the specification are filled in, you may find out that something you thought was correct against the specification was, in fact, only correct against your assumptions.

A specification is complete if there are no gaps in the specification, where a relevant part of the system is not specified, but it's assumed that people's "best efforts" will result in the required outcome happening anyway. In other words, the specification is only complete if, for all things you care about, the specification says (either directly, or by reference) what's acceptable, rather than being silent on that point, and the only things that are unspecified are things where it does not matter to you.

It's very common for specifications to be incomplete - for example, the specification for rewiring a building I worked in did not say that if all switches are in the same position, power should either be consistently off, or consistently on, because there's a local norm that means that most electricians will do that anyway, as a side effect of it being considered part of doing a good job.

And this is a big deal with software, since we have relatively few reference documents that would provide most of a spec by reference; the rewire job brought in around 700 pages of electrical specification by reference to a standard, for example, and still missed something that the building owner cared about, because he didn't even realise that it was something that "should" be specified somewhere. As it happened, there was one set of switches where doing that was relatively challenging, and the contractor didn't bother - he chose to fill in the gap in the specification with his own preferences, and the building owner had to pay to have it redone, but this time with the gap in the specification filled in.

Problem with the concept of "feature-complete"

Posted Jan 16, 2026 20:22 UTC (Fri) by pizza (subscriber, #46) [Link] (3 responses)

> No - if the specification is incomplete, then you cannot say if something is completed or not.

I'm sorry, but a "complete specification" dwells in the realm of spherical cows, and as such is utterly irrelevant for things built in the real world.

Problem with the concept of "feature-complete"

Posted Jan 18, 2026 11:57 UTC (Sun) by farnz (subscriber, #17727) [Link] (2 responses)

That's exactly my point - if you're saying that this thing is "feature complete and will never need further maintenance", you're also implying that the specification against which you claim it's "feature complete and will never need further maintenance" is also itself complete and will never need updating.

And I think we'd be a lot better off as a wider software community if we stopped trying to pretend that there is such a thing as a software project that's "completed, will never need further work", and accepted that software that's not maintained is a liability.

Of course, that doesn't force anyone to maintain it - you can abandon anything for any reason - but it does mean that if you're using unmaintained software, you need to be aware that you're taking on that liability.

Problem with the concept of "feature-complete"

Posted Jan 18, 2026 12:32 UTC (Sun) by pizza (subscriber, #46) [Link] (1 responses)

> And I think we'd be a lot better off as a wider software community if we stopped trying to pretend that there is such a thing as a software project that's "completed, will never need further work", and accepted that software that's not maintained is a liability.

Ok, congratulations... by your definition nearly all software is "unmaintained" and is therefore a "liability". What's step two?

> Of course, that doesn't force anyone to maintain it - you can abandon anything for any reason - but it does mean that if you're using unmaintained software, you need to be aware that you're taking on that liability.

In other words... the "AS-IS, NO WARRANTY WHATSOEVER" status quo of nearly every software package ever?

Maintenance costs time and money. Taking on liability for something also comes with a price premium. Who's going to pay for those costs? Hint: It's going to be end-users that benefit from said software, not "the wider software community" (which will at best just pass any costs through)

Problem with the concept of "feature-complete"

Posted Jan 19, 2026 15:15 UTC (Mon) by farnz (subscriber, #17727) [Link]

Step 2 is either to arrange maintenance for the software you care about - whether you do it yourself, or pay someone to do it for you doesn't matter - or to accept that the software is at risk of having problems in the future, and you're going to have to address those problems when they come up one way or another. Addressing those problems, in turn, could be "I'll fix them when I know about them", but it could also be "if they hit, I'll accept my system being wiped - my fault for not maintaining things properly".

The key is to stop imagining that non-trivial software can practically be "finished and safe to use as-is forever". Either it's being maintained, and therefore the maintenance work will keep it in "safe to use" condition, or it's not being maintained, and you're at risk of finding a critical bug that means you can no longer safely use it.

It also changes how product liability law in the EU sees offers of free stuff (and is part of why early CRA drafts got things so badly wrong for Free and open source software). Gifting unfinished things is something that incurs liability in part to stop manufacturers having the bright idea of selling you just enough of the product that the free bit is useless without the paid-for bit, while gifting you the rest so it's not part of the paid-for bits. In contrast, it's understood by product liability law that gifting you something that's in what I considered working order at the time I gifted it does not put me on the hook for providing free maintenance into the future.

Maybe a hint?

Posted Jan 16, 2026 13:43 UTC (Fri) by koflerdavid (subscriber, #176408) [Link] (5 responses)

If the tool will be needed for longer than about five years (just ballparking that number) then you really need to consider the software lifecycle of your dependencies as well.

Maybe a hint?

Posted Jan 16, 2026 14:08 UTC (Fri) by pizza (subscriber, #46) [Link] (2 responses)

> If the tool will be needed for longer than about five years (just ballparking that number) then you really need to consider the software lifecycle of your dependencies as well.

The mistake you (and far too many others) keep making is by treating "some random person wrote some software and released it on the internet" as a binding commitment to perpetually support and maintain said software. Worse yet, do so in accordance with <formal software engineering process>. Even worse yet, do so for free.

Maybe a hint?

Posted Jan 22, 2026 10:31 UTC (Thu) by davidgerard (guest, #100304) [Link] (1 responses)

The case at hand is Red Hat, not a single volunteer FOSS developer.

Maybe a hint?

Posted Jan 22, 2026 11:02 UTC (Thu) by farnz (subscriber, #17727) [Link]

If it's Red Hat, then you should be able to negotiate longer support via your support contract with Red Hat.

Else, Red Hat is just another volunteer FOSS developer. They might be a big one, thanks to the money they make selling support contracts for FOSS, but unless you're paying them to fix things for you, they're a volunteer.

Maybe a hint?

Posted Jan 16, 2026 14:25 UTC (Fri) by pizza (subscriber, #46) [Link]

> you really need to consider the software lifecycle of your dependencies as well.

In the case of GTK2, that was *eighteen years*.

Goes to show you that no matter how long the support period for something is, it will never be long enough.

Maybe a hint?

Posted Jan 23, 2026 19:32 UTC (Fri) by rgmoore (✭ supporter ✭, #75) [Link]

The design lifespan and actual lifespan frequently don't line up very well. As the saying goes, there is nothing more permanent than a temporary solution. The quick and dirty solution turns out to be good enough and is never replaced with the well-defined, perfectly executed follow-up. We've all seen it happen, so we shouldn't be surprised when it happens again.

I think what this actually shows is that our intuition, or my intuition at least, about the lifecycle of dependencies is backward. My intuition is that big important projects need to focus on stable dependencies, while quick and dirty ones can use whatever is handy. In reality, big, well-maintained projects can afford to use less stable dependencies because they have the resources to deal with things changing under them. It's the quick and dirty projects that need to be built on the most stable base, because they're going to have to keep working without assistance.

Maybe a hint?

Posted Jan 16, 2026 17:23 UTC (Fri) by fraetor (subscriber, #161147) [Link]

Oftentimes I think it relates to how things are funded.

In research software (my area) the majority of funding is allocated to time bounded projects. This often leads to a new bit of software being written for a specific purpose, which will then be minimally maintained for the next several years. If it is still being used but becomes difficult to run, then a new project usually ends up forming to implement a new version, which often will change several things to reflect newer requirements.

This probably isn't helped by research organisations often being used to running very old code, as it is often needed to reproduce results from a previous study.

Maybe a hint?

Posted Jan 15, 2026 5:45 UTC (Thu) by felixfix (subscriber, #242) [Link]

> once that problem is solved they see the program as finished

Maybe it's a basic human instinct. I get annoyed by repetitive tasks like having to take time to eat, even though I do like eating. Every time I have to pee, it annoys me to waste the minute or so; didn't I do this just a few hours ago? Taking out the garbage, cleaning kitchen counters, vacuuming, dusting, buying groceries, all annoy me more for their repetitiveness than anything else. Even checking LWN once a day is annoying for a split second; I did this yesterday!

Maybe a hint?

Posted Jan 15, 2026 10:43 UTC (Thu) by lunaryorn (subscriber, #111088) [Link] (16 responses)

I don't think Gtk did breaking releases just for the sake of it.

Gtk 2 was released in 2002, and it's reasonable to assume that its entire architecture modeled contemporary hardware. Let's remember how computers looked back then: No touch input, no multi-DPI setups, no HiDPI, few plug and play setups, no compositing, etc. I don't think we even had hardware accelerated 2D drawing back then. That was a time when you had to reboot your computer to plug a new mouse.

From this perspective I find it amazing that Gtk 2 actually managed to hold out for twenty years.

It's perhaps also worth noting that, in the time between Gtk 2 and Gtk 4, Qt (which has orders of magnitude more resources to maintain backwards compatibility) did three breaking major releases. Qt 3 was released at the end of 2001, and now we're at Qt 6. And that doesn't even consider the vast amount of changes C++ has seen since 2002.

Maybe a hint?

Posted Jan 15, 2026 14:07 UTC (Thu) by geert (subscriber, #98403) [Link] (7 responses)

Sure Linux had hardware accelerated 2D drawing in 2002! FWIW, Precision Insight demoed hardware accelerated 3D in 1999.

I didn't dig too deep, but even in 1994, XFree86 supported hardware acceleration on graphics chipsets from eight vendors:
https://www.nic.funet.fi/pub/linux/doc/html/install-guide...

Maybe a hint?

Posted Jan 15, 2026 14:43 UTC (Thu) by paulj (subscriber, #341) [Link]

There was hardware 3D before that, I think. John Carmack was working on Matrox G100 / G200 drivers for Linux in the late 90s (was that the start of Linux DRI? Or GLX? I don't remember the project(s)). 2D acceleration was definitely well established. The Matrox Mystique and Millennium, the S3 ViRGE, and some others had 2D accelerators.

Maybe a hint?

Posted Jan 15, 2026 14:51 UTC (Thu) by LionsPhil (subscriber, #121073) [Link] (5 responses)

Additionally, by 2002, USB was pretty mature for keyboards and mice.

A helpful snarky way to remember how old USB is is that the "plugging in a USB printer causing a BSOD on stage" Windows gaffe was showcasing new '98 features. :)

Maybe a hint?

Posted Jan 16, 2026 7:54 UTC (Fri) by lunaryorn (subscriber, #111088) [Link] (4 responses)

I believe the story on Linux is a bit different. According to Wikipedia Linux only got proper USB support with kernel 2.4 in 2001, and X.org didn't support hotplugging until much later (Xorg 1.4 in 2007). So presumably, when Gtk 2.0 was released in 2002, USB input was the hot new thing on Linux, and hotplugging was not a thing for years to come.

Maybe a hint?

Posted Jan 16, 2026 8:24 UTC (Fri) by mjg59 (subscriber, #23239) [Link] (3 responses)

That's a little misleading - support for USB input devices had landed in fairly early 2.2 (my recollection is that Linus got irritated with the existing rudimentary USB support and replaced it with something he wrote, though that is my recollection from over 25 years ago; I certainly had Linux running on USB-based Macs before 2.4 came out). And while hotplug wasn't directly supported by X, all mouse input was multiplexed into /dev/mice and all keyboard input went to the current terminal, so hotplugged devices would just about work as long as X was configured appropriately.

Maybe a hint?

Posted Jan 16, 2026 17:53 UTC (Fri) by rgmoore (✭ supporter ✭, #75) [Link] (2 responses)

I think the big thing that came with 2.4 was devfs, which made hotplug practical. In 2.2, /dev was static and had to be populated with every conceivable device when it was created, which wasn't really compatible with hotplug. With devfs, /dev became dynamic and could add new devices as they were connected. I certainly remember following 2.4 development very carefully because USB support was insufficient for ordinary desktop use. For example, I had to use my fancy new optical mouse through a USB-to-PS2 converter because USB support wasn't good enough.

Maybe a hint?

Posted Jan 19, 2026 10:27 UTC (Mon) by taladar (subscriber, #68407) [Link] (1 responses)

Devfs didn't last very long though before it was replaced by udev.

Maybe a hint?

Posted Jan 19, 2026 19:15 UTC (Mon) by Cyberax (✭ supporter ✭, #52523) [Link]

And eventually got replaced back by the new devfs :)

I actually used devfs way into 2010 on my own device (a RouterBoard) by porting devfs forward. With a minimal userspace, it was easier than udev.

Maybe a hint?

Posted Jan 15, 2026 20:32 UTC (Thu) by Cyberax (✭ supporter ✭, #52523) [Link]

The low level of GTK3 was mostly fine. It was the overall UI/UX experience of GNOME3 that went off the rails.

They threw away literally EVERYTHING and had this "dynamic activities" model: https://youtu.be/S0lIpCntwv8?t=309

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 17, 2026 19:42 UTC (Sat) by anton (subscriber, #25547) [Link] (6 responses)

Touch input: None of my devices have touch input. And why would adding touch input cause a backwards-incompatible change?

Multi-DPI setups: That certainly existed in 2002. In any case, why would it cause a backwards-incompatible change?

HiDPI: Why would that cause a backwards-incompatible change?

Plug and play was a buzzword in the 1990s, with PCI clearly having it and ISA gaining PnP features. By 2002, PnP was universally available. I don't see that this has any relevance to GTK. Maybe you are thinking of hotplugging, which has been available since at least USB (introduced 1996, and also widely available in 2002). Why would that cause a backwards-incompatible change?

Compositing: I don't know what that's about, but why would it cause a backwards-incompatible change?

2D acceleration became important with Windows in the 1990s, and in 1994 or so the 2D-accelerated S3 928 was the thing to have; in 1995 the Matrox Millennium was introduced and was the next thing to have (and I bought it). By the end of the 1990s, 3D acceleration became available, and I bought a Voodoo 3 3000 in 1998. By 2002, 3D acceleration was widely available. In any case, why would 2D or 3D acceleration cause a backwards-incompatible change to GTK?

Plugging in a mouse with a serial interface does not require rebooting. Plugging or unplugging a PS/2 mouse is supposedly safer if you power down the computer (we did some live unplugging and plugging and never had a hardware failure from that, though). Plugging or unplugging a USB mouse does not require rebooting. USB has been available since 1996.

And 32-bit colour, which you did not mention, was also available in 2002 (already the Matrox Millennium supported that).

What are reasons for backwards-incompatible changes?

  • You need to support something that does not fit in the existing interface, and where you cannot provide the existing interface in addition to the new one. Did this happen with GTK3 and GTK4? One would hope that they learn from that and make the interfaces sufficiently flexible that this does not happen again. One would hope that they have learned that lesson by GTK2.

    Contrasting example: The Linux kernel was released in 1991, when many of the hardware features you mentioned really were not there yet, yet there is no backwards-incompatible Linux2, Linux3, or Linux4 (admittedly, the removal of a.out in Linux 5.1 eliminated backwards compatibility with binaries created up to 1998).

  • You have a cool new idea for the interface to your clients, that you want to pursue, and you don't want to do the footwork to integrate it in a backwards-compatible way. From the descriptions I have read here, that's what happened with GTK3, and again with GTK4. That's ok, but then you need to be aware that you have now created an additional maintenance cost to someone: Either someone maintains the old and all the cool-new-idea interfaces, or the clients of your library will need to be rewritten for one of your cool new interfaces at some point, or these clients will become unusable (a cost to the users of these applications).
That being said, what is broken with GTK2 that some people want to eradicate it and all the applications that depend on it?

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 17, 2026 20:08 UTC (Sat) by pizza (subscriber, #46) [Link] (5 responses)

> Touch input: None of my devices have touch input. And why would adding touch input cause a backwards-incompatible change?

You are in a tiny, tiny, tiny minority.

But it's not that "touch input" is backwards-incompatible in and of itself, but rather that touch-oriented UIs need to be designed differently to be effective.

> Multi-DPI setups: That certainly existed in 2002. In any case, why would it cause a backwards-incompatible change?

No, the entire system ran at a single DPI. And $deity help you if you tried to change it from 96dpi.

> HiDPI: Why would that cause a backwards-incompatible change?

Because traditionally, the goal of higher resolution was to cram more onto the screen (making things smaller for a given screen size), whereas "HiDPI" is about keeping visual elements the same perceptual size on different resolution (but identically sized) screens.

The reason it's not "backwards compatible" is that this requires your UI layouts to be specified in resolution-independent units (as opposed to, say, "pixels").

> That being said, what is broken with GTK2 that some people want to eradicate it and all the applications that depend on it?

Other than the minor detail that many (most?) GTK2 applications directly rely on X11-isms... like having a fixed dpi for all UI elements, there's the more fundamental problem about how GTK2 has been unmaintained for over five years. It turns out that folks complaining aren't stepping up to maintain anything.

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 17, 2026 20:38 UTC (Sat) by Wol (subscriber, #4433) [Link]

> But it's not that "touch input" is backwards-incompatible in and of itself, but rather that touch-oriented UIs need to be designed differently to be effective.

And that many devices are now "touch only". Personally, I think that's awful, and my Windows computers are configured to disable touch if a mouse is present (which it almost always is). I wish I knew how to do that on Linux.

But at the end of the day, we now live in a world where "touch only" is the norm :-(

Cheers,
Wol

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 17, 2026 23:06 UTC (Sat) by anton (subscriber, #25547) [Link] (3 responses)

> You are in a tiny, tiny, tiny minority.

As demonstrated by the resounding success of Windows 8 (pre 8.1) designed for touch, right?

> But it's not that "touch input" is backwards-incompatible in and of itself, but rather that touch-oriented UIs need to be designed differently to be effective.

We have a lot of applications that are designed for being effective with a mouse, and are not going to be redesigned. They work for those who use a mouse. Why is there the drive to get rid of them?

> No, the entire system ran at a single DPI. And $deity help you if you tried to change it from 96dpi.

X11 fonts (bitmap fonts, not faces) come in 75dpi and 100dpi sizes (and that has already been so around 1990, i.e., long before GTK2), and I can tell every application instance which font it should use. And I certainly had 107 dpi on my laptop screen (1024x768 on a 12" screen) and ~10 dpi on the 120" screen that the projector displayed on. I did not need $deity, it all worked fine.

> "HiDPI" is about keeping visual elements the same perceptual size on different resolution (but identically sized) screens.
>
> The reason it's not "backwards compatible" is that this requires your UI layouts to be specified in resolution-independent units (as opposed to, say, "pixels").

Such units may be a good idea if you have a clean slate, but if you have a legacy of applications to support, then the idea of applying a scale factor to UI elements looks better to me (and I have seen options for setting such scale factors in various GUIs).

Of course the clean slate looks very desirable to programmers compared to the mess of dealing with backwards compatibility, but for a library it means abandoning your client base, and making it clear to all prospective clients that they had better steer clear of your library.

Experience tells us that GTK4 and Wayland will be abandoned when the next shiny cool idea comes around.

> there's the more fundamental problem about how GTK2 has been unmaintained for over five years.

And the problem with that is what? I use lots of software that has been unmaintained for over five years.

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 18, 2026 4:20 UTC (Sun) by pizza (subscriber, #46) [Link] (2 responses)

> As demonstrated by the resounding success of Windows 8 (pre 8.1) designed for touch, right?

We're up to Windows 11 now; the overwhelming majority of the devices sold with it have touch screens.

> We have a lot of applications that are designed for being effective with a mouse, and are not going to be redesigned. They work for those who use a mouse. Why is there the drive to get rid of them?

You mean besides the fact that the overwhelming majority of devices sold over the past decade lack any input mechanism other than a touchscreen, and that even if you plug a physical keyboard+mouse into one, they can't run those applications anyway?

> X11 fonts (bitmap fonts, not faces) come in 75dpi and 100dpi sizes (and that has already been so around 1990, i.e., long before GTK2), and I can tell every application instance which font it should use. And I certainly had 107 dpi on my laptop screen (1024x768 on a 12" screen) and ~10 dpi on the 120" screen that the projector displayed on. I did not need $deity, it all worked fine.

Your projector and laptop screen both claimed to be 96dpi as far as X and all of its applications are concerned. Change that at your own peril; nearly every X11-native application will break because non-font elements will not scale, resulting in text that either overflows or is truncated by its bounding box. (most notably in menu bars, single-line input forms and labels). Ironically the applications that don't horribly break bypass X11 font rendering (along with most other X11 primitives) entirely, relying instead on client-side rendering and just slinging the resultant pixmaps to the X server. (In other words, the same paradigm that Wayland is built around)

> Such units may be a good idea if you have a clean slate, but if you have a legacy of applications to support, then the idea of applying a scale factor to UI elements looks better to me (and I have seen options for setting such scale factors in various GUIs).

Congratulations, you just answered your own question about why toolkit APIs needed to be restructured. And again, there's currently no way to have X11 apply a blanket scale factor on an application-by-application (much less element-by-element) basis... because dpi is a global, immutable attribute.

> Experience tells us that GTK4 and Wayland will be abandoned when the next shiny cool idea comes around.

Experience tells us that GTK4 will be directly supported for at least a decade, and closer to two. And while Wayland won't last forever, it will remain relevant for at least the next two decades due to numerous long-support-lifecycle industries (e.g. automotive) basing their software stacks around it.

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 19, 2026 4:37 UTC (Mon) by jcelerier (guest, #181931) [Link] (1 responses)

> Your projector and laptop screen both claimed to be 96dpi as far as X and all of its applications are concerned. Change that at your own peril; nearly every X11-native application will break because non-font elements will not scale, resulting in text that either overflows or is truncated by its bounding box. (most notably in menu bars, single-line input forms and labels). Ironically the applications that don't horribly break bypass X11 font rendering (along with most other X11 primitives) entirely, relying instead on client-side rendering and just slinging the resultant pixmaps to the X server. (In other words, the same paradigm that Wayland is built around)

I've been using Xft.dpi to adjust DPI since 2014 and it's always worked for me, even for apps that AFAIK end up calling raw X11 drawing primitives (for instance PureData, which uses Tcl/Tk). Do you have examples of broken apps?

Why do some want to get rid of GTK2 and all applications that depend on it?

Posted Jan 19, 2026 12:47 UTC (Mon) by pizza (subscriber, #46) [Link]

> I've been using Xft.dpi to adjust DPI since 2014 and it's always worked for me,

Xft.dpi is purely for font scaling; it leaves the X server's core dpi setting unchanged. It also only affects TrueType (i.e., scalable) font rendering, not "classic" X11 fonts, which are fixed bitmaps.

See: https://unix.stackexchange.com/questions/596765/is-x-dpi-...
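For reference, the setting under discussion is an ordinary X resource; a minimal sketch (the 144 value here is an arbitrary example, not a recommendation):

```
! ~/.Xresources (sketch): Xft.dpi only scales client-side
! (fontconfig/Xft) font rendering; the X server's core DPI and
! classic bitmap fonts are unaffected.
Xft.dpi: 144
```

It is loaded for the session with `xrdb -merge ~/.Xresources`.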

Maybe a hint?

Posted Jan 15, 2026 9:37 UTC (Thu) by kleptog (subscriber, #1183) [Link] (2 responses)

> At least at my place of work, GTK2 is used for a lot of internal applications that nowadays would be written with web interfaces. Many of these applications are "finished", so there is little appetite to spend effort upgrading toolkits.

Honestly, these kinds of projects are better served by creating a Dockerfile that builds the project on the last Debian release where it worked and just shipping that. Then you have compatibility until the kernel interfaces change, which is a lot longer. For internal applications, security updates can only break things.

In theory, people could migrate all those old applications to something like Flatpak distribution, then the problem goes away.
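As a rough illustration of that approach, a minimal Dockerfile might look like this (a sketch only: "myapp" and the plain `make` build are hypothetical placeholders; Debian 11 "bullseye" is one release that still ships libgtk2.0-dev):

```dockerfile
# Build the GTK 2 application against an older Debian release
# where its toolchain and libraries were still packaged.
FROM debian:bullseye
RUN apt-get update && \
    apt-get install -y --no-install-recommends \
        build-essential pkg-config libgtk2.0-dev && \
    rm -rf /var/lib/apt/lists/*
COPY . /src
WORKDIR /src
RUN make
CMD ["./myapp"]
```

Running a GUI from the container then means forwarding the X socket, e.g. `docker run -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix myapp-image`.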

Maybe a hint?

Posted Jan 15, 2026 16:05 UTC (Thu) by LtWorf (subscriber, #124958) [Link] (1 responses)

That only works as long as your Wayland compositor remains X11-compatible, which will probably be a much shorter time.

Also the kernel breaking compatibility does happen.

Maybe a hint?

Posted Jan 15, 2026 17:32 UTC (Thu) by Tarnyko (subscriber, #90061) [Link]

Fair statement, but I doubt Xwayland is leaving anytime soon.

As was already said, such an app would benefit from a proper .AppImage version; it should work everywhere if done correctly.

Maybe a hint?

Posted Jan 14, 2026 18:00 UTC (Wed) by LionsPhil (subscriber, #121073) [Link] (1 responses)

GTK3 and onward made fairly significant design pivots in places, IIRC. To the project's credit, there is documentation covering both migrations, but you can see it is not a "bump the version number and swat at any deprecation warnings" kind of weekend task if your application did anything nontrivial:

https://docs.gtk.org/gtk3/migrating-2to3.html
https://docs.gtk.org/gtk4/migrating-3to4.html

...along with that, it's more or less impossible *not* to change the UX in the process. Not just from theme breakage (that ship's largely sailed, unfortunately), but from UI patterns (like how the modern GTK/GNOME world likes putting primary buttons in titlebars, and will start doing so if you use the stock file chooser).

Maybe a hint?

Posted Jan 17, 2026 1:20 UTC (Sat) by mirabilos (subscriber, #84359) [Link]

My window manager does not even *have* title bars…

… and Gtk+3 programs are so ugly, compared to even Gtk+2, which I already was no fan of.

Maybe a hint?

Posted Jan 14, 2026 17:00 UTC (Wed) by smoogen (subscriber, #97) [Link] (25 responses)

It always seems to me that it takes a lot longer to port most GTK apps than it does Qt applications. I don't know why, and I don't know if there has been any sort of serious study of that.

Maybe a hint?

Posted Jan 14, 2026 19:11 UTC (Wed) by mussell (subscriber, #170320) [Link] (8 responses)

Distros don't package old versions of Qt since it's too tightly interconnected, so any Qt application needs to either use the latest Qt or risk being removed from distros. The Gtk/GObject system is less tightly connected and more orthogonal; Gtk 2 can be built against a recent glib, for example. That's why distros are/were still packaging Gtk 2 for the last 20 or so years, but only one of the four major Qt releases from that same time span; and if distros are/were still going to package Gtk 2, then there's no pressure for applications to switch.

Gtk 2 applications are now just getting the same treatment as Qt applications, upgrade or Debian won't package you.

Maybe a hint?

Posted Jan 14, 2026 20:25 UTC (Wed) by AdamW (subscriber, #48457) [Link]

"Distros don't package old versions of Qt since it's too tightly interconnected"

Sure we do. Fedora still has Qt 5.

Maybe a hint?

Posted Jan 14, 2026 20:54 UTC (Wed) by tux3 (subscriber, #101245) [Link] (5 responses)

It's probably also the sheer amount of stuff in Qt/KDE. I wouldn't be surprised if there's an order of magnitude more code in the Qt/KDE frameworks than in GTK (counting the libraries, not the respective desktop environments). Shipping old libraries is all fine as long as upstream is active or when it's only a small maintenance burden.

On the other hand, Qt upgrades have been relatively painless in my experience. They don't tend to make major redesigns that force you to significantly change your UI or rewrite parts of your app. QML/Qt Quick is 15 years old and that's what they recommend for new projects. But you can still port and build your old QtWidgets app with new frameworks, that's also fine.

If Debian only ships Qt5 and Qt6, it's not like applications should really feel trapped on Qt4. The migration is tedious, but for most apps it's mostly mechanical work. It doesn't take all that much pressure in the first place.

Of course, if you look at complex projects like GIMP or Krita, the migration will be much more difficult. But even in more complex applications, the trend seems to be borne out. (I expect Krita on Qt6 long before GIMP on GTK4, despite both toolkits releasing in the same month.)

Maybe a hint?

Posted Jan 14, 2026 21:48 UTC (Wed) by mb (subscriber, #50428) [Link] (3 responses)

>but for most apps it's mostly mechanical work

It's an absolute horror for any Python based Qt5 application.

Maybe a hint?

Posted Jan 15, 2026 12:32 UTC (Thu) by swilmet (subscriber, #98424) [Link] (2 responses)

It's because of Python being an interpreted language; it's not Qt's fault.

Maybe a hint?

Posted Jan 15, 2026 15:56 UTC (Thu) by mb (subscriber, #50428) [Link] (1 responses)

Well, I was neither talking about whose "fault" this was, nor am I interested in determining this.

But Qt massively changed the API (mainly all enums) and Python did change nothing.

It's a fact that porting Qt5 Python apps to Qt6 is hard, to the point where I seriously consider not doing it and deprecating or rewriting the applications instead (or leaving it to somebody else).
It's far from being just "mechanical work".

Maybe a hint?

Posted Jan 15, 2026 21:11 UTC (Thu) by swilmet (subscriber, #98424) [Link]

In that case a good suggestion that I have, but that is not always applicable (it depends on the architecture of the app), is to first port the "leaves" classes and utility functions (and test those independently of the whole app). "Leaves" as in depending on nothing else from the same project.

Then walk up the "tree" (it's actually a DAG, or almost a DAG) of dependencies to gradually port the project, piece by piece, with tests (either unit tests or interactive tests as mini apps). Until the main function which is the top of the DAG.

But not all projects have this kind of architecture. If a class towards the leaves depends on a class towards the top, you're out of luck (it can lead to a spaghetti architecture, in that case).

BTW GTK has those wonderful functions such as gtk_widget_get_parent() (returns the container) and g_application_get_default() (returns the singleton object containing the whole app). These must be used with great care, because it can easily lead to a spaghetti architecture (it's just a downcast away from being able to access every other class of the app). So some discipline is needed.

Maybe a hint?

Posted Jan 15, 2026 9:30 UTC (Thu) by taladar (subscriber, #68407) [Link]

Qt4 to Qt5 basically abandoned widgets (which are still there but basically unmaintained in Qt5) in favor of QML.

Maybe a hint?

Posted Jan 14, 2026 21:02 UTC (Wed) by plugwash (subscriber, #29694) [Link]

> Distros don't package old versions of Qt since it's too tightly interconnected

I can't speak for other distros, but Debian has packaged multiple versions of Qt for most of the last decade.

* 2013 - Qt 5 added.
* 2020 - Qt 4 removed.
* 2022 - Qt 6 added.

Maybe a hint?

Posted Jan 14, 2026 19:19 UTC (Wed) by ebee_matteo (guest, #165284) [Link] (15 responses)

Qt has large commercial backing, e.g. mobile, automotive, embedded, etc.

These are paying customers that do not want to see backwards incompatibilities unless strictly necessary.

Gtk+ is community-led. It has some backing from one big "customer" (IBM), but apart from that, it doesn't have the same constraints.

Maybe a hint?

Posted Jan 14, 2026 21:07 UTC (Wed) by pizza (subscriber, #46) [Link]

> Gtk+ is community led. It has some backing from one big "customer" (IBM), but apart from it, it doesn't have the same constraints.

In this context, "resources" would be a better word to use than "constraints".

At any point over the past 20 or so years, I'd bet there were about two orders of magnitude more folks being paid to work on Qt than on GTK. Heck, Qt alone probably had more developers than the entire GTK ecosystem (i.e. GNOME + applications) put together.

(FWIW, I've found Qt quite pleasant to use, and I say that as someone who generally loathes C++)

Maybe a hint?

Posted Jan 17, 2026 1:21 UTC (Sat) by mirabilos (subscriber, #84359) [Link] (13 responses)

It’s GNOME-led, not really community.

Maybe a hint?

Posted Jan 17, 2026 2:53 UTC (Sat) by pizza (subscriber, #46) [Link] (12 responses)

> It’s GNOME-led, not really community.

So what else do you call "a loose collection of mostly volunteers working together to create something"?

Maybe a hint?

Posted Jan 17, 2026 13:12 UTC (Sat) by Wol (subscriber, #4433) [Link] (11 responses)

"Community led" implies a GTK+ community, doing what's best for GTK.

If it's the Gnome community doing what's best for Gnome (as appears to be the case), then I wouldn't call GTK+ a community-led project.

Don't forget, the G in GTK stands for Gimp, not Gnome, so we already have a project that's had its founders left behind in a hi-jack ...

(Don't get me wrong, that's the way FLOSS works and it's good, but GTK probably isn't even an independent project!)

Cheers,
Wol

Maybe a hint?

Posted Jan 17, 2026 13:20 UTC (Sat) by pizza (subscriber, #46) [Link] (10 responses)

> "Community led" implies a GTK+ community, doing what's best for GTK.
> If it's the Gnome community doing what's best for Gnome (as appears to be the case), then I wouldn't call GTK+ a community-led project.

So GTK isn't a "community project" because it's being run by... a different community?

Come on.

Maybe a hint?

Posted Jan 17, 2026 18:47 UTC (Sat) by Wol (subscriber, #4433) [Link]

No. It's not "community led" because it's not being led by a team doing what's best for GTK.

It's like saying a village is in charge of its own destiny, because it votes a councillor onto the local Rural District Council.

Simply put: is your own destiny in your own hands? As far as GTK is concerned, it isn't. The people working on it do what's best for Gnome.

Cheers,
Wol

Maybe a hint?

Posted Jan 17, 2026 19:21 UTC (Sat) by Wol (subscriber, #4433) [Link] (8 responses)

Re-reading your comment, I think it's better to say "GTK isn't community-led because there isn't a GTK community".

Cheers,
Wol

Maybe a hint?

Posted Jan 17, 2026 19:50 UTC (Sat) by pizza (subscriber, #46) [Link] (7 responses)

> Re-reading your comment, I think it's better to say "GTK isn't community-led because there isn't a GTK community".

That's better, but still succumbs to the No True Scotsman fallacy.

What is "the gtk community" if not the folks that actually develop and use it? Should GNOME not take an interest in its upstreams?

GTK is a "community project" as opposed to a "commercial/corporate project" à la Qt. Or a "personal project", for that matter. The fact that GTK development has mostly been driven by GNOME (which is itself another "community project") doesn't invalidate that basic fact.

Maybe a hint?

Posted Jan 17, 2026 20:33 UTC (Sat) by Wol (subscriber, #4433) [Link] (6 responses)

> Should GNOME not take an interest in its upstreams?

IS GTK a Gnome upstream? Or is it just a Gnome component?

It started out as a Gimp component. Then it got taken over as a Gnome component (as evidenced by the fact that Gimp stayed on "legacy" GTK long after GTK had moved on with Gnome). Has it now struck out on its own?

Cheers,
Wol

Maybe a hint?

Posted Jan 18, 2026 13:24 UTC (Sun) by pizza (subscriber, #46) [Link] (5 responses)

> IS GTK a Gnome upstream? Or is it just a Gnome component?

That's a distinction without a difference, as GTK is indisputably maintained by "its community".

Maybe a hint?

Posted Jan 18, 2026 19:19 UTC (Sun) by mirabilos (subscriber, #84359) [Link] (4 responses)

It very much is NOT maintained by the Gtk+ community, it’s maintained by GNOME, who have hostile-taken-over Gtk+.

Maybe a hint?

Posted Jan 18, 2026 19:36 UTC (Sun) by Wol (subscriber, #4433) [Link]

> who have hostile-taken-over Gtk+.

I think you mean they've forked it (which is perfectly okay). But yes, they have also hijacked the name, which is less okay ...

Cheers,
Wol

Maybe a hint?

Posted Jan 18, 2026 22:28 UTC (Sun) by pizza (subscriber, #46) [Link] (2 responses)

> It very much is NOT maintained by the Gtk+ community, it’s maintained by GNOME, who have hostile-taken-over Gtk+.

Please, provide *any* citation that GTK was subject to a hostile takeover -- i.e. against the wishes of the folks who were nominally in charge of and maintaining it at the time. Even if what you say is true, and GTK was hostile-forked 2.5 decades ago (because that's the timeframe we're talking about here), what effing difference does it make in practical terms, when the alternative is... being effectively (if not actually) unmaintained? (Let's not pretend that GIMP has been swimming in developers all these years)

Because all you're doing is maligning the folks that have been maintaining one of the most commonly used (*nix) UI toolkits, for reasons that appear to be nothing more than "everything GNOME does is automatically doubleplus BAD".

Maybe a hint?

Posted Jan 18, 2026 22:44 UTC (Sun) by mirabilos (subscriber, #84359) [Link] (1 responses)

Lack of active maintenance is neither an indicator of obsolescence nor a reason to do a hostile takeover.

Maybe a hint?

Posted Jan 19, 2026 0:12 UTC (Mon) by pizza (subscriber, #46) [Link]

Again, I repeat myself:

Please, provide *any* citation that GTK was subject to a hostile takeover.

Maybe a hint?

Posted Jan 14, 2026 19:24 UTC (Wed) by pbonzini (subscriber, #60935) [Link] (1 responses)

> But if gtk3 was released in 2011 and Gimp itself, where all of this originated from, only adopted it 14 years later,

GIMP switched earlier than that (2.99.2 was released in 2020—see https://www.gimp.org/news/2020/11/06/gimp-2-99-2-released/); it's the stabilization of 3.0 that took ages, not specifically the transition to GTK 3. And GIMP probably has a more complex UI than most of the affected packages, as well as special input requirements.

Maybe a hint?

Posted Jan 16, 2026 13:33 UTC (Fri) by wtarreau (subscriber, #51152) [Link]

Interesting. But this might be related, at least in part, to the difficulty of porting to the new version: as you say, GIMP definitely has a more complex UI than most software, and the port could have unveiled a number of less-covered areas or regressions as well.

Maybe a hint?

Posted Jan 14, 2026 21:33 UTC (Wed) by NHO (subscriber, #104320) [Link] (2 responses)

I am not a very good programmer and have no GTK experience, but I tried to port a GTK 3 program, the pqiv image viewer (well, it has a compile-time chooser for GTK 2 and GTK 3), to GTK 4.

I dropped it after spending four hours trying to find out how to implement a file filter, so that it won't try to open unsupported files and will dispatch the right formats to the right backends. The filtering functionality was moved to a completely different object, with no comment in the old place.

Documentation is horrendous and nigh-useless. It's just a bunch of functions with descriptions of "what they do" but no examples of "how to do something". It presumes that people are already well-versed in GTK programming. And I have no idea if I implemented this correctly because, well, there's a lot of X11 code that needs to be dealt with, and my enthusiasm died.
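[For what it's worth, the filter object itself survived the transition; what mainly moved is where you attach it. With the GtkFileDialog API added in GTK 4.10, filters are handed over as a list model instead of being added to a chooser widget directly. A rough, untested sketch; the callback name is hypothetical:]

```c
/* Sketch: attaching a file filter in GTK 4 (requires GTK >= 4.10).
 * 'on_file_opened' is a hypothetical callback name for illustration. */
#include <gtk/gtk.h>

static void
show_open_dialog (GtkWindow *parent, GAsyncReadyCallback on_file_opened)
{
    GtkFileDialog *dialog = gtk_file_dialog_new ();

    /* GtkFileFilter itself carries over from GTK 3... */
    GtkFileFilter *filter = gtk_file_filter_new ();
    gtk_file_filter_set_name (filter, "Images");
    gtk_file_filter_add_mime_type (filter, "image/png");
    gtk_file_filter_add_suffix (filter, "jpg");   /* since GTK 4.4 */

    /* ...but it is now attached via a GListModel of filters. */
    GListStore *filters = g_list_store_new (GTK_TYPE_FILE_FILTER);
    g_list_store_append (filters, filter);
    gtk_file_dialog_set_filters (dialog, G_LIST_MODEL (filters));
    gtk_file_dialog_set_default_filter (dialog, filter);

    /* The dialog now runs asynchronously; the result arrives in the callback. */
    gtk_file_dialog_open (dialog, parent, NULL, on_file_opened, NULL);

    g_object_unref (filters);
    g_object_unref (filter);
    g_object_unref (dialog);
}
```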

Maybe a hint?

Posted Jan 15, 2026 12:44 UTC (Thu) by swilmet (subscriber, #98424) [Link] (1 responses)

The GTK learning curve is quite steep. But if you know GObject well, learning GTK is easier. The official GTK tutorial doesn't teach you GObject, though; newcomers need to figure out by themselves that they actually need (to learn) GObject.

I tried during the GTK 3 era to write a book:
https://github.com/gdev-technology/glib-gtk-learning

Nowadays I would like to finish and update the book, but it would be mostly unpaid work, so I prioritize other tasks…

Maybe a hint?

Posted Jan 16, 2026 19:05 UTC (Fri) by Tarnyko (subscriber, #90061) [Link]

Thanks, this is very interesting.
Although I am a sort of GLib/GTK+ "veteran", I am curious about how you phrased and organized the book.
I just grabbed it, the beginning looks promising.

Maybe a hint?

Posted Jan 14, 2026 23:36 UTC (Wed) by ebassi (subscriber, #54855) [Link]

> But if gtk3 was released in 2011 and Gimp itself, where all of this originated from, only adopted it 14 years later,

There was a branch of GIMP using GTK 3 API before GTK 3.0 was released. The GIMP team prioritised moving the internals of the application to a whole new engine, and getting it right, before focusing on the UI side of things. They also had their lives complicated by the fact that GIMP has a ton of Python plugins, which required porting from Python 2 to Python 3, and from static GTK bindings to dynamic ones built on introspection.

> So actually the problem might be the newer versions themselves if programs don't adopt them for that long.

Or that there are a lot of GTK-based applications that are write-once, and their maintenance is on a best effort basis; once they are written, nothing else except small bug fixing is of import, and those applications are supposed to run on systems that are equivalent to the ones on which they were developed. In 2011, those systems were: Linux, X11, non-HIDPI.

The fact that somebody is using a GTK2 application does not imply that the application is still being actively maintained, either; but, from a Debian perspective, they are exactly the same.

Maybe a hint?

Posted Jan 15, 2026 13:09 UTC (Thu) by swilmet (subscriber, #98424) [Link]

> So actually the problem might be the newer versions themselves if programs don't adopt them for that long.

As the maintainer of several GTK 3 apps (including gedit), I can say this is definitely true. GTK 4 has removed features without replacements, so I either need to re-implement some lower-level stuff or change the UI/UX.

For one app (Enter TeX), I don't want to change the classic menubar/toolbars UI, and with GTK 4 a classic menubar has fewer features. So for that app I'm stuck with GTK 3 (or I would need to create my own menubar widget, which is a bit silly if you ask me).

Anyway, most of the funding I received is from the Microsoft Store where GTK is bundled with the app, so I can still depend on GTK 3 "forever". (Similar to Ardour depending on GTK 2).

The usual situation

Posted Jan 15, 2026 4:04 UTC (Thu) by pabs (subscriber, #43278) [Link] (4 responses)

This situation of software being broken by newer versions of dependencies has come up a lot in Debian (and presumably every distro) over the years; the usual result is that the relevant packages just get removed, usually without any equivalent or better replacement, and everyone just has to deal with it. Personally I just have hundreds of non-updatable libraries and packages still installed, plus a text file listing the packages that I was forced to remove with no replacement. Sometimes there are adequate alternatives, or forks that continue maintenance of the obsolete projects, but they almost always stay out of distros.

The usual situation

Posted Jan 15, 2026 4:11 UTC (Thu) by dskoll (subscriber, #1630) [Link] (3 responses)

Yep. I run a few mailing lists on Mailman 2. That's unmaintained now in favour of Mailman 3, but I don't like Mailman 3. It's way more complicated and I don't want to invest the time needed to migrate my lists and check that everything's OK. Also, I run the Sendmail MTA (for legacy reasons...) and the Mailman 3 documentation states: "The core Mailman developers generally do not use Sendmail, so experience is limited. Any and all contributions are welcome! There is one such contribution on GitHub and another in comments at this issue."

Not too reassuring.

So for now, I install Mailman 2.x from source. But it requires Python 2.x and I'm dreading the day Debian drops that. 🙁

The usual situation

Posted Jan 15, 2026 4:34 UTC (Thu) by pabs (subscriber, #43278) [Link]

Mailman 2 has a maintained fork with Python 3 support btw:

https://github.com/jaredmauch/mailman2-python3

Unlikely any distro will allow it to be packaged though.

The usual situation

Posted Jan 15, 2026 4:37 UTC (Thu) by pabs (subscriber, #43278) [Link] (1 responses)

BTW, I started a project to archive all Mailman 2 mailing list archives; are your lists on the wiki page yet?

https://wiki.archiveteam.org/index.php/Mailman2

The usual situation

Posted Jan 15, 2026 15:29 UTC (Thu) by dskoll (subscriber, #1630) [Link]

Yes; my list info is added for one site, but not another. I will add the second site.

grpn

Posted Jan 15, 2026 9:56 UTC (Thu) by geert (subscriber, #98403) [Link]

I tried "sudo apt remove libgtk2.0-0t64:amd64" to find out which important programs I use would go away with it, and it came up with "grpn".
Hence I opened an issue https://github.com/utopiabound/grpn/issues/22

However, "apt search rpn" just taught me about "orpie", which is terminal-based, and thus doesn't suffer from the Downfall of the X11 Civilization, great! ;-)

So like Linux

Posted Jan 18, 2026 6:19 UTC (Sun) by jmalcolm (subscriber, #8876) [Link] (1 responses)

I feel this highlights a key difference between Linux and other environments and especially the attitude towards ABI stability.

On the one hand, it is very reasonable to remove GTK2. It is ancient and other packages distributed as part of Debian are not using it.

But are we really saying that the only software that matters is the software distributed as part of the OS? The idea that the operating system and the software that runs on it are a single collection of curated software is very much a Linux innovation. I do not remember other operating systems with package managers and this kind of coupling.

On Windows, we would expect to have little knowledge of the software that exists out in the wild. If Windows cannot run Windows software, the operating system gets blamed. And so ABI stability and backwards compatibility are a big thing on Windows.

In concept, Open Source software lives forever. And as long as there are users and developers interested in the software, it continues to evolve. As such, it is not such a big deal to adapt it to things like UI frameworks as things change. It is hopefully a small amount of the work (though perhaps GIMP would disagree).

And commercial software for sale could work similarly. But a lot of proprietary software, perhaps most of it, does not work like that. Software written by businesses for their own use is often created in a dedicated project. Once the software is "complete", the team may disband. The software may never see significant updates. If the software does not need new features, there may be nobody (and no budget) to do a UI rewrite 5 years later. If the GUI libraries disappear, the program will break.

On Windows, the operating system only ships with the basic layers of UI support. A Windows application using MFC libraries will likely ship those DLLs itself. This means that all Windows has to do to keep hosting this application for decades is to keep Win32 stable. On Linux, we expect the distro to provide almost every dependency we need. And then we expect the distro to continue supporting these over long periods.

A solution to this problem could easily be OCI containers. Debian 14 may not include GTK2, but Debian 13 still does. And since containers only rely on the much more stable Linux kernel ABI, I predict that a Debian 13 container will still run just fine on Debian 16 and probably Debian 20.
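As a sketch of that idea (the application name is a placeholder, and this assumes the host is willing to share its X11 socket and that the package survives into Debian 13):

```dockerfile
# Hypothetical: freeze a GTK 2 application on a Debian 13 ("trixie") userland.
FROM debian:trixie
RUN apt-get update \
 && apt-get install -y --no-install-recommends libgtk2.0-0t64 some-gtk2-app \
 && rm -rf /var/lib/apt/lists/*
CMD ["some-gtk2-app"]
```

Something like `docker run --rm -e DISPLAY -v /tmp/.X11-unix:/tmp/.X11-unix <image>` would then display the application on the host; since the container depends only on the kernel ABI, the frozen GTK 2 stack inside keeps working as the host distribution moves on.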

And some distros offer extremely long support periods. RHEL offers 10 years of support. Ubuntu now offers 15! If you pick an Ubuntu LTS as the base for your enterprise container, you can get this length of support on any distro. And the container will continue to work even after this support ends.

So not every distro needs to maintain everything forever. One of the nice things about Linux I think is that we can eventually leave some of the cruft behind.

So like Linux

Posted Jan 19, 2026 14:39 UTC (Mon) by jond (subscriber, #37669) [Link]

> It is ancient and other packages distributed as part of Debian are not using it.

150 packages in Debian are using it, and will be forcibly removed as well.

> On Windows, the operating system only ships with the basic layers of UI support. A Windows application using MFC libraries will likely ship those DLLs itself. This means that all Windows has to do to keep hosting this application for decades is to keep Win32 stable. On Linux, we expect the distro to provide almost every dependency we need. And then we expect the distro to continue supporting these over long periods.

I think Windows, by analogy, is useful -- I've considered gtk2 to be analogous to win32 -- but your comparison here is more subtle: perhaps Xlib would be the win32 comparator, and gtk2 one of the MFC libraries. I am not familiar enough with Windows to know what their support story has been for MFC.

> A solution to this problem could easily be OCI containers.

Or some other container-like solution, such as Flatpak.

> And some distros offer extremely long support periods. RHEL offers 10 years of support. Ubuntu now offers 15! If you pick an Ubuntu LTS as the base for your enterprise container, you can get this length of support on any distro. And the container will continue to work even after this support ends.

Ultimately *someone* has to do the supporting. But we could perhaps consolidate and have fewer distributions repeating the same work. The work involved in maintaining gtk2 into the future is hard to predict, but someone else did an analysis of security issues (just one of the maintenance issues, of course) and it was relatively low. (Contrast with Qt 4, where part of the problem is that the API is much larger and IIRC embeds a browser.)

I think this raises really interesting questions about what a distribution should be providing nowadays. I personally think we are still far too package-centric. 25 years ago, repackaging and distributing software was a much more valuable job than it is today, where (with e.g. the js ecosystem) it's akin to boiling the ocean. I feel the value proposition of distros is elsewhere: in particular, in their values and their communities.

GNOME policy pushed into GTK3 and newer

Posted Jan 25, 2026 14:40 UTC (Sun) by N0NB (guest, #3407) [Link] (6 responses)

Some years back I took over maintenance of a favorite niche application (amateur radio). It is a C program written against GTK2. Its UI is relatively simple, mainly one screen with a handful of popup windows. I naively set about porting it to GTK3 and learned a number of things.

The most obvious one is that GTK3 was no longer a general-purpose GUI toolkit, as elements such as icons in menus were deprecated and difficult to retain in the app's UI. The reason I wanted to keep icons in the menus is that the app, while being multi-lingual thanks to GNU gettext, is obviously used in many parts of the world, and retaining the UX of the GTK2-based version would ease the transition for users who didn't have a native-language version. Apparently GNOME policy is to not have icons in drop-down menus, which is fine, but should that decision have been pushed down into GTK3? If the intention was for GTK3 and newer to only be used to write GNOME applications, then fine, but it seems as though GTK3/4 are still being presented as general-purpose UI toolkits. I'm just asking for honesty regarding the policy layer.

Apparently the removal of icons from menus is not a universal trend in UI design as the Qt apps I use still have them.

As a user of several amateur radio applications packaged by Debian, I am a bit concerned for the future. The main logging program I use, CQRlog, is GTK2 based and I don't see any indication that upstream is going to change. An open bug on the CQRlog GitHub issue tracker has been present since mid 2019 with no comment.

In an ideal world GTK3 would be forked and restored to being a general purpose toolkit by reverting GNOME policy decisions that have been pushed into the toolkit. Of course, I'm unable to do anything of the sort primarily due to limitations of skill and experience.

GNOME policy pushed into GTK3 and newer

Posted Jan 25, 2026 22:38 UTC (Sun) by pizza (subscriber, #46) [Link]

> Some years back I took over maintenance of a favorite niche application (amateur radio). It is a C program written against GTK2. Its UI is relatively simple, mainly one screen with a handful of popup windows. I naively set about porting it to GTK3 and learned a number of things.

I'm in a similar boat; I've effectively inherited a GIMP 2.x plugin that relies on a pile of custom GTK2 widgets.

Unfortunately it can't be built against GIMP 3.x without first rewriting its entire UI library to use GTK3, which is far more work than I can justify. And that's _before_ the other GIMP2->GIMP3 porting can meaningfully begin.

> If the intention was for GTK3 and newer to only be used to write GNOME applications, then fine, but it seems as though GTK3/4 are still being presented as general purpose UI toolkits. I'm just asking for honesty regarding the policy layer.

GNOME has pushed more and more GNOME-specific stuff (i.e. widgets implementing specific policies and the GNOME vision) into separate libraries. Which gets them even more hate mail.

> In an ideal world GTK3 would be forked and restored to being a general purpose toolkit

At the end of the day, GTK (like most F/OSS) is a do-ocracy, so it's no surprise that the work that GNOME puts into it reflects GNOME's priorities. (FWIW, I remember similar complaints were made about GTK 2.0 versus GTK 1.2)

...Ironically, each major release of GTK is more agnostic wrt the underlying platform (not just GNOME but also the OS itself) than its predecessor. It's perfectly possible to recreate a GNOME2-era UI with GTK3/4, but someone has to do that work... and that someone does not exist in this very-non-ideal world.

GNOME policy pushed into GTK3 and newer

Posted Jan 25, 2026 23:03 UTC (Sun) by mbunkus (subscriber, #87248) [Link] (4 responses)

Quite a while ago there was a talk by Dirk Hohndel & Linus Torvalds about how and why they decided to port their divelog application, Subsurface, from GTK to Qt. It's on YouTube. Some of the reasons they pointed out were deprecations & removals, a rather unhelpful/unresponsive community, and a bad cross-platform compatibility story. Some of that mirrors what you wrote about. It was a pretty interesting talk, worth watching.

I recommend you look into porting your application, too. If it's a simple GUI based mostly, if not exclusively, on standard elements without custom styling requirements, getting it up & running with Qt should be pretty easy (context: I ported MKVToolNix from wxWidgets to Qt quite a while ago and never regretted it). Qt has a much better story around feature removal. They do deprecate & remove stuff, but at a much slower pace, with far fewer surprise mass changes. For example, I pretty much did the port from Qt 5 to supporting compilation with both Qt 5 & Qt 6 in a handful of hours on a rainy Sunday. The amount of pre-processor shenanigans required was minimal.

GNOME policy pushed into GTK3 and newer

Posted Jan 31, 2026 13:31 UTC (Sat) by N0NB (guest, #3407) [Link] (3 responses)

Porting to Qt would require that the application be ported to some flavor of C++, no? My understanding is that Qt doesn't have a C interface. If so, it would seem that my choices are to rewrite in something else, thus likely completely changing the flavor of the app, or let it die. It seems things are already well down the path of the latter choice.

GNOME policy pushed into GTK3 and newer

Posted Jan 31, 2026 16:07 UTC (Sat) by mbunkus (subscriber, #87248) [Link] (2 responses)

Well, partially. The parts that directly do things with the UI. However, you can mix C and C++ in the same program/project (I've done this for years for… reasons). Or compile your C stuff as a static library that you link in. This is viable if you have big parts of the code that don't directly touch the UI, e.g. network stuff, calculations, data shuffling, etc.

You can further reduce the amount of C++ you have to write by using Qt's built-in Qt Quick library with the QML markup language. With these two you can design standard UI things such as main windows, menus, dialogs, all the usual default widgets (buttons, list/tree views, text, text inputs…), event listeners, etc. using a pretty simple & easy-to-understand language. Then call your existing/slightly modified C code from those event listeners or wherever it fits.

I'm not saying it's going to be trivial; you still have to learn a small subset of C++. However, the amount you have to learn is very small compared to the size of the whole language.

If all of this is viable at all I cannot tell, of course, as I don't know your original application. I'm just saying: if you're not ready to let your app go & are up for a little adventure, this might be worth taking a look at, e.g.

https://doc.qt.io/qt-6/qmlapplications.html
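For flavor, a minimal QML sketch along those lines (Qt 6 Quick Controls; `backend` stands in for a hypothetical C/C++ object that the host program would expose to QML):

```qml
// main.qml -- a classic window with a menubar and a button, declared in QML;
// the logic stays in C/C++ behind the assumed 'backend' object.
import QtQuick
import QtQuick.Controls

ApplicationWindow {
    visible: true
    width: 400; height: 300
    title: qsTr("Sketch")

    menuBar: MenuBar {
        Menu {
            title: qsTr("&File")
            MenuItem { text: qsTr("&Quit"); onTriggered: Qt.quit() }
        }
    }

    Button {
        anchors.centerIn: parent
        text: qsTr("Compute")
        onClicked: backend.compute()  // call into existing C code via a thin C++ wrapper
    }
}
```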

GNOME policy pushed into GTK3 and newer

Posted Feb 2, 2026 9:46 UTC (Mon) by taladar (subscriber, #68407) [Link] (1 responses)

I would argue that a complete change of code architecture like that, even ignoring the requirement to learn C++ and Qt to do it, is probably too much effort for any application that is so low on developer resources that they are still stuck on Gtk 2 at this point.

GNOME policy pushed into GTK3 and newer

Posted Feb 2, 2026 10:34 UTC (Mon) by mbunkus (subscriber, #87248) [Link]

Given that any way of keeping that application alive involves a significant amount of work, all I wanted to do was point out a route I've gone down before that is a kind of middle ground between "rewrite everything" and "keep everything as-is", onto a platform whose track record for stability and compatibility is quite a bit better than GTK's, instead of simply saying "well, just give up, then".

I'm not trying to evangelize, just offer more options. Maybe this is way too much work. That's perfectly fine! Maybe they decide to convert to GTK 3 after all. That'd be great! Maybe they'll indeed sunset the program. An absolutely viable and understandable choice!

The elephant in the room with GTK2.

Posted Feb 7, 2026 10:06 UTC (Sat) by athenian200 (guest, #182216) [Link] (2 responses)

I've worked for a very long time on a project that still supports both GTK2 and GTK3 versions, and may well never be able to move to GTK4. I have had to interact long-term with users who are relentless in demanding that we continue to provide a GTK2 version and view even GTK3 with suspicion. I think I've come to a conclusion about why GTK2 is dying so slowly, and also why I think the situation is unlikely to improve.

Most of the people who love GTK2 are those who use a keyboard and a mouse on a PC with no touchscreen. Newer versions of GNOME (and hence GTK because the GNOME team does most of the work on it) are not really improvements on the previous ones for the needs of users who still use computers essentially the same way they did around 2010 or so, and if anything are alienating to those users because they are designed primarily around touchscreen-centric use cases, and possibly the UI expectations of users who grew up with Android and iOS before they ever touched a standard x86-64 PC.

It's essentially the Linux equivalent of the Windows 8 problem, where Microsoft put everything they had into adapting to touchscreens and the mobile ecosystem, neglecting traditional desktop users and expecting them to go along with the program while they attract a different kind of user and grow their brand. This resulted in many Windows users sticking with Windows 7 as long as they could. GTK2 became for a long time the Linux equivalent of Win32 (essentially taking on a niche formerly occupied by Motif in the Unix world), probably much to the chagrin of GNOME's developers and fans of other toolkits, and that situation probably isn't sustainable in the long-term. It also connects with the frustrations of those annoyed by mobile-first (or as I call it, desktop-last) web design.

Now, from the perspective of most open-source developers, working on something like GTK2, or a modern equivalent, isn't interesting because it's seen as a problem that was solved long ago, not a new challenge, and plus there's not a lot of money in it. What most of those users actually want is essentially boring security updates and tooling updates to something that's still working fine for them, not fundamentally new paradigms or a big change. They want essentially the same kind of stability a lot of big enterprises want, only they aren't in a position to pay a lot of money for it.

So there's fundamentally a mismatch between what the average open-source developer is interested in working on and where the money is going on the one hand... compared against what a lot of traditional non-touchscreen desktop users actually want Linux to be. Since most of these users have neither the coding skills nor the money to change the course of things, the river of money and coding skill will likely eventually drag those users kicking and screaming in a direction they loathe, but they will go as slowly as possible and fight like they're cornered, possibly winding up on things like LTS releases of Linux distributions, or forks of forks maintained by a very small number of people.

I think in the end, we'll see all kinds of coping mechanisms. Some projects will bring GTK2 in-tree out of desperation and keep trying to make it work in some form, evolving it in project-specific directions. Some will move to GTK3 and use that into the ground until GTK5, some may switch to different toolkits entirely.

I don't think the problem will get better, because as far as I know, there just aren't enough talented developers interested in writing the modern answer to GNOME 2 and GTK for Wayland and providing a path for those who just don't want modern, touch-centric desktop paradigms and instead prefer a traditional experience. I think if something like that existed, people might be willing to put in the work to migrate, but as it stands, they prefer to just stick with the old thing as long as they can, because they don't actually need or want anything newer GTK provides; they're just stuck, because security updates and tooling issues are forcing their hands and requiring them to "be practical" and accept something that, from their perspective, is worse in every way other than simply being actively developed and supported. In the end, it's just going to be a lot of very unhappy users, resisting what's new and supported as long as they can, and sooner or later having to struggle to make something work for them that was never designed around their needs.

I genuinely feel bad for everyone involved... the upstreams who just want to get on with developing the future and feel bogged down by people asking for legacy support. The end-users who just want to have cool themes and use their desktop with a keyboard and mouse like always, and don't understand why everything is locked down, limited, less customizable, and designed for touchscreens now. Not to mention the distro maintainers caught in the middle, who know upstream doesn't have their back, they can't maintain these packages themselves, and that the longer they wait to disappoint the users, the more work they take on, and the worse the bitrotting and security issues are going to be. And I honestly see this more as a slow-moving train wreck than as anything that will lead to a happy ending for most people involved.

The elephant in the room with GTK2.

Posted Feb 7, 2026 13:03 UTC (Sat) by pizza (subscriber, #46) [Link]

> Newer versions of GNOME (and hence GTK because the GNOME team does most of the work on it) are not really improvements on the previous ones for the needs of users who still use computers essentially the same way they did around 2010 or so, and if anything are alienating to those users because they are designed primarily around touchscreen-centric use cases

GTK3 and GNOME3 were not designed around touchscreens and predate the stratospheric rise of modern touch-based smartphone UIs. Indeed G3 was explicitly designed for keyboard-centric (if not keyboard-exclusive) use, and works far far better than G2 in that respect.

G3's primary goal was minimizing distractions; acknowledging that one-fullscreen-application-at-a-time was how most folks actually used their systems, so they made multiple dynamic desktops a first class feature to make switching between said applications easier, along with the overview for a quick way to see everything, and navigate between or launch new things.

Other stuff, like putting controls/buttons into the title bars, was mostly driven by wide aspect-ratio screens with poor vertical resolution (e.g. the 1280x720 on relatively large screens that was the unfortunate baseline until pretty recently). UI elements that make touchscreens easier (e.g. larger buttons, thick window titles, elimination of window decoration, etc.) had their origins in accessibility -- larger (and fewer) targets are easier to hit with a mouse when you have poor motor skills (or a touchpad!), and so forth.

> Now, from the perspective of most open-source developers, working on something like GTK2, or a modern equivalent, isn't interesting

It's not a matter of _interesting_ so much as that porting an application from GTK2 to GTK3 requires a ton of _work_. And folks reasonably expect some sort of compensation for that work. (Of course, "learn new marketable skills" can be one of those rewards, but in that respect it makes more sense to learn a web-centric UI toolkit instead.)

Meanwhile, dragging GTK2 into the Wayland era would require substantially rewriting most of it and dropping the rest. In other words, a process similar to the one GTK3 already went through.

(As an aside, GTK3 also predates Wayland, so you can't blame Wayland for its design either. But the design of both was influenced by many of the same failings of X11, which led to moving entirely to client-side rendering/decorations, among other things. GTK2 had a lot of X11-isms baked into it, and it was/is common for GTK1/GTK2 applications to have their own direct interactions with the X server. This is one of the things that made GTK1/GTK2 applications such a portability nightmare.)

> What most of those users actually want is essentially boring security updates and tooling updates to something that's still working fine for them, not fundamentally new paradigms or a big change. They want essentially the same kind of stability a lot of big enterprises want, only they aren't in a position to pay a lot of money for it.

I'd correct that last bit to read "only they aren't willing to pay _any_ money for it."

> I think if something like that existed, people might be willing to put in the work to migrate, but as it stands, they prefer to just stick with the old thing as long as they can, because they don't actually need or want anything newer GTK provides

To analogize, their comfortable middle-class lifestyle has been heavily subsidized by other folks' hard work. Now that those other folks have largely moved on (often in a very _final_ sense) these users are finding out just how much work is no longer getting done.

The elephant in the room with GTK2.

Posted Feb 7, 2026 15:38 UTC (Sat) by Wol (subscriber, #4433)

> and if anything are alienating to those users because they are designed primarily around touchscreen-centric use cases, and possibly the UI expectations of users who grew up with Android and iOS before they ever touched a standard x86-64 PC.

Setting aside pizza's comment about how GTK2/3/4 weren't primarily designed for touchscreens...

Quick question - how do you force shutdown a modern laptop?

My first laptops had two catches underneath - release the catches and the battery fell out!

Then I discovered that, for newer laptops, you had to press and keep the power button down, and after about 5 seconds it would "clonk" off.

Now, when I try that on my work-supplied Dell laptop - with an old-fashioned, non-touch screen - I get a big screen that says "pull down here to power off". On a non-touch screen!?!?!?

Okay, yes, I know you simply keep the power button pressed even longer, but seriously? If you're trying to force the laptop off because it's wedged, why do you have that stupid "pull down" screen in the first place!?!?!?

Cheers,
Wol


Copyright © 2026, Eklektix, Inc.
This article may be redistributed under the terms of the Creative Commons CC BY-SA 4.0 license
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds