I think it makes sense to try to cycle the majority of the dependency network to the new versions, but I also think it probably makes sense to let the old toolkits live on indefinitely (maybe strategically drop the WebKit stuff; everything else should be low-maintenance) for a variety of reasons, particularly old commercial software.
It is comical in some sense that a Motif app of yore would've been a great "investment" in terms of still functioning now and for the foreseeable future... sort of like Win32/MFC.
> Motif app of yore would've been a great "investment" ... sort of like Win32/MFC.
I've slowly started getting back to desktop Linux, and honestly Motif (mwm) is fine for my needs. Maybe fvwm would be slightly better.
If I were to do my career over, I'd consider going for C++ and Win32; it seems like a stable, performant platform (I get that there are other problems on modern Windows).
Oh, I'm pretty sure I've seen that a long time before. Funnily enough, there was a time when, if you had both an HFS+ partition for Mac OS X and an ext2/3 partition for Linux, the only way to share files freely between the two was through a FAT32 partition.
I'm learning Motif just for fun with "Vol6a" and it's quite a bit easier than Xaw. When Motif was proprietary,
GNU users did astounding work with LessTif, since programming in Xaw was far lower-level and more prone to bugs because you had to set up more things yourself.
Also, Xaw3D was a good effort at mimicking Motif widgets, but it had the exact same API as Xaw.
And, finally, in the late '90s Tcl/Tk got a big boost on GNU/Linux distros because it was a quick way to put a UI over some C software.
If you like Motif, you might like EMWM too. With the suggested addons and extras, it's like a mini 4Dwm from IRIX minus the launch dock; although you don't need it, as the toolbar is more than enough.
A pity Mosaic is proprietary; adding auth support for NNTP is something every C novice could do in under 2 minutes with a printf (and I did). But there's some nice Tcl/Tk software which just requires tcl-tls and a plain Tcl/Tk install:
EMWM looks fun. I've got a NetBSD laptop running CDE for nostalgia and it is surprisingly a joy to use. I don't think CDE has undergone the same protocol modernization and doing so would be a lot more work there.
"Ubuntu developer Simon Quigley laid out the plans for hoping Ubuntu packages will move from Qt 5 to Qt 6 so that by the time of the Ubuntu 26.04 LTS cycle in early 2026 that the older version of this graphical toolkit can be removed."
On the other hand, why does Ubuntu always make its craziest changes in the LTS? The LTS should be boring, and the one after it should be the wild one. (That may not be what the plan is here, but it does sound like it.)
I'm guessing it's because everything that ships in the LTS must be maintained for 12 years. I understand Canonical's desire to not still be maintaining Qt 5 in 2038.
Though the ideal time to make this change would be in 25.10.
I think GP is intending to have it work as follows:
If Canonical would like to remove Qt 5 from LTS vN, they need to remove it in the first non-LTS release after LTS v(N-1). That way there is a lot of time to adapt before vN, which increases the chances of it actually happening at vN.
The value of LTS, and the reason companies spend tonnes of money on support contracts for those 12 years, is exactly this though: the set of libraries shipped in 26.04 will still be supported 12 years later without major backward incompatible changes. The very reason people would pay to keep using 26.04 in the mid 2030s and beyond would be that the software they built or bought in 2026 keeps working, typically without even having to recompile.
You may argue that those customers should just update their software to use new versions of libraries or frameworks or replace discontinued libraries with new ones, or pester their vendor to do those things, or pay a bunch of money to consultancy companies to do those things; and you may well be right, but as long as those companies won't do that and are willing to instead pay Canonical, why wouldn't Canonical accept the money?
Vendors don't always want to do this. I've seen 6-figure instruments stop working after a major update, and when we go to the vendor for an updated version of the software, their response is "we don't support that hardware anymore; buy a new instrument."
In enterprise settings 12 years is not quite enough. There are racks of boxen set up with custom software targeting a particular distro version and there is no money or value in making changes to make it compatible with something newer.
Hardware gets upgraded on 5-7 year cycles, but the software… isn’t happening.
I think this is reasonable. Plasma 6 recently appeared in Debian Experimental, Qt 6 is already in the Stable branch, and a new Debian release seems likely in the coming year. I expect the folks managing Ubuntu just want to make sure their non-Debian stuff is ready for the upstream change.
Forcing modernisation of some applications - (probably) especially GUI ones - is overall a positive for the platform, as it also identifies projects which are under-maintained and allows for the overall modernisation / forking / maintenance that is needed.
I wouldn't expect to (easily) run PHP 5 code on Ubuntu 26.04, either. Applications that refuse to modernise can build Qt 5 themselves.
> and allow for overall modernisation / forking / maintenance that is needed.
Or, it allows for pointless churn, extra work, and breakage of apps that were still functional until someone yanked their dependencies. This kind of thing is why software rot is even a thing and people remarking that Win32 Is The Only Stable ABI on Linux ( https://blog.hiler.eu/win32-the-only-stable-abi/ ).
Alternatively, it is the best way for the year of the Linux desktop to never come: major applications are constantly being totally reworked around the latest shiny stuff instead of just being polished, with all the small bugs fixed and features improved in the app.
Since you can identify the projects without mandatory deprecation, what does deprecation give you?
Also, resources don't magically appear just because a resource constraint has been identified; that fact is a given and obvious to everyone involved.
That's why I stick to using sh. Wait no, bash... but make sure it's got all of the features from the 5.2 release... unless it's when I'm using zsh in which case we'll need to change some syntax around. For interactive use the program has an ncurses 6 interface, if you want to manage it that way, so make sure you aren't still back on ncurses 5 and have an xterm-256color terminal!
Jokes aside, stdout is more like the X11/Wayland comment than anything... and even then it's been what, >30 years with one backwards compatible swap there?
Where's the joke? I do in fact target somewhat oldish POSIX sh when writing shell scripts; I want to write a script once and not have to worry about where it will run, and sh is basically universal.
The joke was the over-the-top-ness of the example, showing how CLI solutions have lacked forward compatibility as often as GUI solutions have. The latter parts, about how both CLI and GUI have decent examples of long-term backwards compatibility, were serious.
I don't think it works that way either, though. AIUI the problem is that Qt 5 apps don't automatically work on Qt 6 (otherwise this would be a non-problem; Ubuntu would just ship Qt 6 alone and everything would work). But none of the shell examples given have that problem; AFAIK virtually every script that worked on Bash 4 works on Bash 5 - as does virtually every script that worked on any other older version of Bash, and every script that targets plain sh. If you wanted to make that argument, the ncurses suggestion by https://news.ycombinator.com/item?id=42020272 is probably more compelling than the shell itself, which is amazingly backwards compatible.
Sure, I'll grant that the particular example of bash vs. sh is probably a poor one to focus on, in that it doesn't demonstrate every point - just as SDL 1.2 to 2.0 wouldn't, due to sdl12-compat (leaving some room for interpreted-vs-compiled differences not to count; otherwise we'd have to restrict the conversation to scripted GUIs instead of Qt, or compiled shell commands instead of scripts). The other example, ncurses 6 vs. 5, is better, as it's a compiled interface where the ABI broke, but that was already in the comment. Or take the other example, zsh vs. bash, where there are often changes that take some porting work, but not much.
More than each of the specific examples, the main point was meant to be more around things like stdout being comparable to display server protocols which do behave about the same in compatibility. Specific apps end up doing their own thing with compatibility regardless which interface they output on. One of the examples I chose was a poor example for that and seems to have distracted from that point, mea culpa. Another commenter noted Python might have been a better example of scripting compatibility over time.
You're right there, I should have used Python on the CLI side as the example. Bash was a poor choice, as it doesn't work with all of the points in one example. GUIs also vary in their out-of-the-box compatibility; Qt 5 to Qt 6 is similar to Python in that it doesn't just work (without some effort).
The lessons from (Qt 3 to Qt 4 and) Qt 4 to Qt 5 have been learned, and moving a large project from Qt 5 to Qt 6 is comparatively not that hard. There are a few minor deprecated APIs to handle, and it's relatively easy overall.
I even have a stable project that is compatible with Qt 5 and Qt 6 [1], all in a single code base (particularly thanks to the effort of the qtpy [2] library). It's not that hard, and my opinion includes C++ in that assessment.
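For anyone curious what "one code base, two Qt versions" looks like in practice, here is a toy Python sketch of the binding-fallback pattern that a shim layer like qtpy automates. This is not qtpy's actual implementation, just an illustration of the idea; it assumes neither binding is necessarily installed.

```python
# Sketch of the pattern a shim like qtpy automates: probe for the Qt 6
# Python bindings first, fall back to Qt 5, and report which one the
# rest of the application should import through.

def pick_qt_binding():
    """Return the name of the first available Qt binding, or None."""
    try:
        import PyQt6.QtWidgets  # noqa: F401 - Qt 6 bindings
        return "PyQt6"
    except ImportError:
        pass
    try:
        import PyQt5.QtWidgets  # noqa: F401 - Qt 5 bindings
        return "PyQt5"
    except ImportError:
        return None  # no Qt installed; a real app would abort here

if __name__ == "__main__":
    print("Qt binding in use:", pick_qt_binding())
```

qtpy itself goes much further (it also handles PySide2/PySide6 and papers over renamed modules), but exposing a single namespace that resolves to whichever binding exists is the core of how one code base spans Qt 5 and Qt 6.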
You'll be hard-pressed to find many CLI apps that still run properly on non-VT100 terminals, but back in the '80s and '90s software usually ran fine on a tvi912, wyse30, zenith22 AND vt100s as long as the termtype was set right.
Part of the benefit here, though, is that everyone has LOTS of options. Go run Debian or an LTS of something else. App packagers can use AppImage or Flatpak. People can install Qt 5 from a PPA, I'm sure.
The key thing here is that it lets packagers keep focusing on more impactful things than babysitting projects that can't/won't upgrade, especially when ending that babysitting frees up a lot of brainpower AND the work is reasonably easily picked up by people who still need it.
I hate dynamic linking, and this is one of the reasons. APIs should be small, self-contained, and should not break backwards compatibility (except for security reasons). Libraries provide much more extensive features and, as such, need to change a lot more often.
APIs are: Linux system calls, the std{in,out,err} convention, termcap/terminfo, X/Wayland, D-Bus, PulseAudio/PipeWire etc.
These change very rarely, and when they do they generally offer compatibility layers (XWayland, pipewire-pulse).
But dynamic linking makes every single library an API that can break!
Qt isn't dying off. They're removing old versions of it from the distribution repositories. (Or well, trying to, anyways.)
Yes, it's neat that you can still compile and run ancient Win32 programs on modern Windows, but maintaining all of that compatibility is a burden Microsoft was able to bear. The open source community is still struggling to try to provide a compelling desktop experience without this burden.
Can you run ancient old Linux graphical software? Well, to an extent... Yes. Is it easy? ...No. At least, it's certainly not as easy as installing stuff from your OS vendor directly. As far as Wayland goes, there are no plans to get rid of XWayland any time soon. I'd wager it will remain in the toolkit for decades to come, and if anything, XWayland support keeps getting better rather than worse; just look at what KDE has been doing.
Usually, when I want to try to run something really old I'll use the Docker debian/eol [1] images. Getting X11 apps to work inside of Docker requires a bit of fiddling, but once you have it working, it's relatively straightforward to install or build and use software inside them.
The Linux desktop is necessarily going to keep going through large breaking shifts because well, it sucks. It needs to go through breakages. When the new thing also seems to suck, that hints that it might need to continue to break more. It's going to take a while to iron things out, especially because designing systems is hard and sometimes new ideas don't wind up panning out very well. (This has caused a lot of trouble with Wayland where a lot of ideas, indeed, did not pan out super well in practice in my opinion.)
That said, I think we can do much work to improve interoperability with older software and ensure that it can continue to run in the future. But it doesn't have to, and really shouldn't, work by keeping everything else from moving.
I find it amusing that there's this constant assertion on this forum that the Linux desktop sucks, yet many of us have been quite happy with it for decades, and the major problems tend to be with new stuff that seems to be added just to do something different.
The Linux desktop is pretty great, for the most part.
I would say that in addition to a usable desktop, Linux distributions generally also provide a more stable Windows environment than Windows itself does. I have quite a suite of games in Steam and GoG, and I find that many older games that simply cannot be coaxed into running on Windows can be made to run happily on one version of Wine/Proton or another.
There are still some weak areas in desktop Linux if you're running a desktop that isn't Gnome or Plasma - wifi configuration comes to mind immediately, as does sound configuration. Bluetooth can be picky if you don't know to get all the non-free firmware.
But really, we're living in a dream era of Linux desktops, if you happen to have lived through the early days of `startx` and XFree86. Hands up if you ever smoked a monitor coil with a bad X config file...
Coming from Windows and Mac, the multiple monitors experience on Linux is terrible, especially when the monitors have different refresh rates or different resolutions. It's getting a bit better with Ubuntu 24.10 but still not close to the smooth experience I have on Windows and Mac.
Recently I needed to switch to Wayland due to my monitor setup. Then I also needed to write a small program that captures an application's screen, which is a stupidly simple thing to do on Windows, Mac, or even on Xorg. For Wayland I had to jump through a bunch of stupid hoops just to get it kinda working...
Having to deal with windows on Windows regularly being placed on a monitor which didn't exist, and windows on macOS restoring to weird places... None of them are great. You get to choose your preferred set of problems.
I have been using Linux as my primary operating system for two decades, and I think the Linux desktop sucks. I strongly disagree that my problems with it are largely the result of "new stuff that seems to be added just to do something different".
When I first started using Linux, there was no kernel modesetting and XFree86 ran as root. I had to manually configure my graphics and display settings by editing Xorg.conf in nano, or sometimes I could get YaST to do it. Multi-head output was not well-supported, and you often had to restart to reconfigure your outputs. Installing the NVIDIA driver was a straight-up pain in the ass. (Some things never change.) X.Org eventually improved and added hotplugging.

The Linux desktop stack went through a lot of "new stuff": new versions of DRM/DRI, the addition of hotplugging, the introduction of KMS and libinput. Dynamic input configuration was added with the `xinput` command. The XInput2 extension was developed to greatly improve touch and pen tablet support. XRandR made it possible to dynamically arrange displays, replacing Xinerama. UI toolkits needed to be rewritten to support scalable rendering so that we could have high-DPI support. Most recently, multiple parts of the desktop stack, from kernel to userland, had to be overhauled with many new protocols to support explicit synchronization for graphics tasks and buffers.

All of this required major shifts, and that broke users' workflows at some points. Plenty of people were broken by the gradual transition to libinput, from graphics tablets to Synaptics touchpads; a lot of work had to be redone in order to overhaul the input system. Only recently are there Linux laptops where the out-of-the-box touchpad experience is good enough to rival a MacBook, and it's certainly not all of them.
Don't get me wrong, I loved KDE 3. And despite all of its flaws, I still prefer Linux greatly over the other options available. But would I go back to using Linux 2.6 and KDE 3 etc. today? Fuck no, we've improved so many things since then. And yes, no question that at times things have felt like they've been moving backwards in some regards, but it really isn't for nothing. The fact that modern desktop Linux trivially hotplugs displays with different DPIs and refresh rates (and potentially variable refresh rates) is genuinely something that took a herculean effort. All of that work required massive rewrites. It required kernel changes, graphics driver changes, major UI toolkit changes, redesigning apps and rewriting components.
And yes, I admit I am not the biggest contributor to open source or anything, my patches are mostly unimportant. But, when I hit something that sucks in the Linux desktop, I do try my best to see if I can't do something about it. I have contributed random little bits here and there to various projects I care about. Most recently, I've been doing work to improve some situations where thumbnails in KDE are suboptimal. There's easily thousands of these little small issues that make Linux worse to use on desktop, we've got plenty of work to do.
And I make it work. Except for work-owned computers, all of my machines run Linux, and I have a fair number of them. I do not have a single box I own personally that boots any other kernel or any other desktop. I know my way around Wine, even enough to make occasional, if small, code contributions. (For example, I made a quick patch when graphics tablets were not working in Wine under XWayland.) So I'm not suggesting that it's unusable, but I can't recommend the Linux desktop to random people. It's been getting closer, but a lot of the reason it's getting closer is because of some of the efforts that I assume you'd complain about.
> Yes, it's neat that you can still compile and run ancient Win32 programs on modern Windows, but maintaining all of that compatibility is a burden Microsoft was able to bear. The open source community is still struggling to try to provide a compelling desktop experience without this burden.
I always find it interesting how people talk about this as if Microsoft is just flipping a button rather than spending many millions of dollars on engineers to maintain compatibility. I’m reminded of the systemd arguments where the number of people who are vehement about the old ways being better just never seem to have time to show up to support them. This kind of stuff is expensive and the way to influence the decisions is to show up and chip in.
> just never seem to have time to show up to support them.
They're supported just fine. The authors feel no need to bend over backwards to fulfill oddball distribution requirements.
> This kind of stuff is expensive
Yet it all came into existence from open source developers who were paid nothing. Then a bunch of commercial distributions appeared. That's why it's expensive. They're twisting the community for their own profits rather than for broad improvement of the system as a whole.
> to influence the decisions is to show up and chip in.
I don't want to influence things or be required to. I would prefer if people who had "influence" just weren't part of the community. They're exceptionally disruptive and often not focused on users but on their own stature.
> The authors feel no need to bend over backwards to fulfill oddball distribution requirements.
This is close to the truth in my opinion, but slightly off in two ways:
- It's not just distributions, it's also other software, especially desktop environments.
- The "oddball requirements" are not so odd and mainly based on real world problems users run into.
There are problems that SysVinit will never try to solve, yet there is no other obvious place to solve them except for pid 1, so systems that continue to run SysVinit will just never have solutions for those problems. And I get it: you don't care. However, the people who actually work on free software often do. You don't have to love systemd, D-Bus, Polkit, UPower, UDisks, PipeWire, Wayland or any number of other pieces of software, protocols, or specifications, but you have to be in pretty strong denial not to see what led to them. The entire Linux desktop can't be built on janky shell scripts that parse CLI tool output forever. Eventually, people want to move on and build more complete solutions to problems.
Distributions exert pressure on the ecosystem because they're the ingress for all of the user pain. A vast amount of issues reported go through them first, and then it's up to them to figure out what to do about it. So if you have a problem with anyone, it's probably not actually the distributions themselves, but the kinds of things the users of those distributions are complaining about.
> Yet it all came into existence from open source developers who were paid nothing. Then a bunch of commercial distributions appeared. That's why it's expensive. They're twisting the community for their own profits rather than for broad improvement of the system as a whole.
This is bizarrely counter-factual. There are community contributors but an awful lot of open source is contributed by people who are paid to work on it (which, to be clear, is awesome!), even in the case of Debian.
It’s also odd in the context of Qt which has been commercially supported since the beginning when Trolltech first released it in 1995. There’s always been the idea that if you want long-term support for old versions you should pay for it rather than expecting the open source community to spend time on old code.
> I don't want to influence things or be required to. I would prefer if people who had "influence" just weren't part of the community. They're exceptionally disruptive and often not focused on users but on their own stature.
You’re welcome to disrespect them but they have influence because they show up: in open source your voice carries weight in proportion to your contributions. It’s also inaccurate to say they’re not focused on users: you might not think you need a given feature but you’re not the only user out there. Alternatives are there but most people aren’t using them because they don’t feel the need to switch as strongly as you might.
> in open source your voice carries weight in proportion to your contributions.
And your contributions grow in proportion to your income, and a lot of people get their income from companies who have something other than the best interests of Linux users at heart, and would rather steer it in a direction that is most profitable for themselves.
That's certainly fine for open source, but that's not fine for these distros as a whole, at least the ones that portray themselves as a service to the public.
> It’s also inaccurate to say they’re not focused on users: you might not think you need a given feature but you’re not the only user out there.
If your voice carries weight in proportion to your contributions, users are not part of that equation. Users enter the equation if they're the users that are desirable for the people who are paying contributors.
> Alternatives are there but most people aren’t using them because they don’t feel the need to switch as strongly as you might.
And because people who insist that Linux move in a particular direction immediately start breaking things that used to work, and making their new thing a dependency. Then the alternatives have to spend all their time shimming that stuff, and the best they can hope for is to almost keep up. The prize for becoming an alternative is having to become an expert on the new thing that you disapproved of, or fall behind. You never get time to work on more parsimonious solutions to the problems the new thing claimed to solve.
> rather than expecting the open source community to spend time on old code.
I don't expect them to "spend time" on it; how about just not deprecating it prematurely? Leave users a choice. Then again, this is why I wouldn't use most of the distributions: they left user choice on the floor a decade ago.
> in open source your voice carries weight in proportion to your contributions.
Their voice should not carry any weight. It's not at all clear that people who contribute code are good at predicting or managing future organizational outcomes correctly. It also creates a model where those who develop out of passion rather than for a paycheck are significantly disadvantaged for no meaningful reason.
Every engineering society that started with this mechanism has replaced it with something more rationalized. It's the only way to create sufficiently advanced structure and to ensure that new ideas are quickly adopted and made available.
> It’s also inaccurate to say they’re not focused on users
The mere existence of this thread ruins this thesis.
> but most people aren’t using them
Define "use." As in, made a conscious effort to get a specific piece of software and use it? Or just ended up with the default piece of software their distribution chose for them? Are you sure you're measuring the correct variables?
In any case, I just don't appreciate this bullying tone, where users have to "shut up and write code" or "just put up with whatever we hand you" whenever the _slightest_ bit of feedback or requests are made on behalf of legacy users and systems. Which might be fine, if the people busily writing code all day weren't intentionally making it more difficult to do just that.
It happens more in other communities but it is absolutely noticeable. It's not enough to simply create a new good project, there's an apparent need to destroy any prior older projects and actively prevent people from using them.
Which is why your entire "voice = influence" model is a problem.
Well, that's because deep enterprise users have closed-source software that cannot be updated (pick any reason; it doesn't matter which one) and they need that compatibility. MS needs to keep them vendor-locked to keep making money off them.
That's not something that happens in the Open Source Software world.
Yes. My point was just that it's not free, but it's weird that some people expect the same level of support from open source projects which don't receive anything like that much money.
> The open source community is still struggling to try to provide a compelling desktop experience without this burden.
What exactly is a "compelling desktop experience?" It seems to me that when people say this they really mean "win the popularity contest against Microsoft." I'm not sure that's at all a worthwhile or laudable goal.
> Is it easy? ...No.
It's incredibly easy. It just depends on what distribution you use. For example, on Gentoo, this is not at all a problem.
> When the new thing also seems to suck,
This is a hint that you got something wrong. Remember when desktops used to be configurable? I guess letting the user control their own environment stopped being "compelling" when Microsoft decided so. Do we have to follow suit?
> What exactly is a "compelling desktop experience?" It seems to me that when people say this they really mean "win the popularity contest against Microsoft." I'm not sure that's at all a worthwhile or laudable goal.
I don't care about Microsoft or Apple. I don't like Windows and I don't like macOS. I care about my desktop. The one that I am typing in. I really, really hope I don't actually have to explain all of the problems with the Linux desktop experience right now, I don't have the energy. I've helped a lot of people try to make Linux work for them and it's a soul-crushing experience every time because I often have to explain that there is in fact, no great options for them right now because shit is simply broken. The X11 and Wayland situation is a perfect example: "Oh no problem. You can just choose between your windows scaling properly and your high refresh rate monitor working, and being able to actually use screensharing during meetings." Sure it's getting better, but we've got a lot of work to go, there's no sugarcoating it.
> It's incredibly easy. It just depends on what distribution you use. For example, on Gentoo, this is not at all a problem.
Look, if the person has to be proficient enough to be able to install and operate Gentoo as their primary desktop operating system, I'm pretty sure they do not need my help when it comes to doing virtually anything with Linux. It's also pretty easy to run old versions of software on NixOS but I'd argue this is basically cheating.
What's hard to do is take any of the most popular distros and run something from 1999. (I'm pretty sure that's actually generally hard to do on Gentoo as well, but whatever.) That's what I'm really talking about.
> This is a hint that you got something wrong. Remember when desktops used to be configurable? I guess letting the user control their own environment stopped being "compelling" when Microsoft decided so. Do we have to follow suit?
What exactly about modern KDE isn't configurable enough? I use both KDE and SwayWM as daily driver desktop setups and never felt hindered much by the available configuration options. If anything, KDE offers so much customization that it's probably sometimes detrimental.
Configuration options come at a cost, and that's exactly why modern software has less of it. We want higher quality software, but it's hard to make software more robust when you have trillions of code paths and most of them are almost completely unused. This stuff adds up. In a similar vein, I hate to see Linux lose support for old hardware, but if something's been broken in mainline for five years straight, I'm pretty sure they have every reason to assume it's not being used by anyone.
What I'm really talking about with "the new thing also sucks" is more reflecting on the growing pains of things like Wayland and PulseAudio (and a bit with Pipewire though less so.) I am of course not saying that the same thing doesn't happen with desktop environments, personally I don't like GNOME 3, but also in the same vein, it'd be hard to argue GNOME 3 has been getting worse this whole time, it offers much more customization and is a lot more robust than it was when it first launched.
> I've helped a lot of people try to make Linux work for them
Okay, do you care about _your_ desktop, or having as many people use Linux as a desktop as possible? Have you considered that these two facts may actually be at odds with each other?
> We want higher quality software
We're getting further and further away from your desktop.
> Configuration options come at a cost, and that's exactly why modern software has less of it.
Wait.. so software got less expensive to develop.. and yet we're concerned about configuration options.. so they're being _reduced_ from what they _historically_ were?
> but it's hard to make software more robust
This is just an exercise in goal post moving at this point.
> Okay, do you care about _your_ desktop, or having as many people use Linux as a desktop as possible? Have you considered that these two facts may actually be at odds with each other?
How does trying to help people use Linux have anything to do with Windows or macOS? I don't tell people to use Linux, I tell people not to use Linux, because it sucks. When people do it anyways, they frequently come to me for help because I have more Linux desktop experience than is humanly justifiable.
> We're getting further and further away from your desktop.
My desktop has software on it.
> Wait.. so software got less expensive to develop.. and yet we're concerned about configuration options.. so they're being _reduced_ from what they _historically_ were?
In what way did software get less expensive to develop?
> This is just an exercise in goal post moving at this point.
The goal posts are in your head. I'm not playing some kind of game. You are.
Both in the sense that Win32 GUIs (introduced in Windows 95) are a bit out of style but are still supported and widely used; and Powershell being a major departure from traditional CLIs.
I guess that's what I get for being so terse - I actually meant it as a counterexample. Windows is a good illustration of what happens when you avoid deprecating older systems: you end up with multiple design philosophies and interfaces layered on top of each other, leading to a less cohesive user experience overall.
Canonical, the company behind Ubuntu, could easily port the remaining Qt5 apps to Qt6 by the deadline. But they'd rather wait for the community to do it for them.
What is happening though is really long support for older releases being provided by Canonical so you will be able to use older Qt 5 based apps on Ubuntu, security supported, until 2034, on Canonical's dime. With container technology (something that Canonical significantly funded, with Docker originally being based on Canonical-funded LXC) you'll be able to run those older apps on a newer base OS including future ones, too.
Those are huge contributions to the user community you seem to be discounting.
Disclosure: I work for Canonical. I don't speak for Canonical. But what I say above are simple, verifiable, and falsifiable facts, and not really a matter of opinion.
Seems like the root misstep is that when extending server support to 10 years, they extended GUI app support as well. The two probably shouldn't be linked; GUI support should end with the desktop edition at five years.
I sure as hell would prefer to not have to maintain Qt 5 for 13 more years if I could avoid it. I don't think this has anything to do with "being in cahoots with the Qt company", I think it has everything to do with "backporting fixes from a recent Qt 6 version to an older Qt 6 version is easier than backporting fixes from Qt 6 to Qt 5".
I have always avoided Qt for any projects because of the incomprehensible licensing model that could very easily create a legal minefield for your product.
I think it makes sense to try and cycle the majority of the dependency network to the new versions, but I also think it probably makes sense to let the old toolkits live on indefinitely (maybe strategically drop the WebKit stuff; everything else should be low maintenance) for a variety of reasons, particularly old commercial software.
It is comical in some sense that a Motif app of yore would've been a great "investment" in terms of still functioning now and for the foreseeable future.. sort of like Win32/MFC.
> Motif app of yore would've been a great "investment" ... sort of like Win32/MFC.
I've slowly started getting back to desktop Linux, and honestly Motif (mwm) is fine for my needs. Maybe fvwm would be slightly better.
If I were to do my career over, I'd consider going for C++ and Win32; it seems like a stable, performant platform (I get that there are other problems on modern Windows).
Wine is the stable ABI on Linux.
For those who don't know the reference: https://blog.hiler.eu/win32-the-only-stable-abi/
Oh, I'm pretty sure I've seen that a long time before. Funnily enough, there was a time when, if you had both an OS X partition for Mac OS and an ext2/3 partition for Linux, the only way to share files freely between the two was through a FAT32 partition.
Are you referring to exFAT?
That's only been around since 2009 on Linux and Snow Leopard in 2010. In the years before, you had to use FAT32.
> Wine is the stable ABI on Linux.
When it works, yes. Unfortunately, the quality of open source software, especially of big projects, has gone downhill in recent years.
I'm using mwm right now, and a motif build of emacs.
EMWM here, with a Motif build of Vim too.
https://fastestcode.org/emwm.html
So there must be at least three of us. Hi!
Hello. BTW, if there's no Motif tool available, there's always TCL/TK. As an example, TKDiff.
Or a Gemini/Gopher client:
https://codeberg.org/luxferre/BFG
It just requires TCL/TK and tcl-tls. It runs faster than compiled browsers, even if the browsers have Javascript disabled. Services:
gemini://gemi.dev News Waffle, perfect for newspapers, no ads, no 700 cookies tracking you.
gopher://magical.fish Big portal, good news service
gopher://hngopher.com HN
gopher://gopherddit.com Reddit
Motif and Xaw will bury us all!
I'm learning Motif just for fun with "Vol6a" and it's fairly easy compared to Xaw. When Motif was proprietary, GNU users did astounding work with LessTif, as programming in Xaw was far more low-level and prone to bugs because you had to keep track of more state yourself.
Also, Xaw3D was a good effort at mimicking Motif widgets, but it had the exact same API as Xaw. And, finally, in the late '90s TCL/TK got a big boost in GNU distros because it was a quick way to put a UI over some C software.
If you like Motif, you might like EMWM too. With the suggested addons and extras, it's like a mini 4Dwm from IRIX, minus the launching dock; although you don't need it, the toolbar is more than enough.
A pity Mosaic is proprietary; adding auth support for NNTP is something every C novice could do in under 2 minutes with a printf (and I did). But there's some nice TCL/TK software which just requires tcl-tls and a plain TCL/TK install:
https://codeberg.org/luxferre/BFG
EMWM looks fun. I've got a NetBSD laptop running CDE for nostalgia and it is surprisingly a joy to use. I don't think CDE has undergone the same protocol modernization and doing so would be a lot more work there.
CDE got Xft thanks to Motif support, but I'm not sure about UTF-8.
I have to try the UTF-8 test file with Nedit-XFT. EMWM should work perfectly then.
From the article:
"Ubuntu developer Simon Quigley laid out the plans for hoping Ubuntu packages will move from Qt 5 to Qt 6 so that by the time of the Ubuntu 26.04 LTS cycle in early 2026 that the older version of this graphical toolkit can be removed."
Plans for hoping...
That is no plan at all, it is wait and see.
I think the term of art is currently concepts of a plan.
Wait and see is a plan!
Not one that is likely to result in an Ubuntu without Qt 5. But a plan all the same.
On the one hand, yes.
On the other, why does Ubuntu always make their craziest changes in the LTS? The LTS should be boring, and the one after it should be the wild one. (That may not be what the plan is here, but it does sound like it.)
I'm guessing it's because everything that ships in the LTS must be maintained for 12 years. I understand Canonical's desire to not still be maintaining Qt 5 in 2038.
Though the ideal time to make this change would be in 25.10.
I think GP is intending to have it work as follows:
If Canonical would like to remove Qt 5 from LTS vN, they need to remove it in the first non-LTS release after LTS v(N-1). That way there is a lot of time to adapt before it, which increases the chances of it actually happening in vN.
Perhaps they need to reconsider how an LTS works?
The value of LTS, and the reason companies spend tonnes of money on support contracts for those 12 years, is exactly this though: the set of libraries shipped in 26.04 will still be supported 12 years later without major backward incompatible changes. The very reason people would pay to keep using 26.04 in the mid 2030s and beyond would be that the software they built or bought in 2026 keeps working, typically without even having to recompile.
You may argue that those customers should just update their software to use new versions of libraries or frameworks or replace discontinued libraries with new ones, or pester their vendor to do those things, or pay a bunch of money to consultancy companies to do those things; and you may well be right, but as long as those companies won't do that and are willing to instead pay Canonical, why wouldn't Canonical accept the money?
Vendors don't always want to do this. I've seen 6-figure instruments stop working after a major update, and when we go to the vendor for an updated version of the software, their response is "we don't support that hardware anymore; buy a new instrument."
And this is why RHEL4, NT4, OS/2, and other long-buried OSes are still in use in industrial applications.
In enterprise settings 12 years is not quite enough. There are racks of boxen set up with custom software targeting a particular distro version and there is no money or value in making changes to make it compatible with something newer.
Hardware gets upgraded on 5-7 year cycles, but the software… isn’t happening.
As the saying goes, "Old hardware goes to the scrap heap, old software goes into production tonight."
> Hardware gets upgraded on 5-7 year cycles, but the software… isn’t happening.
Software gets rewritten every 1-3 years throwing away all the past experience. (Qt, GTK, and so on and so forth). /s
I think this is reasonable. Plasma 6 recently appeared in Debian Experimental, Qt 6 is already in the Stable branch, and a new Debian release seems likely in the coming year. I expect the folks managing Ubuntu just want to make sure their non-Debian stuff is ready for the upstream change.
Forcing modernisation of some (probably especially GUI) applications is overall a positive for the platform, as it also identifies projects which are under-maintained and allows for the overall modernisation / forking / maintenance that is needed.
I wouldn't expect to (easily) run php5 code on Ubuntu 26.04, either. Applications that refuse to modernise can build qt5 themselves.
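For reference, a rough sketch of what "build qt5 themselves" looks like. The version, prefix, and app name here are purely illustrative, not a supported recipe:

```shell
# Hypothetical example: vendoring a private Qt 5 once the distro drops it.
# (Obtain the qt-everywhere-src-5.15.x tarball from qt.io first.)
tar xf qt-everywhere-src-5.15.2.tar.xz
cd qt-everywhere-src-5.15.2

# -opensource/-confirm-license accept the (L)GPL terms non-interactively;
# skipping examples and tests cuts the (still very long) build time.
./configure -opensource -confirm-license \
    -prefix /opt/qt5 -nomake examples -nomake tests
make -j"$(nproc)" && sudo make install

# Point the legacy app at the private Qt at runtime instead of the system one:
LD_LIBRARY_PATH=/opt/qt5/lib ./legacy-qt5-app
```

The cost, of course, is that you're now the one backporting security fixes.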
> and allow for overall modernisation / forking / maintenance that is needed.
Or, it allows for pointless churn, extra work, and breakage of apps that were still functional until someone yanked their dependencies. This kind of thing is why software rot is even a thing and people remarking that Win32 Is The Only Stable ABI on Linux ( https://blog.hiler.eu/win32-the-only-stable-abi/ ).
Ubuntu does not claim they have a stable ABI between major version releases.
Sure? Acknowledging a shortcoming of an OS doesn't remove it.
Alternatively, it is the best way for the year of the Linux desktop to never come: major applications keep getting totally reworked with the latest shiny stuff instead of just being polished, with all the small bugs fixed and the features improved.
Since you can identify the projects without mandatory deprecation, what does deprecation give you?
Also, resources don't magically appear just because a resource constraint has been identified; that fact is a given and obvious to everyone involved.
Imagine if CLIs routinely died off and had to be rotated out.
Sorry you are running stdout from 2022. You need 2025 stdout to print text.
I guess you can always bundle Qt with your app but the X11 / Wayland split might have complicated that? I just hate it. Viscerally I hate it.
That's why I stick to using sh. Wait no, bash... but make sure it's got all of the features from the 5.2 release... unless it's when I'm using zsh in which case we'll need to change some syntax around. For interactive use the program has an ncurses 6 interface, if you want to manage it that way, so make sure you aren't still back on ncurses 5 and have an xterm-256color terminal!
Jokes aside, stdout is more like the X11/Wayland comment than anything... and even then it's been what, >30 years with one backwards compatible swap there?
Where's the joke? I do in fact target somewhat oldish POSIX sh when writing shell scripts; I want to write a script once and not have to worry about where it will run, and sh is basically universal.
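For what it's worth, a small sketch of what that looks like in practice; `greet` is a made-up example function, but every construct in it is plain POSIX sh, no bashisms:

```shell
#!/bin/sh
# Portable: no arrays, no [[ ]], no bash-only parameter expansions.
greet() {
    # POSIX default-value expansion instead of bash-specific tricks
    name=${1:-world}
    printf 'hello, %s\n' "$name"
}

# case patterns instead of bash-only [[ == pattern ]]
case $(uname -s) in
    Linux|Darwin) greet "$(id -un)" ;;
    *)            greet ;;
esac
```

This runs unchanged under dash, busybox sh, bash, ksh, and friends, which is the whole appeal.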
The joke was the over-the-top-ness of the example, on how CLI solutions have lacked forward compatibility as often as GUI solutions. The latter parts, about how both CLI and GUI have decent examples of long-term backwards compatibility, were serious.
I don't think it works that way either, though. AIUI the problem is that QT5 apps don't automatically work on QT6 (otherwise this would be a non-problem; Ubuntu would just ship QT6 alone and everything would work). But none of the shell examples given have that problem; AFAIK virtually every script that worked on BASH 4 works on BASH 5 - as does virtually every script that worked on any other older version of BASH, and every script that targets plain sh. If you wanted to make that argument, the ncurses suggestion by https://news.ycombinator.com/item?id=42020272 is probably more compelling than the shell itself, which is amazingly backwards compatible.
Sure, I'll grant that the particular example of bash compared to sh is probably a poor one to focus on, in that it doesn't demonstrate every point, just as SDL 1.2 to 2.0 wouldn't due to sdl12-compat (leaving some room for interpreted vs compiled differences not to count; otherwise we'd have to restrict the conversation to scripted GUIs instead of QT, or compiled shell commands instead of scripts). The other example, ncurses 6 vs 5, is better, as it's a compiled interface where the ABI broke, but that was already in the comment. Or take the other example, zsh vs bash, where there are often changes that take some porting work, but not much.
More than each of the specific examples, the main point was meant to be that things like stdout are comparable to display server protocols, which behave about the same in terms of compatibility. Specific apps end up doing their own thing with compatibility regardless of which interface they output on. One of the examples I chose was poor for making that point and seems to have distracted from it, mea culpa. Another commenter noted Python might have been a better example of scripting compatibility over time.
Doesn't work.
A 30 year old sh script still works.
A 1 year old qt5 app or python script does not.
You're right there, I should have used Python on the CLI side as the example. Bash was a poor choice, as it doesn't demonstrate all of the points in one example. GUIs also vary in their out-of-the-box compatibility; QT5 to QT6 is similar to Python in that it doesn't just work without some effort.
Qt5 to Qt6 is nothing like Python2 to Python3.
The lessons from (Qt3 to Qt4 and) Qt4 to Qt5 have been learned, and moving a large project from Qt5 to Qt6 is not that hard comparatively. There are a few minor deprecated APIs to handle and it's relatively easy overall.
I even have a stable project that is compatible with Qt5 and Qt6 [1] all in a single code base (particularly thanks to the effort of the qtpy[2] library). It's not that hard, and my opinion includes C++ in that assessment.
[1] https://github.com/git-cola/git-cola/
[2] https://github.com/spyder-ide/qtpy
> The lessons from (Qt3 to Qt4 and) Qt4 to Qt5 have been learned
Have the lessons been learned? They say this all the time when they release a "new" version.
Imagine if stdout still required a physical UNIX teletype from the 1970s to actually work.
Did it ever, or was that the tty abstraction rather than stdout?
Of course, C only came to be after six editions of UNIX.
Fourth edition UNIX was written in C in 1973:
https://gunkies.org/wiki/UNIX_Fourth_Edition
Unless we have different definitions of "edition".
Counterpoint: libncurses5 vs libncurses6.
You'll be hard pressed to find many CLI apps that still run properly on non-VT100 terminals, but back in the 80s and 90s software usually ran fine on a tvi912, wyse30, zenith22 AND vt100s as long as the termtype was set right.
So in a sense, it do be like that, somewhat.
Part of the benefit here though is that everyone has LOTS of options. Go run Debian or an LTS of something else. App packagers can AppImage or Flatpak. People can install Qt 5 from a PPA, I'm sure.
The key thing here is that it lets packagers keep focusing on more impactful things than babysitting projects that can't/won't upgrade, especially when ending that babysitting frees up a lot of brainpower AND the work is reasonably easily picked up by the people who still need it.
I hate dynamic linking, and this is one of the reasons. APIs should be small, self-contained, and never break backwards compatibility (except for security reasons). Libraries provide much more extensive features and, as such, need to change a lot more often.
APIs are: Linux system calls, the std{in,out,err} convention, termcap/terminfo, X/Wayland, D-Bus, PulseAudio/PipeWire etc.
These change very rarely, and when they do they generally offer compatibility layers (XWayland, pipewire-pulse).
But dynamic linking makes every single library an API that can break!
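As a quick illustration of that surface, on a typical Linux box you can list exactly which shared objects and versioned symbols a binary imports; each entry is a point where a library upgrade can break it:

```shell
# Every line of this output is part of the implicit contract the binary
# has with the system's dynamic libraries.
ldd /bin/ls                                  # shared objects the loader maps in

# Versioned symbols imported from glibc (objdump is in binutils; skip if absent)
command -v objdump >/dev/null &&
    objdump -T /bin/ls | grep GLIBC | head || true
```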
Qt isn't dying off. They're removing old versions of it from the distribution repositories. (Or well, trying to, anyways.)
Yes, it's neat that you can still compile and run ancient Win32 programs on modern Windows, but maintaining all of that compatibility is a burden Microsoft was able to bear. The open source community is still struggling to try to provide a compelling desktop experience without this burden.
Can you run ancient old Linux graphical software? Well, to an extent... Yes. Is it easy? ...No. At least, it's certainly not as easy as installing stuff from your OS vendor directly. As far as Wayland goes, there are no plans to get rid of XWayland any time soon. I'd wager it will remain in the toolkit for decades to come, and if anything, XWayland support keeps getting better rather than worse; just look at what KDE has been doing.
Usually, when I want to try to run something really old I'll use the Docker debian/eol[1] images. Getting X11 apps to work inside of Docker requires a bit of fidgeting, but once you have it working, it's relatively straight-forward to install or build and use software inside of.
The Linux desktop is necessarily going to keep going through large breaking shifts because well, it sucks. It needs to go through breakages. When the new thing also seems to suck, that hints that it might need to continue to break more. It's going to take a while to iron things out, especially because designing systems is hard and sometimes new ideas don't wind up panning out very well. (This has caused a lot of trouble with Wayland where a lot of ideas, indeed, did not pan out super well in practice in my opinion.)
That said, I think we can do much work to improve interoperability with older software, and ensure that it can continue to run in the future. But, it doesn't have to, and really, shouldn't work by keeping everything else from moving.
[1]: https://hub.docker.com/r/debian/eol
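For anyone curious, the fidgeting is mostly about handing the container your X socket. A sketch, with the distro tag and package purely illustrative:

```shell
# Hypothetical example: running an old X11 app from a Debian EOL image.
# Assumes a local X server; xhost weakens access control, so undo it after.
xhost +local:

docker run -it --rm \
    -e DISPLAY="$DISPLAY" \
    -v /tmp/.X11-unix:/tmp/.X11-unix \
    debian/eol:sarge bash

# Inside the container (the debian/eol images ship with archived apt sources):
#   apt-get update && apt-get install -y xterm
#   xterm
```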
I find it amusing that there's this constant assertion on this forum that the Linux desktop sucks.
Yet many of us have been quite happy with it for decades, and the major problems tend to be with new stuff that seems to be added just to do something different.
The Linux desktop is pretty great, for the most part.
I would say that in addition to a usable desktop, Linux distributions generally also provide a more stable Windows environment than Windows itself does. I have quite a suite of games in Steam and GoG, and I find that many older games that simply cannot be coaxed into running on Windows can be made to run happily on one version of Wine/Proton or another.
There are still some weak areas in desktop Linux if you're running a desktop that isn't Gnome or Plasma - wifi configuration comes to mind immediately, as does sound configuration. Bluetooth can be picky if you don't know to get all the non-free firmware.
But really, we're living in a dream era of Linux desktops, if you happen to have lived through the early days of `startx` and XFree86. Hands up if you ever smoked a monitor coil with a bad X config file...
Coming from Windows and Mac, the multiple monitors experience on Linux is terrible, especially when the monitors have different refresh rates or different resolutions. It's getting a bit better with Ubuntu 24.10 but still not close to the smooth experience I have on Windows and Mac.
Recently I needed to switch to Wayland due to my monitor setup. Then I also needed to write a small program that captures an application's screen, which is a stupidly simple thing to do on Windows, Mac, or even on Xorg. On Wayland I had to jump through a bunch of stupid hoops just to get it kinda working...
Having to deal with windows on Windows regularly being placed on a monitor which didn't exist and windows on MacOS restoring to weird places... None of them are great. You get to choose your preferred set of problems.
That's not a contradiction; Linux distros have problems, they just suck differently and probably less than Darwin or NT.
I use Ubuntu 24.04 LTS for work. I used the previous version until a few days ago.
I had no resistance from my corporate overlords over this. I get my work done and I'm more effective as a developer.
The corporate MS apps all work in a browser. I don't use any of the office apps for anything serious.
The Linux desktop is here already and is accepted. Also KDE is quite nice and snappy most of the time.
I have been using Linux as my primary operating system for two decades, and I think the Linux desktop sucks. I strongly disagree that my problems with it are largely the result of "new stuff that seems to be added just to do something different".
When I first started using Linux, there was no kernel modesetting and XFree86 ran as root. I had to manually configure my graphics and display settings by editing Xorg.conf in nano, or sometimes I could get YaST to do it. Multi-head output was not well-supported, and you often had to restart to reconfigure your outputs. Installing the NVIDIA driver was a straight-up pain in the ass. (Some things never change.) X.Org eventually improved and added hotplugging. The Linux desktop stack went through a lot of "new stuff": new versions of DRM/DRI, the addition of hotplugging, the introduction of KMS and libinput. Dynamic input configuration was added with the `xinput` command. The XInput2 extension was developed to greatly improve touch and pen tablet support. XRandR made it possible to dynamically arrange displays, replacing Xinerama. UI toolkits needed to be rewritten to support scalable rendering so that we could have high-DPI support. Most recently, a lot of work and many new protocols across multiple parts of the desktop stack, from kernel to userland, were needed to support explicit synchronization for graphics tasks and buffers. All of this required major shifts, and that broke users' workflows at some points. Plenty of people were broken by the gradual transition to libinput, from graphics tablets to Synaptics touchpads; a lot of work had to be redone in order to overhaul the input system. Only recently are there Linux laptops where the out-of-the-box touchpad experience is good enough to rival a MacBook, and it's certainly not all of them.
Don't get me wrong, I loved KDE 3. And despite all of its flaws, I still prefer Linux greatly over the other options available. But would I go back to using Linux 2.6 and KDE 3 and etc. today? Fuck no, we've improved so many things since then. And yes, no question that at times things have felt like they've been moving backwards in some regards, but it really isn't for nothing. The fact that modern desktop Linux trivially hotplugs displays with different DPIs and refresh rates (and potentially variable refresh rates) is genuinely something that took a herculean effort. All of that work required massive rewrites. It required kernel changes, graphics driver changes, it required major UI toolkit changes, it required redesigning apps and rewriting components.
And yes, I admit I am not the biggest contributor to open source or anything, my patches are mostly unimportant. But, when I hit something that sucks in the Linux desktop, I do try my best to see if I can't do something about it. I have contributed random little bits here and there to various projects I care about. Most recently, I've been doing work to improve some situations where thumbnails in KDE are suboptimal. There's easily thousands of these little small issues that make Linux worse to use on desktop, we've got plenty of work to do.
And I make it work. Except for work-owned computers, all of my machines run Linux, and I have a fair number of them. I do not have a single box I own personally that boots any other kernel or any other desktop. I know my way around Wine, even enough to make occasional, if small, code contributions. (For example, I made a quick patch when graphics tablets were not working in Wine under XWayland.) So I'm not suggesting that it's unusable, but I can't recommend the Linux desktop to random people. It's been getting closer, but a lot of the reason it's getting closer is because of some of the efforts that I assume you'd complain about.
Decades? Really? The UX was so bad in 1998 I just went back to windows for another decade.
Windows was so bad in 1999 that I switched to Linux and have been happy with it since.
25 years ago I was quite happy with enlightenment but moved to blackbox/fluxbox. Went to xfce around 2010ish though.
The only issues I can think of that have ever affected me were PulseAudio-related.
> Yes, it's neat that you can still compile and run ancient Win32 programs on modern Windows, but maintaining all of that compatibility is a burden Microsoft was able to bear. The open source community is still struggling to try to provide a compelling desktop experience without this burden.
I always find it interesting how people talk about this as if Microsoft is just flipping a button rather than spending many millions of dollars on engineers to maintain compatibility. I’m reminded of the systemd arguments where the number of people who are vehement about the old ways being better just never seem to have time to show up to support them. This kind of stuff is expensive and the way to influence the decisions is to show up and chip in.
> just never seem to have time to show up to support them.
They're supported just fine. The authors feel no need to bend over backwards to fulfill oddball distribution requirements.
> This kind of stuff is expensive
Yet it all came into existence from open source developers who were paid nothing. Then a bunch of commercial distributions appeared. That's why it's expensive. They're twisting the community for their own profits rather than for broad improvement of the system as a whole.
> to influence the decisions is to show up and chip in.
I don't want to influence things or be required to. I would prefer if people who had "influence" just weren't part of the community. They're exceptionally disruptive and often not focused on users but on their own stature.
> The authors feel no need to bend over backwards to fulfill oddball distribution requirements.
This is close to the truth in my opinion, but slightly off in two ways:
- It's not just distributions, it's also other software, especially desktop environments.
- The "oddball requirements" are not so odd and mainly based on real world problems users run into.
There are problems that SysVinit will never try to solve, yet there is no other obvious place to solve them except pid 1, so systems that continue to run SysVinit will just never have solutions for those problems. And I get it: You don't care. However, the people who actually work on free software often do. You don't have to love systemd, D-Bus, Polkit, UPower, UDisks, Pipewire, Wayland or any other number of pieces of software, protocols, or specifications, but you have to be in pretty strong denial to not see what led to them. The entire Linux desktop can't be built on janky shell scripts that parse CLI tool output forever. Eventually, people want to move on and build more complete solutions to problems.
Distributions exert pressure on the ecosystem because they're the ingress for all of the user pain. A vast amount of issues reported go through them first, and then it's up to them to figure out what to do about it. So if you have a problem with anyone, it's probably not actually the distributions themselves, but the kinds of things the users of those distributions are complaining about.
> Yet it all came into existence from open source developers who were paid nothing. Then a bunch of commercial distributions appeared. That's why it's expensive. They're twisting the community for their own profits rather than for broad improvement of the system as a whole.
This is bizarrely counter-factual. There are community contributors but an awful lot of open source is contributed by people who are paid to work on it (which, to be clear, is awesome!), even in the case of Debian.
It’s also odd in the context of Qt which has been commercially supported since the beginning when Trolltech first released it in 1995. There’s always been the idea that if you want long-term support for old versions you should pay for it rather than expecting the open source community to spend time on old code.
> I don't want to influence things or be required to. I would prefer if people who had "influence" just weren't part of the community. They're exceptionally disruptive and often not focused on users but on their own stature.
You’re welcome to disrespect them but they have influence because they show up: in open source your voice carries weight in proportion to your contributions. It’s also inaccurate to say they’re not focused on users: you might not think you need a given feature but you’re not the only user out there. Alternatives are there but most people aren’t using them because they don’t feel the need to switch as strongly as you might.
> in open source your voice carries weight in proportion to your contributions.
And your contributions grow in proportion to your income, and a lot of people get their income from companies who have other than the best interests of Linux users at heart, and rather want to steer it in a direction that is most profitable for themselves.
That's certainly fine for open source, but that's not fine for these distros as a whole, at least the ones that portray themselves as a service to the public.
> It’s also inaccurate to say they’re not focused on users: you might not think you need a given feature but you’re not the only user out there.
If your voice carries weight in proportion to your contributions, users are not part of that equation. Users enter the equation if they're the users that are desirable for the people who are paying contributors.
> Alternatives are there but most people aren’t using them because they don’t feel the need to switch as strongly as you might.
And because people who insist that Linux move in a particular direction immediately start breaking things that used to work, and making their new thing a dependency. Then the alternatives have to spend all their time shimming that stuff, and the best they can hope for is to almost keep up. The prize for becoming an alternative is having to become an expert on the new thing that you disapproved of, or fall behind. You never get time to work on more parsimonious solutions to the problems the new thing claimed to solve.
> rather than expecting the open source community to spend time on old code.
I don't expect them to "spend time" on it; how about just not deprecating it prematurely? Leave users a choice. Then again, this is why I wouldn't use most of the distributions: they left user choice on the floor a decade ago.
> in open source your voice carries weight in proportion to your contributions.
Their voice should not carry any weight. It's not at all clear that people who contribute code are good at predicting or managing future organizational outcomes correctly. It also creates a model where those who develop out of passion rather than for a paycheck are significantly disadvantaged for no meaningful reason.
Every engineering society that started with this mechanism has replaced it with something more rationalized. It's the only way to create sufficiently advanced structure and to ensure that new ideas are quickly adopted and made available.
> It’s also inaccurate to say they’re not focused on users
The mere existence of this thread ruins this thesis.
> but most people aren’t using them
Define "use." As in made a conscious effort to get a specific piece of software and use it? Or just ended up with the default piece of software their distribution choose for them? Are you sure you're measuring the correct variables?
In any case, I just don't appreciate this bullying tone, where users have to "shut up and write code" or "just put up with whatever we hand you" whenever the _slightest_ bit of feedback or requests are made on behalf of legacy users and systems. Which might be fine, if the people busily writing code all day weren't intentionally making it more difficult to do just that.
It happens more in other communities but it is absolutely noticeable. It's not enough to simply create a new good project, there's an apparent need to destroy any prior older projects and actively prevent people from using them.
Which is why your entire "voice = influence" model is a problem.
Well, that's because deep enterprise users have closed-source software that cannot be updated (pick any reason, it doesn't matter which one) and they need that compatibility. MS needs to keep them vendor-locked to keep making money off them.
That's not something that happens in Open Source Software world.
Yes. My point was just that it’s not free, but it’s weird that some people expect the same level of support from open source projects which don’t receive anything like that much money.
> The open source community is still struggling to try to provide a compelling desktop experience without this burden.
What exactly is a "compelling desktop experience?" It seems to me that when people say this they really mean "win the popularity contest against Microsoft." I'm not sure that's at all a worthwhile or laudable goal.
> Is it easy? ...No.
It's incredibly easy. It just depends on what distribution you use. For example, on Gentoo, this is not at all a problem.
> When the new thing also seems to suck,
This is a hint that you got something wrong. Remember when desktops used to be configurable? I guess letting the user control their own environment stopped being "compelling" when Microsoft decided so. Do we have to follow suit?
> What exactly is a "compelling desktop experience?" It seems to me that when people say this they really mean "win the popularity contest against Microsoft." I'm not sure that's at all a worthwhile or laudable goal.
I don't care about Microsoft or Apple. I don't like Windows and I don't like macOS. I care about my desktop. The one that I am typing in. I really, really hope I don't actually have to explain all of the problems with the Linux desktop experience right now, I don't have the energy. I've helped a lot of people try to make Linux work for them and it's a soul-crushing experience every time because I often have to explain that there are, in fact, no great options for them right now because shit is simply broken. The X11 and Wayland situation is a perfect example: "Oh, no problem. You can just choose between your windows scaling properly, your high refresh rate monitor working, and being able to actually use screensharing during meetings." Sure it's getting better, but we've still got a long way to go, there's no sugarcoating it.
> It's incredibly easy. It just depends on what distribution you use. For example, on Gentoo, this is not at all a problem.
Look, if the person has to be proficient enough to be able to install and operate Gentoo as their primary desktop operating system, I'm pretty sure they do not need my help when it comes to doing virtually anything with Linux. It's also pretty easy to run old versions of software on NixOS but I'd argue this is basically cheating.
What's hard to do is take any of the most popular distros and run something from 1999. (I'm pretty sure that's actually generally hard to do on Gentoo as well, but whatever.) That's what I'm really talking about.
> This is a hint that you got something wrong. Remember when desktops used to be configurable? I guess letting the user control their own environment stopped being "compelling" when Microsoft decided so. Do we have to follow suit?
What exactly about modern KDE isn't configurable enough? I use both KDE and SwayWM as daily driver desktop setups and never felt hindered much by the available configuration options. If anything, KDE offers so much customization that it's probably sometimes detrimental.
Configuration options come at a cost, and that's exactly why modern software has less of it. We want higher quality software, but it's hard to make software more robust when you have trillions of code paths and most of them are almost completely unused. This stuff adds up. In a similar vein, I hate to see Linux lose support for old hardware, but if something's been broken in mainline for five years straight, I'm pretty sure they have every reason to assume it's not being used by anyone.
What I'm really talking about with "the new thing also sucks" is more reflecting on the growing pains of things like Wayland and PulseAudio (and a bit with Pipewire though less so.) I am of course not saying that the same thing doesn't happen with desktop environments, personally I don't like GNOME 3, but also in the same vein, it'd be hard to argue GNOME 3 has been getting worse this whole time, it offers much more customization and is a lot more robust than it was when it first launched.
> I don't care about Microsoft or Apple.
Yet you seem to:
> I've helped a lot of people try to make Linux work for them
Okay, do you care about _your_ desktop, or having as many people use Linux as a desktop as possible? Have you considered that these two goals may actually be at odds with each other?
> We want higher quality software
We're getting further and further away from your desktop.
> Configuration options come at a cost, and that's exactly why modern software has less of it.
Wait.. so software got less expensive to develop.. and yet we're concerned about configuration options.. so they're being _reduced_ from what they _historically_ were?
> but it's hard to make software more robust
This is just an exercise in goal post moving at this point.
> Okay, do you care about _your_ desktop, or having as many people use Linux as a desktop as possible? Have you considered that these two facts may actually be at odds with each other?
How does trying to help people use Linux have anything to do with Windows or macOS? I don't tell people to use Linux, I tell people not to use Linux, because it sucks. When people do it anyways, they frequently come to me for help because I have more Linux desktop experience than is humanly justifiable.
> We're getting further and further away from your desktop.
My desktop has software on it.
> Wait.. so software got less expensive to develop.. and yet we're concerned about configuration options.. so they're being _reduced_ from what they _historically_ were?
In what way did software get less expensive to develop?
> This is just an exercise in goal post moving at this point.
The goal posts are in your head. I'm not playing some kind of game. You are.
Conversely, Windows.
Both in the sense that Win32 GUIs (introduced in Windows 95) are a bit out of style but are still supported and widely used; and Powershell being a major departure from traditional CLIs.
I guess that's what I get for being so terse - I actually meant it as a counterexample. Windows is a good illustration of what happens when you avoid deprecating older systems: you end up with multiple design philosophies and interfaces layered on top of each other, leading to a less cohesive user experience overall.
Canonical, the company behind Ubuntu, could easily port the remaining Qt 5 apps to Qt 6 by the deadline. But they would rather wait for the community to do it for them.
I think you overestimate Canonical's size.
What is happening though is really long support for older releases being provided by Canonical so you will be able to use older Qt 5 based apps on Ubuntu, security supported, until 2034, on Canonical's dime. With container technology (something that Canonical significantly funded, with Docker originally being based on Canonical-funded LXC) you'll be able to run those older apps on a newer base OS including future ones, too.
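The container approach mentioned above could look something like this rough sketch: a legacy Qt 5 GUI app packaged in an older Ubuntu userland, run on a newer host. Everything here is illustrative: `myqt5app` is a hypothetical package name, and the X11 socket bind-mount is one of several ways to get a GUI out of a container.

```dockerfile
# Sketch: keep a legacy Qt 5 app on an Ubuntu 22.04 userland (Qt 5 still in
# security support there) while the host OS moves on.
FROM ubuntu:22.04

# "myqt5app" is a placeholder for whatever Qt 5 application you need to keep.
RUN apt-get update && \
    apt-get install -y --no-install-recommends libqt5widgets5 myqt5app && \
    rm -rf /var/lib/apt/lists/*

CMD ["myqt5app"]
```

On an X11 host you would run it with the display socket mounted in, e.g. `docker run -v /tmp/.X11-unix:/tmp/.X11-unix -e DISPLAY=$DISPLAY myqt5app-image`; Wayland hosts would need XWayland or a mounted Wayland socket instead.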
Those are huge contributions to the user community you seem to be discounting.
Disclosure: I work for Canonical. I don't speak for Canonical. But what I say above are simply verifiable and falsifiable facts, not really a matter of opinion.
Seems like the root misstep is when extending server support to 10 years they extended GUI app support as well. The two probably shouldn't be linked and GUI support should end with the desktop edition at five years.
What about the other distros? Are they waiting on the community to do it for them?
I suspect that means waiting for Debian to do it.
Porting them out of tree would be worse than doing nothing.
Clickbaity title. It makes it sound like Qt is being removed entirely.
Better might be:
Ubuntu Hoping to Remove Qt 5 in favor of Qt 6 Before Ubuntu 26.04 LTS
or maybe:
Ubuntu will try moving from Qt 5 to Qt 6 by Ubuntu 26.04 LTS
I am hoping KeepassXC will move to Qt6 soon.
> I am hoping KeepassXC will move to Qt6 soon.
Let's also hope this happens before Qt 7 comes out.
Kind of sounds like Canonical and Qt are in cahoots, in a similar way that Qt and openembedded are.
I sure don’t like Canonical.
I sure as hell would prefer to not have to maintain Qt 5 for 13 more years if I could avoid it. I don't think this has anything to do with "being in cahoots with the Qt company", I think it has everything to do with "backporting fixes from a recent Qt 6 version to an older Qt 6 version is easier than backporting fixes from Qt 6 to Qt 5".
I’m all for qt6. I doubt I need to explain to you the licensing change.
[dead]
I have always avoided Qt for any projects because of the incomprehensible licensing model that could very easily create a legal minefield for your product.
No. Thanks.
It's just LGPL if you make any standard desktop app. Like you can make pretty much the entirety of KDE with the LGPL parts of Qt.