Speaking as a user, not a dev, I don't really understand what you are referring to. There are "multiple" solutions, namely 2 (two). Both of them work on all distros, so I could not care less which one you pick.
Same for desktops, again there are two: gtk-whatever and qt-whatever. All software works on both of them, so again I don't care.
Same for installing software. Installing Wolfram Mathematica is literally downloading one file, clicking on it and pressing Enter a few times.
If anything, iOS and its App Store showed that the repo model is great, and there is a reason why Microsoft developed/copied "winget".
As a user, you're missing details and completely misunderstanding several abstractions.
For instance, gtk and qt are not desktops. They're ui toolkits. Bunches of widgets that can be composed to cobble together a UI.
That's completely separate from the topic under discussion, which is mode of application delivery. FlatPak involves using namespacing and containers to "spin up" a virtual system with only visibility into those slices of the overall host needed to run the program.
This is good because it at least keeps unfamiliar software constrained somewhere predictable. It's poor because the app often won't use the host OS's libraries, which are generally kept the most up to date, and, worst of all, it enforces a complexity and debuggability tax: if something goes wrong, one has to have an intimate understanding of what is going on "in the box".
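To be fair, the box is inspectable if you know where to look. A minimal sketch of what I mean, assuming Flatpak is installed and using org.example.App as a placeholder app ID:

```shell
# See which slices of the host the sandbox is allowed to touch
# (filesystem paths, devices, sockets, D-Bus names):
flatpak info --show-permissions org.example.App

# Open a shell *inside* the sandbox to inspect the virtual
# filesystem the app actually sees, instead of the real host:
flatpak run --command=sh org.example.App

# Widen or narrow the sandbox for this one app, per user:
flatpak override --user --filesystem=home org.example.App
```

None of this is obvious to an average user, which is exactly the tax I mean.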
Another way mentioned is the distro model, whereby distributions maintain repository ecosystems, make decisions with regard to FHS compliance, system tooling, the update pipeline, etc., and use those to support their user base.
There is the original "compile and stage it yourself" crowd, who basically concentrate on reproducible build capabilities, but tend to lack automatic dependency resolution.
There's the Mac way, which isn't terrible. It tracks and designates places for libraries, Frameworks, and Applications, and has a lot of automated, well-integrated UI tooling that makes the user experience of software install easy, but it's just another paradigm you have to track when trying to write portable software.
There's static linking, which delivers executables that are entirely self-contained, but they tend to have a bigger memory footprint, each has an upgrade path separate from every other executable, and they benefit not at all from dynamically linked libraries already on the host system.
Then finally, there are dynamically linked executables. They're small, and commonly reused code is loaded into memory only once. You have the additional complexity of the linker to be aware of, but you can update the entire system's audience of a particular library at the same time... which can be a double-edged sword, given how the maintainer scripts the install or sets up their system/build environment.
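The contrast is easy to see for yourself on any Linux box (using /bin/ls as a stand-in for any binary):

```shell
# A dynamically linked binary carries only references to its libraries;
# ldd asks the runtime linker which shared objects it would load and
# where each one resolves from on this particular system:
ldd /bin/ls

# A statically linked binary prints "not a dynamic executable" instead,
# because every library it needs was baked in at build time.

# The flip side of sharing: every dynamically linked program on the box
# picks up a libc update (or a libc breakage) simultaneously.
```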
Personally, I just sidestep the issue by digging into and learning about the software I use on a regular basis with disassembly tools, and treating all of my systems the way a good farmer treats their livestock: distant reverence, but with a careful eye as to whether something is wrong, and a hard-fought-for willingness to kill something and start from scratch. It's the only way I've found to be truly safe and resilient in an environment where everyone optimizes for their own particular definition of convenience.
(My definition of convenience is a minimum difficulty in troubleshooting what might be wrong, so I favor fewer abstractions as that entails less Tower of Babel to wade through).
Full disclosure: my approach generally would classify me as a bit of a curmudgeon in the industry as I still approach computers as being analogs of physical machines. There is a healthy corpus that revels in abstraction, but I'm not really one of them. I like to have a ballpark understanding of what the artfully arranged beach sand is doing. Adding more abstractions or having the computer do things itself is generally not something I strive toward as it almost always comes with an unacceptable increase in the overall complexity inherent to navigating the Gulf between how I and everyone else thinks the system works and how it actually does.
More than anything else, in my experience, keeping that Gulf small leads to better overall satisfaction from a lay User. YMMV.
> For instance, gtk and qt are not desktops. They're ui toolkits.
And I never claimed that. Again, as a user the difference between gnome, cinnamon or unity is not all that interesting, so I simply wrote "gtk-whatever" as an umbrella term and "qt-whatever" for kde plasma and lxqt.
My basic question is the following:
- If I find software that is only available as a snap, I will use snap. It works on any distro.
- If I find it only available as a flatpak, I will use that. It works on any distro.
- If I find only an app image, I will use that. It works on any distro.
- I get an installer from some big commercial software. It unpacks itself in /opt, or uses a docker image, or... Again, it works anywhere.
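For concreteness, the three packaged paths above boil down to roughly the following (app names are placeholders):

```shell
# Snap: one command, pulled from the Snap store
sudo snap install some-app

# Flatpak: one command, pulled from the Flathub remote
flatpak install flathub org.example.SomeApp

# AppImage: mark the downloaded file executable and run it
chmod +x Some-App.AppImage
./Some-App.AppImage
```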
Why should I, as a user, care which distribution mechanism you chose if they all work anyway? Why is "fragmentation" an issue, if whichever you choose works everywhere regardless?
As a user, you will eventually be limited by the decisions that developers/maintainers make.
Want to run your program, but I only support hardware with full client virtualization, which your system doesn't have? Oh well.
Oh, you wanted to run that Snap, but the cache is invalidated or cleaned, and no internet connection? Oh well. Sucks to be you.
Oh crap, you need to do that one final fix real quick in that one program, but oops! I just force pushed out a functionality breaking security update! Sorry bout it!
It may not seem so important from the casual user perspective, because you may not sample different configurations or distribution channels the way devs/maintainers/admins do, but you'll hit it one day, and you'll be as pissy as we are. Why doesn't this shit just work?
And the answer is: because it isn't magic, and it has to get to you somehow. Again, I err on the side of being deliverable with the simplest set of abstractions possible, and I'm old enough to remember when complex things actually came with manuals, so I prefer that delivery and understanding the ins and outs of the tool. You don't get that with FlatPak or Snap without cracking open the container, which is far beyond most average computer users. And I place value on people being able to easily learn how to program, utilize, or reason about computers, something that no container will ever teach you; especially given that in order to understand containers you need to already be pretty deeply immersed in the mechanics of computing.
On the other hand, a zipped workspace with some tooling, READMEs and documentation can make for hours of discovery and familiarization with system fundamentals.
It's an approachability thing, and we've totally lost touch with it as a modern industry in my opinion.