Coupling the updates of single apps to the updates of the whole desktop, framework, and libs is just plain wrong. Having to upgrade the whole distro (including all the other installed apps you don't want to upgrade) just to install a new version of the one app you _do_ want to update is a nightmare. Total bullshit. Users. Don't. Want. That. Users don't want one update to trigger another update, let alone an upgrade of the whole desktop.
The blog post by ESR is one prominent example: http://esr.ibiblio.org/?p=3822
He basically wanted to upgrade just one (obscure) app, and the process triggered the automatic removal of Gnome2 and the installation of Unity. Just _IMAGINE_ how nightmarish this must look to normal users. You simply don't sneakily remove somebody's installed desktop out from under them. You simply don't. That feels like total loss of control over your computer.
Over the last 10 years I have personally watched people go from Linux (which I talked them into trying) back to Windows _precisely_ because of this: having to upgrade the whole distribution every few months just to get new app versions. They don't have to put up with this insane bullshit on Windows, so why should they put up with it on Linux?
This "distribution" bullshit is not what is killing desktop Linux; it is what _already_ killed desktop Linux.
The other reasons desktop Linux never made it (no games, no preinstallations on hardware) are imho just consequences of the distribution concept and the 6-month planned-obsolescence cycle. Nobody wants to bother with something that will be obsolete half a year down the road. Nobody wants to develop for a target that moves _that_ fast.
Windows installations, once installed or preinstalled, run for a decade. Develop something, and it will run on the 10-year-old Windows your grandparents use. Most people encounter a new Windows installation only when they buy a new computer. PC manufacturers know that customers will hate it when their new computer's OS is obsolete within half a year and they won't be able to install new apps, so they don't preinstall Linux. It's as simple as that.
If anybody _ever_ really wants to see Linux succeed on the desktop (before the desktop concept itself is gone), they will have to give up on the distribution concept first.
Even as a Linux nerd I'm constantly faced with problems caused by this. I'm stuck on Ubuntu Natty, for example, because it has the last stable version of Compiz that worked for me on the fglrx ATI drivers. If I wanted the latest version of Unity (I don't; I think Unity is terrible, but this is just an example), I'd have to upgrade Compiz and everything else and get stuck with all the horrible bugs new Compiz versions have with my drivers. It would also mean upgrading to Gnome 3, which still has many usability regressions (unrelated to Gnome Shell) and in my opinion isn't fully baked yet. I don't want all that shit just to get the latest version of a single package!
(You could perhaps pull it off with some PPA mumbo-jumbo, but you'd have to understand what a PPA is, luck out in finding a PPA in the first place, and messing with them can more than likely bork something up. Not something for the average-Joe audience Ubuntu is targeting.)
Shared libraries made sense in the days of limited space and resources. They still make sense from a few security perspectives. But from a practical perspective, Windows did it right by allowing programs to ship their own libraries and by doing backwards-compatibility right.
I feel like even the original 1984 Macintosh was a better alternative. Each app is contained in one file; it has no dependencies other than the OS. There is no need for installers/uninstallers. For updates, even Sparkle (annoying as it is) at least doesn't disrupt your whole system.
There are Linux distros that try to work this way, but the upstream developers have become lazy; they expect distros to do packaging for them. This leads to some skewed incentives; I think everything works better when the developer is also responsible for packaging, marketing, and support.
I personally LOVE LOVE LOVE that all my apps are updated by the same program, instead of the Windows/Mac way of each app running its own updater.
However, I do agree that the UI for upgrading a single app should be made better.
Just this week, I finally moved out of Unity while staying with Ubuntu 11.10. My workaround is to move to XFCE (Xubuntu). So far, no problems. Btw, I have Compiz enabled with the proprietary nvidia drivers. You might want to try Xubuntu.
It's amazing how easy it was to take out Unity and replace it with xubuntu: sudo apt-get install xubuntu-desktop.
The xubuntu people have taken the trouble to interface with all the Ubuntu plumbing (networking, sound, updates etc) via XFCE. Overall I'm impressed with the Ubuntu ecosystem.
In fact you're precisely illustrating my point with your example. That people can choose to stick to XP because they don't like 7 is precisely analogous to wanting to stick with Gnome 2 because I dislike Unity. But the difference is that I can do that in Windows because I can still get app updates in XP for over a decade, but I can't in Linux because app updates are tied to the entire massive distro and everything else that goes with it... often every 6 months.
It certainly is possible to upgrade parts of the system using esoteric (to the average-Joe) solutions like PPAs or compiling from source or being forced to switch to a different desktop environment. (Most users don't even know what a DE is!) But the fact that you're forced to turn to such solutions is what I--and the OP and the linked article--argue is the biggest nail in the desktop Linux coffin.
OSX/iOS are much more like Ubuntu -- practically any cutting-edge app requires something near to the latest OS release.
Android is suffering precisely because there is low OS upgrade adoption, as I've heard recently right here on HN.
I'm not 100% sure about that. As it happens, I run both Ubuntu 10.10 (at work) and Ubuntu 11.10. I just saw an alert to upgrade my 10.10 Ubuntu to Firefox 11. I give them major props for keeping a major package like Firefox updated. They probably don't do this for other packages.
Now wrt app updates, people have pointed out downthread that PPAs can be added to the system by clicking a specially-encoded URL in Firefox. People on Windows also do things like download the file, locate it, double-click, respond to a UAC prompt, etc. So installing through a PPA isn't that much more complicated, IMHO.
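For reference, the command-line version of that flow is only a few steps, assuming Ubuntu's `add-apt-repository` helper is available (the PPA and package names below are placeholders, not real archives):

```shell
# Register a hypothetical third-party PPA, refresh the package index,
# and install a newer app version from it.
sudo add-apt-repository ppa:some-team/some-app   # PPA name is illustrative
sudo apt-get update
sudo apt-get install some-app
```

Whether three terminal commands count as "click, click, done" for the average Joe is, of course, exactly what this thread is arguing about.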
Also Ubuntu has a software store, to which developers are free to push their updates.
In summary, what people are wishing for in this thread is to have the 3-4 apps they use upgraded to the latest and greatest versions while leaving the core OS and other apps at baseline versions. But ... the solution of using a PPA (analogous to downloading the app in Windows) is deemed too nerdy ... leading to the conclusion that the distro should take responsibility for maintaining all the tens of thousands of packages for ~10 years. Am I understanding you correctly? Probably ain't gonna happen.
Do you really think it would be a step forward if we got rid of the huge number of non-core packages in distros and had to hunt down PPA sources (again, equivalent to downloading apps from different websites in Windows) for each program? I'm not sure.
Continuing, I'm not sure the lack of updates for old releases is only a technical problem: XP has an installed base, so people jump through ALL kinds of hoops to make their programs work there. IIRC, there are major differences in audio/video/drivers between Win7 and XP, but people (e.g., Flash/Chrome) still do it.
I doubt developers will keep their apps updated and working on XP if it didn't have such a large installed base.
End long reply. :)
PPAs still aren't the best solution even if distros make them easy-to-use. The reason is that now maintainers have to get their software into two places: the core repos and now an optional up-to-date PPA. Additionally they must still target those PPAs to every single new version of the entire distro. For example: I run Natty because I like Compiz, but Oneiric has showstopper Compiz bugs for my ATI card. On Natty I use a PPA that downgrades the version of Compiz to a more stable one. However that PPA has not published an update for Oneiric, so I can't upgrade to Oneiric and get other new packages. What now? My fate is in the hands of a single PPA maintainer.
Or: many PPAs only publish releases for the past 2 or 3 distro releases. What if I'm on a non-LTS release and don't want to upgrade (because upgrading would bring in more unwanted packages), but the PPAs I use no longer publish for my version? I'm out of the game again.
The core problem in these scenarios is the distro model. Devs have to keep their apps up to date with the quickly-moving target of a "distro." That's just not a situation anyone should be in. The OP and the linked article explain this better than I can.
I carefully read what you wrote, and IMHO your requirements are not reasonable: in effect, you want the non-LTS releases of distros to vanish so that app providers don't have the temptation to not support previous releases.
Maybe I'm what somebody called a "technologist" downthread.
That is simply not the case with Linux today no matter how you frame it. And for some reason too many Linux supporters are totally blind to that because they think package managers are flawless gems of convenience. They mistake package managers for convenience when in reality they're a band-aid for a situation that shouldn't exist in the first place, and that other OS's have solved better. The OP calls this out perfectly.
It's easy to imagine a similar problem (somebody stuck on XP because for e.g., a device driver exists only for XP). For instance, an old laptop of mine has a webcam driver only for Vista but not Windows 7.
No, they do not have the same exact problem of package managing. Preferring XP over Vista/7 whatever, is not the "exact same problem".
>I personally LOVE LOVE LOVE that all my apps are updated by the same program. Instead of the Windows/Mac way of each app running its own updater.
Well, Mac now has the Mac App Store, which handles updates for App Store installed programs.
But the problem is not about all programs being installed by the same program. It's about installations being transparent to the end user and not wreaking havoc. Also, not being interdependent.
Even Mac apps that update with their own updaters are autonomous units. At most, the app itself breaks.
I very much agree with the idea of a slow-moving core, though. That's how I've tended to use Debian: a stable foundation upon which I can install less stable higher-level software (the things I actually interact with). Of course, this doesn't address the distro fragmentation problem Ingo is talking about.
Optionally bundle specific versions of libs (or compile them in statically), place them in the user's home directory, and set the path to look there first (maybe that's what you're saying exactly?).
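A minimal sketch of that idea, assuming per-user bundling under `~/.local` (all paths and the `myapp` name are hypothetical): the app ships its own `.so` files in a private directory, and a tiny launcher makes the dynamic loader search there before the system dirs.

```shell
# Per-user library bundling sketch. The app's install step copies its
# bundled libraries into a private directory under the user's home.
mkdir -p "$HOME/.local/lib/myapp" "$HOME/.local/bin"

# A launcher script prepends the private lib dir to LD_LIBRARY_PATH,
# so the bundled library versions win over the system-wide ones.
cat > "$HOME/.local/bin/myapp" <<'EOF'
#!/bin/sh
export LD_LIBRARY_PATH="$HOME/.local/lib/myapp${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
exec "$HOME/.local/opt/myapp/bin/myapp-real" "$@"
EOF
chmod +x "$HOME/.local/bin/myapp"
```

System-wide upgrades then never touch the app's bundled libs, at the cost of the user not getting security fixes for those copies unless the app developer ships them.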
Stop using the same tool for updating userland apps and core system stuff. Using the same app to update "/bin/ls" and "audacity" is, imo, at the core of the brokenness. These are different types of apps with different areas of responsibility, but we lump them all together into one tool and process.
You can use one tool and still separate different sorts of packages into different areas of responsibility. A few distros do that, or at least are capable of it. Arch has AUR. Gentoo has a bunch of 3rd party overlays.
Having multiple distributions is fine; someone just needs to make one with a package manager that gets this stuff right. Then your parents can just use that one distribution and never need to care about what other distros do.
In fact, Ubuntu wants to be that one distribution that is easy for normals. But as you illustrated, it still falls short...
Distributions freeze the set of available app versions to a specific version of the base system. You can't get a new app without upgrading everything else too, including all other apps. I can't simply get a new Emacs on my Ubuntu, because to do that, Ubuntu also wants to remove my Gnome2 and replace it with Unity.
Windows decouples the base from the apps. When you want to update an app, you just do it; everything else remains untouched. If I want to get a new Emacs on XP, it won't force me to simultaneously upgrade to 7.
Windows would have the same problems if, every few months, it bundled a new set of base libs and 20,000 apps that only work with that specific set of libs, but it doesn't. They invest great effort into making the base system slow and stable and supporting it for a decade or more, and it shows.
I.e., source-based distributions (e.g. Gentoo) don't have this particular problem.
And I don't see what's wrong with requiring users to have an up-to-date system. Security fixes alone make regular updates almost mandatory on all operating systems. Windows installs run for a decade, but they still get constant security upgrades, including ones that require restarts and big scary service packs.
System upgrades simply should be totally painless. The kernel gets updated constantly without users noticing, it should be the same for all upgrades. Maybe a rolling release would be a better solution, because it gets rid of the scary system upgrade user interaction, but they're more difficult to QA.
Maybe the repository administration model needs to be changed. Giving the devs more control/responsibility for their package in the repository might be a good idea. Many developers already set up PPAs to get there.
The problem is not the forced invisible security updates; the problem is forced user-visible upgrades.
When you want to upgrade one app, you maybe don't want to simultaneously upgrade another app, or even the whole desktop. With the distribution model, there's no way to avoid this forced interdependence.
Average users should have no say in keeping their apps from getting auto upgraded. Linux distros have to track upstream app releases because if they don't there will be breakage eventually. Some app will require a feature added in lib X version Y, and they're still on Y-2. If the packages aren't upgraded, users will complain when they can't install newer packages.
"Some app will require a feature added in lib X version Y, and they're still on Y-2."
Windows has had this solved for something like a decade. Sure, there's the much lampooned "dll hell", but honestly, Linux's solution was "lol lets upgrade things and break user apps".
There is zero excuse for apps in Linux to have library dependency issues. A package, when downloaded, should have its depended-upon libs and .so's tagged. When some other application is updated and pulls in new versions of the libs, the first app shouldn't ever see the update. Wouldn't that be nice?
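Part of this mechanism already exists in ELF sonames: a binary records the major version of each library it was linked against, so old and new majors can coexist on disk. A sketch of checking this (the `myapp` binary and all output lines are illustrative):

```shell
# List the shared libraries a binary was linked against, with the exact
# soname (major version) recorded at link time.
ldd ./myapp | grep libpng
#   libpng12.so.0 => /usr/lib/libpng12.so.0
```

As long as `libpng12.so.0` stays installed next to a newer `libpng16.so.16`, the old app keeps running; the distro-side problem is that package managers typically remove the old soname when the library package is upgraded.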
Similarly, having a stable ABI to program against for system calls would be helpful. Users complain when their old apps break, and this is unavoidable under the current Linux development model (see http://primates.ximian.com/~miguel/texts/linux-developers.ht... for a good article on this problem).
Linux distros would do well to implement something similar. Disk space and RAM are cheap, having a few different versions of the same DLLs is no big deal. I don't remember the last time I had a .dll problem in Windows post Vista, whereas I still run into .so issues nearly constantly in Linux distros.
Ingo is wrong about freedom. Freedom is the cause of Linux's problems: dev teams are too free to make compatibility-breaking changes, and there are too many alternatives in core desktop infrastructure, so QA can't keep up.
That practically requires that the free teaser product be 'unstable' (and therefore undesirable for paying customers). And the easiest way to do that is a top-to-bottom bleeding-edge system rebuild with each new release.
So it's not just a matter of "not enough QA", because there are very real scalability problems with re-QAing everything every six months to ensure that some random library or compiler flag change didn't break something.
Look at Debian for example - they very much get the idea of "freedom", but they also understand software deployment lifecycles and produce a long-term stable version. (One could argue with their management decisions, but the basic idea is correct.)
I want a working-for-me system. Breaking it to make it "up to date" is a bug.
You're assuming that an application or version that has been replaced by something newer is necessarily broken/inferior/etc. You're wrong.
Yes, some updates do address security, but many/most don't and even the ones that do don't necessarily apply in all situations. Not to mention that security isn't the only priority.
Security isn't the only priority offline, so why would anyone think that it would be online? Disagree as to online? Show me your car, residence, or person and I'll be happy to demonstrate.
One advantage of the distro model over the Windows model: you've never had a toolbar installed in your browser or your homepage changed because of a package you installed from a repo, have you? I find the Windows software ecosystem (at least the freeware portion of it) far more annoying than anything package managers do. Talk about unwanted changes to the user's computer.
The reason that everything is packaged together like that is so library updates and other things can be tested and vetted for compatibility. Part of the challenge of allowing such a variety of OS and system configurations is that library or software package maintainers are unlikely to have tested their updated code on every possible distribution out there, and so that makes it the responsibility of the distro maintainers.
Believe it or not in practice this doesn't create many issues. It might be an obstacle for casual users who are trying to transition into power users, and are trying to tweak their system. But most other users aren't encountering the same frustration.
How do you manage the 1000 packages and their libraries and dependencies? Many of them have separate runtimes which may or may not depend on the other packages runtime. How would you design an application sandbox to cover them all?
What you're saying, basically, is that Linux has failed. At least for me, since I can't see another way to distribute and manage the massive number of packages that sit on gnu.org.
I agree with you, and it is my opinion that whoever solves these problems is in for a lot of business opportunities.
A small core set of libraries that change _very_ slowly and aren't intentionally obsoleted every few months. Think of Windows: slow, stable, and supported for a decade. Distribution of apps decoupled from distribution of the base. Never make an app update trigger a lib update.
> How do you manage the 1000 packages and their libraries and dependencies?
You don't do that at all. Developers do that themselves, like they do on Windows and OSX. Every dev packages his own app and puts it either into the app store or distributes it himself. You manage only the libs, and you don't allow them to change fast or in an uncoordinated, chaotic way.
> What you're saying basically is that linux is failed.
From the point of view of a normal user, yes. For a normal user, it is not an option. Everybody I personally know who tried it went back. The main reason for most of them was the insanity of application management. (And the lack of hardware drivers and games, but that's not Linux's fault.)
> since I cant see another way to distribute and manage the massive amount of packages that sit on gnu.org.
Decouple libs and apps. Don't change APIs and lib versions every few months. Make the base a very reliable and slow-moving target. Don't force anybody to change everything every few damn months.
> It is my opinion that the one who solves these problem is in for a lot of business-opportunities.
The problem is already solved, at least on Windows and OSX. That's why Windows and OSX get all the desktop business and Linux gets none.
This is the idea of the Linux Standard Base. The concept was developed over a decade ago, and it has failed to show real fruit.
> You don't do that at all. Developers do that themselves, like they do on Windows and OSX. Every dev packages his own app and puts it either into the app store or distributes it himself. You manage only the libs, and you don't allow them to change fast or in an uncoordinated, chaotic way.
You can already do this with the package management systems. For example, each game in the Humble Indie Bundle installs into its own /opt directory, with its own private copies of its dependencies. It uses the package management system to hook into desktop menu updates, etc. As a user, it's been a nightmare for me. Half of the games don't run at all, and I'm at a loss for how to fix them or get replacement libraries.
At the very least, developers used to Windows would understand the dynamics of it (which is probably a good idea if attracting commercial development is a goal).
I would say it is quite a bit less ambitious than something like the Linux Standard Base.
I'm hoping Nix(OS) will take off eventually. I try to play with it from time to time, but I haven't made the investment yet to really contribute.
Come to think of it, there is nothing in aptitude that prevents this kind of approach (including the unique hash paths). It would get you to 75% of Nix functionality, which IMHO is good enough.
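For readers who haven't seen it, the "unique hash paths" idea in Nix looks roughly like this (store hashes and package versions below are shortened and purely illustrative):

```shell
# Every package version lives in its own hash-addressed directory under
# /nix/store, so two versions of the same library coexist and nothing
# ever overwrites anything in place.
ls /nix/store
#   1q2w3e...-glibc-2.13/
#   9o8i7u...-glibc-2.15/
#   5t6y7u...-firefox-11.0/
```

An app is wired to the exact store paths it was built against, so upgrading one app can never silently swap a library out from under another.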
I don't think Red Hat or Ubuntu will want to do that, though. Their business model revolves around the walled garden.
_Programs under PC-BSD are completely self-contained and self-installing, in a graphical format. A PBI file also ships with all the files and libraries necessary for the installed program to function, eliminating much of the hardship of dealing with broken dependencies and system incompatibilities._
"you'd have duplication of files!"
"you'd have to update multiple libraries when there's a security patch!"
I didn't know about PC-BSD, but I've espoused similar ideas to my Linux friends for years, and was generally met with one of the two objections listed above. I think they're both bad arguments, but it's what I encountered the most.
Given that all this discussion largely revolves around open source projects anyway, if a developer didn't update for a new security patch in a library, someone would likely step up to the plate if it was a commonly used app. If it's a niche/minor app, and there's, say, a new version of libssl, if the author isn't making updates, there's no guarantee the app will work with an updated version of an upgraded shared library anyway.
But it'd still be up to the developer to update the library, no? Otherwise, how is that better than the current situation?
> If it's a niche/minor app, and there's, say, a new version of libssl, if the author isn't making updates, there's no guarantee the app will work with an updated version of an upgraded shared library anyway.
But you don't have to upgrade the version of the library to release security updates: the Security team of Debian backports all security fixes to the library versions in Stable even if the upstream didn't, in order to prevent such breakage.
To avoid relying on app developers at all, put apps in sandboxes where it appears that the libraries are where they've always been.
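One way to sketch that sandbox idea on Linux is a private mount namespace that bind-mounts the app's pinned library over the path where the app expects it, leaving the host system untouched. This uses bubblewrap as one concrete tool; `libfoo.so.1`, `./bundled-libs`, and `./myapp` are all hypothetical names, and the bind assumes the target path already exists on the host:

```shell
# Run the app in a namespace where its pinned copy of libfoo shadows
# the system one. The rest of the system keeps the newer version.
bwrap \
  --ro-bind / / \
  --ro-bind ./bundled-libs/libfoo.so.1 /usr/lib/libfoo.so.1 \
  ./myapp
```

From inside the sandbox the library really does "appear where it's always been," so the app needs no relinking and its developer needs no cooperation from the distro.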
Hmm, that's not exactly what happened, according to the link: "I upgraded to Ubuntu 11.04 a week or so back in order to get a more recent version of SCons."
Your overall point is well taken, but I wonder how much it affects what I think of as "normal users", who don't care so much about upgrading to the bleeding edge of scons. Consider a hypothetical user of Hardy, so they've had it for four years: what are they actually missing if what they do is web surfing, email, and maybe document editing?
Ubuntu's answer to that? Well, those aren't security fixes, so you can upgrade to $LATEST_RELEASE if you want the non-critical fixes. Ubuntu is trying to force a 2+ year bugfix cycle on software maintainers, and that's just not realistic for many small teams (both proprietary and open source).
This is a particular example, but I can think of other cases where this might be a problem. OpenOffice updates after a new MS Office release come to mind offhand.
You don't have to do all that to upgrade a single app. In fact you're thinking of it backwards. Distros mean that when you upgrade the OS or libs, you get new versions of the apps for free.
You can still configure && make && make install, or grab a statically-linked binary, or any other method of getting Linux apps to run.
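For completeness, the classic out-of-distro route looks like this, kept out of the package manager's way by installing under a private prefix (the tarball name is hypothetical):

```shell
# Build and install from a source tarball without touching system dirs.
tar xzf someapp-1.2.tar.gz
cd someapp-1.2
./configure --prefix="$HOME/.local"   # install under ~/.local, not /usr
make
make install                          # no root needed with a home prefix
```

Which, as the rest of the thread points out, is precisely the kind of thing an average-Joe user will never do.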
You aren't serious, are you?
> statically-linked binary
Nobody builds them.
> any other method of getting Linux apps to run.
There are no other methods.
> Distros mean that when you upgrade the OS or libs, you get new versions of the apps for free.
That in turn means that if I don't want to upgrade the OS and the libs, I can't get new app versions. The collective refusal to acknowledge that this is a problem is what is holding back (aka killing) desktop Linux.
Edit: Central-repository installation methods I've used: python easy_install, perl CPAN, Ubuntu PPAs (which are a way to add 3rd-party apps to your package manager).
Without distros, the way to get software is unbearable for Windows converts, where it is just "click, click, done."
> what better way is there to avoid dependency hell?
All distros agreeing to ship one specific version of a lib, so that app devs can target that "standard" version instead of daily-changing upstream versions.
The dependency chaos is a consequence of no distribution being influential enough (or the major players not being able to agree) to slow down the interdependent moving target that is the library space. So app devs don't care what distros ship and only target the upstream, and the upstream lib devs don't care about the overall ecosystem and just ship whenever they feel like shipping.
Neither of them seems to care about the experience of the end user, for whom getting on Linux feels like building on shaky ground. And then they both pretend not to understand why a majority of end users would rather pay for Windows and have a decade of peace of mind and hassle-free app availability than move for free to an earthquake-prone area.
Sorry, what? On Windows machines, due to the lack of package management, installing software is not a matter of "click, click, done." You have to google the program, navigate to the website's download area, find the right link, download it, execute it. It asks if you're sure it isn't a Trojan or something, which you promptly ignore (and learn to ignore every other "are you sure you want to execute foo.doc.exe?" popup).
Package management ('done right') is a matter of "click, click, done." Installing software on Windows is an absolutely terrible experience.
There is an open source alternative which is not in your distro's repository, so you have to navigate to the website's download area, find the right link, download it, get accepted to university, take a year of 100-level CS courses, learn the command line on your own because you didn't get it at school, run configure and make, and it does not build on your distro. Not that it matters since the latest version 0.0.3 would not do what you needed if you had gotten it to run.
I think one of the main problems is that there is no single agreed way to distribute a program.
At least with Windows, the InstallShield wizard kind of became the de facto standard.
Sometimes I go to a download page and they offer me a .deb file, which is great, but then I find it is targeted at a specific version of Ubuntu that I am not running.
Sometimes they offer an .rpm, or sometimes they just offer a zip file with a random .bin or .sh inside it that I may or may not have to run as root and that may or may not do weird things to my system.
Also you get stuff that asks you to do git clone and make, etc.
You talk as if lib versions only change to harass developers and end users. Sometimes there are security updates that need to be made. And how would lib devs add features if all the app developers and distros are frozen on an old version?
Evolution will never truly replicate the results possible with design.
I left my cousin with Arch Linux. He liked it a lot, so much that he was still using it 6 months later. At that point he wanted to install a new program; well, from then on he stopped using Arch, because he had to call me to help him fix his system. Issuing pacman -S programname failed, so he did pacman -Sy followed by pacman -S programname again, and this time his entire system was about to be updated. He answered yes to all the questions... and suddenly his entire desktop was different from what it was before. All the programs got updated! He didn't want that!
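For readers unfamiliar with Arch, the sequence described above is the classic "partial upgrade" trap, annotated here as a sketch (`programname` is a placeholder):

```shell
# The sequence that bit him:
pacman -S programname   # fails: package needs newer libs than installed
pacman -Sy              # refreshes the package database ONLY, libs stay old
pacman -S programname   # now the only way forward is dragging the whole
                        # dependency chain (and desktop) up to current

# What Arch actually supports: a full sync-and-upgrade, all at once.
pacman -Syu
```

Arch explicitly does not support mixing an old system with a fresh package database, which is exactly the coupling this whole thread is complaining about.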
These are people who think a computer is broken if the taskbar on the desktop is accidentally moved to another position by children. Of course they thought they broke their computer, and in a sense they did, since their settings and gadgets on the Plasma desktop stopped functioning, and some just changed. Without asking the user, just changed! That is a pure and simple WTF. That's when I realized Linux will never ever succeed on the desktop if it doesn't change fundamentally.
When you do pacman -Syu you should be looking on the front page for update news, you should be looking at the list of packages to be upgraded. It's not a distribution that will hold your hand, because there are some of us around who don't want our hands held.
Hell, I have to keep up with the mailing list and the news page on the website just to keep my OS working from update to update.
It's worth it for me because of my needs but Arch is very very specifically NOT for the average user.
You basically set your cousin up to fail. And it's not like the branding and messaging isn't clear about any of these things. Why on earth did you pick Arch for this use case?
He tried Fedora earlier; we couldn't get it to work with his wireless networking, and SELinux kept getting in the way as soon as he took his laptop with him somewhere. I got tired of pushing against a wall as I was becoming his support, and then he asked what I'm using and how it works for me. So Arch it was.
I totally understand where you are coming from. I love Arch and want to share how awesome it is too, but all those problems were entirely based on a very poor suggestion for his use case. Arch doesn't "suffer" from this problem at all.
No. He wanted a new version of the app and upgraded to a new version of the OS (with a new set of default packages). And he got surprised by getting a new GUI, which is rather odd because Unity was one of the most publicized features of Ubuntu.
> the process triggered the automatic removal of Gnome2 and installation of Unity
Not really. Gnome2 would still be there; just the default UI is Unity. I'm more than a little surprised ESR had trouble remembering you can switch UIs at login. I've been doing it since my Solaris (2.5) days. I loved OpenWindows.
> having to upgrade the whole distribution every few months just to be able to get new app versions
That's not really true - you have to do so because the distro publisher won't support the newest Chrome on their 2006 OS. It's ridiculous to demand they spend their resources on your particular needs. If you are not happy, you can ask for your money back. And even when the distro publisher doesn't want to add newer versions to an old OS, you can always add private repos maintained by the makers of your favorite software.
And, remember, having stable versions of software (even when a newer, flashier version has been made public) is exactly what some people want. I want my servers stable.
> This "distribution" bullshit is not what is killing desktop Linux
It was never much alive. Linux is an OS that suits a few users well, but not most of them.
> are imho just consequences of the distribution concept and the 6-month planned-obsolescence cycle.
It usually took much longer to get a new version of your favorite Linux distro; 6 months is the current standard. And, again, there is no planned obsolescence. There are many alternative places to get newer versions from.
> Windows installations, once installed or preinstalled, run for a decade.
I don't believe we've met, sir. What planet are you from?
> If anybody _ever_ really wants to see Linux succeed on the desktop (...) he will have to give up on the distribution concept first.
I don't think so. In fact, most people don't think so. And, let me say that not thinking so works quite well.
You do realize the incredibly arrogant position you are taking? You purport to be the savior of the Linux desktop (do we need one, BTW?), to have realized what's wrong with it and, best of all, to have the solution! Just do everything opposite to how it's been working for decades and all our problems will be solved.
Let me put it simply: when you think you are the dumbest person in a room, you are probably right. When you think you are the smartest person in a room, you are most probably wrong. And if you disagree with everybody else in the room, odds are you are really the dumbest person there.
Maintaining a distro is a lot of work, but until we can get software makers to agree on a single package format, a single way to manage configurations, and a single way to organize the file hierarchy, the distro way will remain a very popular way to manage your computers.