
The reason I stopped recommending Linux to "normal users" is _because_ of the concept of distributions.

Coupling the updates of single apps with the updates of the whole desktop or framework and libs is just plain wrong. Having to upgrade the whole distro (including all the other installed apps you don't want to upgrade) just to install a new version of one single app you _want_ to update is a nightmare. Total bullshit. Users. Don't. Want. That. Users don't want one update to trigger another update, or even to trigger the upgrade of the whole desktop.

The blog post by ESR is one prominent example: http://esr.ibiblio.org/?p=3822

He basically wanted to upgrade just one (obscure) app, and the process triggered the automatic removal of Gnome2 and installation of Unity. Just _IMAGINE_ how nightmarish this must look to normal users. You simply don't sneakily remove somebody's installed desktop out from under them. You simply don't. That feels like the total loss of control over your computer.

I have personally had, over the last 10 years, people go from Linux (which I talked them into trying) back to Windows _precisely_ for this reason: having to upgrade the whole distribution every few months just to be able to get new app versions. They don't have to put up with this insane bullshit on Windows, so why should they put up with it on Linux?

This "distribution" bullshit is not what is killing desktop Linux, it is what _already_ killed desktop Linux.

The other reasons why desktop Linux never made it (no games, no preinstallations on hardware) are imho just consequences of the distribution concept and the 6-month planned-obsolescence cycle. Nobody wants to bother with something which will be obsolete half a year down the road. Nobody wants to develop for a target that moves _that_ fast.

Windows installations, once installed or preinstalled, run for a decade. Develop something, and it will run on the 10-year-old Windows your grandparents use. Most people encounter new Windows installations only when they buy a new computer. PC manufacturers know that customers will hate it when their new computer's OS is obsolete within half a year and they won't be able to install new apps, so they don't preinstall Linux; it's as simple as that.

If anybody _ever_ really wants to see Linux succeed on the desktop (before the desktop concept itself is gone), he will have to give up on the distribution concept first.




I've been saying this same thing for years. Package managers are a nice concept in theory, but Linux on the desktop will never, ever succeed until upgrading (for example) Firefox doesn't result in an install of Unity. The entire concept of a milestone-based monolithic distro is so broken for desktop use that I can't believe a better alternative hasn't been developed yet.

Even as a Linux nerd I'm constantly faced with problems caused by this. I'm stuck on Ubuntu Natty, for example, because it has the last stable version of Compiz that worked for me on the fglrx ATI drivers. If I wanted the latest version of Unity (I don't, I think Unity is terrible, but this is just an example) that means I'd have to upgrade Compiz and everything else and get stuck with all the horrible bugs new Compiz versions have with my drivers. It would also mean upgrading to Gnome 3, which still has many usability regressions (unrelated to Gnome Shell) and in my opinion isn't fully baked yet. I don't want all that shit just to get the latest version of a single package!

(You could perhaps pull it off with some PPA mumbo-jumbo, but you'd have to understand what a PPA is, luck out in finding a suitable PPA in the first place, and messing with them is more than likely to bork something up. Not something for the average-Joe audience Ubuntu is targeting.)

Shared libraries made sense in the days of limited space and resources. They still make sense from a few security perspectives. But from a practical perspective, Windows got it right by allowing programs to ship their own libraries and by taking backwards compatibility seriously.


> The entire concept of a milestone-based monolithic distro is so broken for desktop use that I can't believe a better alternative hasn't been developed yet.

I feel like even the original 1984 Macintosh was a better alternative. Each app is contained in one file; it has no dependencies other than the OS. There is no need for installers/uninstallers. For updates, even Sparkle (annoying as it is) at least doesn't disrupt your whole system.

There are Linux distros that try to work this way, but the upstream developers have become lazy; they expect distros to do packaging for them. This leads to some skewed incentives; I think everything works better when the developer is also responsible for packaging, marketing, and support.


I understand where you're coming from, but people have the exact same problem with Windows (and maybe Mac also; but I don't have much experience with Macs). Many people are staying on XP because they dislike Windows 7/Vista. I suspect the same is happening with Ubuntu.

I personally LOVE LOVE LOVE that all my apps are updated by the same program. Instead of the Windows/Mac way of each app running its own updater.

However, I do agree that the UI for upgrading a single app should be made better.

Just this week, I finally moved off Unity while staying with Ubuntu 11.10. My workaround is to move to XFCE (Xubuntu). So far, no problems. Btw, I have Compiz enabled with the proprietary nvidia drivers. You might want to try Xubuntu.

It's amazing how easy it was to take out Unity and replace it with xubuntu: sudo apt-get install xubuntu-desktop.

The Xubuntu people have taken the trouble to interface with all the Ubuntu plumbing (networking, sound, updates, etc.) via XFCE. Overall I'm impressed with the Ubuntu ecosystem.


We're talking about different problems. Windows does not have the problem I'm describing. If I want to update Firefox on Windows XP, I can do that without Windows forcing me to update to Windows 7 as part of the deal. That's the problem. Updating Firefox on Ubuntu 11.04 after a certain (very soon in the future) point in time necessitates an update to Ubuntu XX.YY.

In fact you're precisely illustrating my point with your example. That people can choose to stick to XP because they don't like 7 is precisely analogous to wanting to stick with Gnome 2 because I dislike Unity. But the difference is that I can do that in Windows because I can still get app updates in XP for over a decade, but I can't in Linux because app updates are tied to the entire massive distro and everything else that goes with it... often every 6 months.

It certainly is possible to upgrade parts of the system using esoteric (to the average-Joe) solutions like PPAs or compiling from source or being forced to switch to a different desktop environment. (Most users don't even know what a DE is!) But the fact that you're forced to turn to such solutions is what I--and the OP and the linked article--argue is the biggest nail in the desktop Linux coffin.


The fact that Windows tries valiantly to offer backwards compatibility (it's actually forced to by its own business model) is, I have to assume, at least partially responsible for the overall huge suckage that is working on Windows.

OSX/iOS are much more like Ubuntu -- practically any cutting-edge app requires something near to the latest OS release.

Android is suffering precisely because there is low OS upgrade adoption, as I've heard recently right here on HN.


> If I want to update Firefox on Windows XP, I can do that without Windows forcing me to update to Windows 7 ... Updating Firefox on Ubuntu 11.04 after a certain (very soon in the future) point in time necessitates an update to Ubuntu XX.YY.

I'm not 100% sure about that. As it happens I run both Ubuntu 10.10 (at work) and Ubuntu 11.10. I just saw an alert to upgrade my 10.10 Ubuntu to Firefox 11. I give them major props for keeping a major package like Firefox updated. They probably don't do this for other packages.

Now, wrt app updates, people have pointed out downthread that PPAs can be added to the system by clicking a specially-encoded URL in Firefox. People on Windows also do things like download the file, locate it, double-click, respond to a UAC prompt, etc. So installing through a PPA isn't that much more complicated, IMHO.
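For what it's worth, adding a PPA from a terminal is usually just a couple of commands; the PPA and package names below are made up purely for illustration:

    sudo add-apt-repository ppa:someteam/someapp   # hypothetical PPA
    sudo apt-get update
    sudo apt-get install someapp                    # hypothetical package

Whether that counts as average-Joe friendly is another question.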

Also Ubuntu has a software store, to which developers are free to push their updates.

In summary, what people are wishing for in this thread is to have the 3-4 apps they use upgraded to the latest and greatest versions while leaving the core OS and other apps at baseline versions. But ... the solution of using a PPA (analogous to downloading the app in Windows) is deemed too nerdy ... leading to the conclusion that the distro should take responsibility for maintaining all the tens of thousands of packages for ~10 years. Am I understanding you correctly? Probably ain't gonna happen.

Do you really think it would be a step forward if we got rid of the huge number of non-core packages in distros and had to hunt down PPA sources (again, equivalent to downloading apps from different websites in Windows) for each program? I'm not sure.

Continuing, I'm not sure the lack of updates for old releases is only a technical problem: XP has an installed base, so people jump through ALL kinds of hoops to make their programs work there. IIRC, there are major differences in audio/video/drivers between Win7 and XP, but people (e.g., Flash/Chrome) still do it.

I doubt developers would keep their apps updated and working on XP if it didn't have such a large installed base.

End long reply. :)


Firefox is a special case. Ubuntu specifically updates that one package more frequently. Perhaps it was a bad example as that's one of the very few packages that is singled out for frequent updates by Canonical.

PPAs still aren't the best solution even if distros make them easy to use. The reason is that maintainers now have to get their software into two places: the core repos and an optional up-to-date PPA. Additionally, they must still target those PPAs at every single new version of the entire distro. For example: I run Natty because I like Compiz, but Oneiric has showstopper Compiz bugs for my ATI card. On Natty I use a PPA that downgrades the version of Compiz to a more stable one. However, that PPA has not published an update for Oneiric, so I can't upgrade to Oneiric and get other new packages. What now? My fate is in the hands of a single PPA maintainer.

Or: many PPAs only publish releases for the past 2 or 3 distro releases. What if I'm on a non-LTS release and don't want to upgrade (because upgrading would bring in more unwanted packages), but the PPAs I use no longer publish for my version? I'm out of the game again.

The core problem in these scenarios is the distro model. Devs have to keep their apps up to date with the quickly-moving target of a "distro." That's just not a situation anyone should be in. The OP and the linked article explain this better than I can.


You want stability and a slowly-changing core system, but won't use the LTS releases? Because your app provider doesn't make updated PPAs for the LTS release. At which point you blame the existence of the distro's non-LTS releases? I think the correct target of your ire should be the app developer who doesn't produce PPAs for the LTS release.

I carefully read what you wrote, and IMHO your requirements are not reasonable: in effect, you want the non-LTS releases of distros to vanish so that app providers don't have the temptation to not support previous releases.

Maybe I'm what somebody downthread called a "technologist".


I want what Windows does: Stable core, independent apps. The entire concept of "LTS" is a red herring. It's a byproduct of the distro mindset. To put it another way: There is no Windows LTS. You buy one version of Windows and it works for a decade--a DECADE--and your apps are kept updated for as long as the developer cares to do so, often automatically in-app. (There's no Windows "app store", but as Apple demonstrates it's a matter of will, not technology, to make one.)

That is simply not the case with Linux today, no matter how you frame it. And for some reason too many Linux supporters are totally blind to that because they think package managers are flawless gems of convenience. They mistake package managers for convenience when in reality they're a band-aid for a situation that shouldn't exist in the first place, and that other OSes have solved better. The OP calls this out perfectly.


Please do not blame "package managers": Windows Installer is a package manager, in many ways comparable in scope to dpkg, which given your premise immediately disproves that package managers are the cause of the problem you are talking about. The issue is not package managers; it isn't even centralized package distribution systems like APT. It is that the ecosystem of libraries and protocols that makes up the Linux desktop has horrible binary compatibility issues, which distributions seem to make even worse through "rebuild the world and update all the dependency relationships while we are at it" policies.


Windows has which problems, exactly? The problems he mentions are pretty Linux-specific, and I couldn't find one that applied to Windows.


My first digital camera had drivers only for Windows 98. I had to dual boot until I got a CF card reader.


Could you elaborate on the gist of your point? I'm genuinely not getting it. Keep in mind that Windows 98 is a 15-year-old, crappy legacy OS. And back then we were all running FVWM and AfterStep.


OP said: "I'm stuck on Ubuntu Natty, for example, because it has the last stable version of Compiz that worked for me on the fglrx ATI drivers."

It's easy to imagine a similar problem (somebody stuck on XP because, e.g., a device driver exists only for XP). For instance, an old laptop of mine has a webcam driver only for Vista but not Windows 7.

http://www.sevenforums.com/drivers/43156-dv6000-webcam-drive...


>I understand where you're coming from, but people have the exact same problem with Windows (and maybe Mac also; but I don't have much experience with Macs).

No, they do not have the exact same problem of package management. Preferring XP over Vista/7 or whatever is not the "exact same problem".

>I personally LOVE LOVE LOVE that all my apps are updated by the same program. Instead of the Windows/Mac way of each app running its own updater.

Well, Mac now has the Mac App Store, which handles updates for App Store installed programs.

But the problem is not about all programs being installed by the same program. It's about installations being transparent to the end user and not wreaking havoc. Also, not being interdependent.

Even Mac apps that update with their own updater are autonomous units. At most, the app itself breaks.


On the other hand, being able to install software so easily with package management is a huge benefit of distributions. Perhaps the solution is to create a new package management system for a desktop distribution: a system which ensures that, when upgrading an application, only its personal namespace (libraries, etc.) changes, while other applications remain untouched. Libraries are second-class citizens, almost. That way we get the benefits of packages and libraries, but each application is logically isolated from changes to any others.

I very much agree with the idea of a slow-moving core, though. That's how I've tended to use Debian: a stable foundation upon which I can install less stable higher-level software (the things I actually interact with). Of course, this doesn't address the distro fragmentation problem Ingo is talking about.


A couple of other ideas:

Optionally bundle specific versions of libs (or compile them in statically), place them in the user's home directory, and set the path to look there first (maybe that's exactly what you're saying?). A rough sketch is below.

Stop using the same tool for updating userland apps and core system stuff. Using the same app for updating "/bin/ls" and for "audacity" is, imo, at the core of the brokenness. These are different types of apps with different areas of responsibility, but we lump them all together in one tool and process.
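To make the first idea concrete, here is a minimal sketch of a per-app bundle with its own libraries, assuming a made-up app under ~/apps/someapp with its binary in bin/ and private .so copies in lib/:

    #!/bin/sh
    # launcher that prefers the app's bundled libs over the system copies
    # (hypothetical layout: ~/apps/someapp/bin/someapp and ~/apps/someapp/lib/*.so)
    APPDIR="$HOME/apps/someapp"
    LD_LIBRARY_PATH="$APPDIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}" \
        exec "$APPDIR/bin/someapp" "$@"

Nothing system-wide changes when the bundle is replaced, which is the whole point; the obvious cost is duplicated libraries and the security-update question discussed further down.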


They need to be lumped together into one update process. The package manager for each needs to be aware of the other so it can resolve incompatibilities and not overwrite files it shouldn't. Joe User is not going to run two different package managers to keep up with security updates. What you consider to be userland apps may be core system stuff to someone else. Package dependency graphs are not simple. Sometimes there are even loops, depending on what parameters you pass to ./configure.

You can use one tool and still separate different sorts of packages into different areas of responsibility. A few distros do that, or at least are capable of it. Arch has AUR. Gentoo has a bunch of 3rd party overlays.


I agree with your complaints but I don't see how it is related to the concept of distributions.

Having multiple distributions is fine; someone just needs to make one with a package manager that gets this stuff right. Then your parents can just use that one distribution and never need to care about what other distros do.

In fact, Ubuntu wants to be that one distribution that is easy for normals. But as you illustrated, it still falls short...


> I don't see how it is related to the concept of distributions.

Distributions freeze the set of available app versions to a specific version of the base system. You can't get a new app without upgrading everything else too, including all other apps. I can't simply get a new Emacs on my Ubuntu, because to do that, Ubuntu also wants to remove my Gnome2 and replace it with Unity.

Windows decouples the base from the apps. When you want to update an app, you just do it; everything else remains untouched. If I want to get a new Emacs on XP, it won't force me to simultaneously upgrade to 7.

Windows would have the same problems if they bundled a new set of base libs and 20000 apps that only work with that specific set of libs every few months, but they don't. They invest great effort into making the base system slow-moving and stable and support it for a decade or more, and it shows.


As I said. That's a problem with the package management in most distributions. It has nothing to do with the fact that we have multiple distributions.

For instance, source-based distributions (e.g. Gentoo) don't have this particular problem.


Tell that to people who ended up with broken Arch installations after the sudden upgrade to Python 3.x.


That is a case of yet another pair of problems, though: Python is incompatible with itself across versions (understandable, I guess), yet it is hard to run multiple versions side by side (a major flaw in Python 3 was reusing the unadorned name "python" instead of making the number part of the name or changing the name of the installed software).


Python 3 does not reuse the unadorned name "python". It installs itself as "python3" by default. The decision to have "python" be Python 3 was solely Arch's.


Eh. First of all, you're not limited to the packages your distribution supplies. Lots and lots of software is distributed in Launchpad PPAs. Other stuff can be installed without the package manager, some games are statically compiled, etc.

And I don't see what's wrong with requiring users to have an up-to-date system. Security fixes alone make regular updates almost mandatory on all operating systems. Windows installs run for a decade, but they still get constant security upgrades, including ones that require restarts and big scary service packs.

System upgrades simply should be totally painless. The kernel gets updated constantly without users noticing, it should be the same for all upgrades. Maybe a rolling release would be a better solution, because it gets rid of the scary system upgrade user interaction, but they're more difficult to QA.

Maybe the repository administration model needs to be changed. Giving the devs more control/responsibility for their package in the repository might be a good idea. Many developers already set up PPAs to get there.


> And I don't see what's wrong with requiring users to have an up-to-date system.

The problem is not the forced invisible security updates; the problem is forced user-visible upgrades.

When you want to upgrade one app, you may not want to simultaneously upgrade another app or even the whole desktop. With the distribution model, there's no way to avoid this forced interdependence.


Updates are not invisible by default because the organizations behind the distros can't provide the same level of assurance that Microsoft or Apple can that update X won't break something.

Average users should have no say in keeping their apps from getting auto upgraded. Linux distros have to track upstream app releases because if they don't there will be breakage eventually. Some app will require a feature added in lib X version Y, and they're still on Y-2. If the packages aren't upgraded, users will complain when they can't install newer packages.


So, you've brought up another important point in the Linux/package-manager ecosystem:

"Some app will require a feature added in lib X version Y, and they're still on Y-2."

Windows has had this solved for something like a decade. Sure, there's the much-lampooned "DLL hell", but honestly, Linux's solution was "lol let's upgrade things and break user apps".

There is zero excuse for apps in Linux to have library dependency issues. A package, when downloaded, should have its depended-upon libs and .so files tagged. When some other application is updated and pulls in a new version of the libs, the first app shouldn't ever see the update. Wouldn't that be nice?

Similarly, having a stable ABI to program against for system calls would be helpful. Users complain when their old apps break, and this is unavoidable under the current Linux development model (see http://primates.ximian.com/~miguel/texts/linux-developers.ht... for a good article on this problem).


Windows has even solved virtually all of "dll hell" via SxS.

Linux distros would do well to implement something similar. Disk space and RAM are cheap; having a few different versions of the same libraries around is no big deal. I don't remember the last time I had a .dll problem in Windows post-Vista, whereas I still run into .so issues nearly constantly in Linux distros.


Linux has versioned libraries, but distros often ship only the latest versions. Libraries usually have filenames like "liblibrary.so.x.y.z", and an application will link to "liblibrary.so.x.y" or "liblibrary.so.x". Library maintainers also get lazy about making sure that the library stays compatible within major versions, or don't bump the .so version properly.
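For anyone who hasn't poked at this, here is roughly what the scheme looks like on disk; the library "libfoo" and the binary "someapp" are made up purely to show the naming:

    # the soname symlink (major version only) points at the fully-versioned file
    readlink /usr/lib/libfoo.so.1        # -> libfoo.so.1.2.3

    # an application records only the soname it was linked against
    objdump -p someapp | grep NEEDED     # NEEDED  libfoo.so.1

So in principle a distro could keep libfoo.so.1 and libfoo.so.2 side by side; in practice, as you say, they usually ship only the newest.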


Perfect username. The problem is that the Linux ecosystem doesn't have enough QA backing the amount of new and changing code. This is partly caused by the inflated egos of the competing distro teams.

Ingo is wrong about freedom. Freedom is the cause of Linux's problem: dev teams are too free to make compatibility-breaking changes, and there are too many alternatives in core desktop infrastructure, so QA can't keep up.


No, there are definitely underlying economic motives beyond ideological "freedom". The only way to make money in the Linux distro world is to sell 'stability' a la RHEL.

That practically requires that the free teaser product be 'unstable' (and therefore undesirable for paying customers). And the easiest way to do that is a top-to-bottom bleeding-edge system rebuild with each new release.

So it's not just a matter of "not enough QA", because there are very real scalability problems with re-QAing everything every six months to ensure that some random library or compiler flag change didn't break something.

Look at Debian for example - they very much get the idea of "freedom", but they also understand software deployment lifecycles and produce a long-term stable version. (One could argue with their management decisions, but the basic idea is correct.)


> And I don't see what's wrong with requiring users to have an up-to-date system.

I want a working-for-me system. Breaking it to make it "up to date" is a bug.

You're assuming that an application or version that has been replaced by something newer is necessarily broken/inferior/etc. You're wrong.

Yes, some updates do address security, but many/most don't and even the ones that do don't necessarily apply in all situations. Not to mention that security isn't the only priority.

Security isn't the only priority offline, so why would anyone think that it would be online? Disagree as to online? Show me your car, residence, or person and I'll be happy to demonstrate.


Quite right. Aside from backports, you can get things like the latest Firefox and LibreOffice on Ubuntu 10.04 thanks to PPAs. To those complaining that the 6 month update cycle is a turn off to "normal" users: put an Ubuntu LTS on their computer with a few PPAs to keep specific packages up to date.

One advantage of the distro model over the Windows model: you've never had a toolbar installed in your browser or your homepage changed because of a package you installed from a repo, have you? I find the Windows software ecosystem (at least the freeware portion of it) far more annoying than anything package managers do. Talk about unwanted changes to the user's computer.


For software updates without updating the distro, backports can be turned on.

The reason that everything is packaged together like that is so library updates and other things can be tested and vetted for compatibility. Part of the challenge of allowing such a variety of OS and system configurations is that library or software package maintainers are unlikely to have tested their updated code on every possible distribution out there, and so that makes it the responsibility of the distro maintainers.

Believe it or not, in practice this doesn't create many issues. It might be an obstacle for casual users who are trying to transition into power users and are trying to tweak their system. But most other users aren't encountering the same frustration.


But what is the alternative: a non-distribution-based GNU/Linux operating system?

How do you manage the 1000 packages and their libraries and dependencies? Many of them have separate runtimes which may or may not depend on other packages' runtimes. How would you design an application sandbox to cover them all?

What you're saying basically is that Linux has failed. At least for me, since I can't see another way to distribute and manage the massive number of packages that sit on gnu.org.

I agree with you, and it is my opinion that whoever solves these problems is in for a lot of business opportunities.


> But what is the alternative: a non-distribution-based GNU/Linux operating system?

A small core set of libraries that change _very_ slowly and aren't intentionally obsoleted every few months. Think Windows-like: slow, stable, and supported for a decade. Distribution of apps decoupled from the distribution of the base. Never make an app update trigger a lib update.

> How do you manage the 1000 packages and their libraries and dependencies?

You don't do that at all. Developers do that themselves, like they do on Windows and OSX. Every dev packages his own app and puts it either into the app store or distributes it himself. You manage only the libs and don't allow them to change fast or in an uncoordinated, chaotic way.

> What you're saying basically is that Linux has failed.

From the point of view of a normal user, yes. For a normal user, it is not an option. Everybody I personally know who tried it went back. The main reason for most of them was the insanity of application management. (And the lack of hardware drivers and games, but that's not Linux's fault.)

> since I can't see another way to distribute and manage the massive number of packages that sit on gnu.org.

Decouple libs and apps. Don't change APIs and lib versions every few months. Make the base a very reliable and slow-moving target. Don't force anybody to change everything every few damn months.

> It is my opinion that whoever solves these problems is in for a lot of business opportunities.

The problem is already solved, at least under Windows and OSX. That's why Windows and OSX get all the desktop business and Linux gets none.


> A small core set of libraries that change _very_ slowly and aren't intentionally obsoleted every few months. Think Windows-like: slow, stable, and supported for a decade. Distribution of apps decoupled from the distribution of the base. Never make an app update trigger a lib update.

This is the idea of the Linux Standard Base. The concept was developed over a decade ago and it has failed to bear real fruit.

> You don't do that at all. Developers do that themselves, like they do on Windows and OSX. Every dev packages his own app and puts it either into the app store or distributes it himself. You manage only the libs and don't allow them to change fast or in an uncoordinated, chaotic way.

You can already do this with the package management systems. For example, each game in the Humble Indie Bundle installs into its own /opt directory, with its own private copies of its dependencies. It uses the package management system to hook into desktop menu updates, etc. As a user, it's been a nightmare for me. Half of the games don't run at all and I'm at a loss for how to fix them or get replacement libraries.
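When one of those self-contained games won't start, the usual first step is to ask the dynamic loader which libraries it can't resolve; the paths below are hypothetical, since every bundle lays itself out a bit differently:

    # "not found" lines are the libraries the game can't resolve
    ldd /opt/somegame/bin/somegame | grep "not found"

    # point the loader at the game's own lib dir to see if a bundled copy is missing
    LD_LIBRARY_PATH=/opt/somegame/lib ldd /opt/somegame/bin/somegame

That usually narrows it down to a single missing or mismatched .so, though it doesn't make the experience any less of a nightmare for a normal user.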


I was always surprised that no one made something like "The Linux Binary Platform 2003" and then kept it up to date, maybe releasing another one later.

At the very least, developers used to Windows would understand the dynamics of it (which is probably a good idea if attracting commercial development is a goal).

I would say it is quite a bit less ambitious than something like the Linux Standard Base.


It's been proposed at various times in various ways. It's largely a competitive issue: the for-profit mechanism boils down to selling "certified" enterprise configurations, so there's little interest in making the platform even more commodified than it already is.


0-install (http://0install.net/) or, even better, the Nix packaging system (http://nixos.org/nix/; they claim to be a purely functional package manager).
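For anyone who hasn't tried Nix, the appeal in a nutshell is that every package version lives under its own hash-addressed path, so upgrading one app can't clobber another's libraries. A rough taste (package names are whatever nixpkgs happens to call them, so treat these as illustrative):

    nix-env -i firefox               # install into your own user profile
    nix-env -u firefox               # upgrade just that one package
    ls /nix/store | grep firefox     # paths look like <hash>-firefox-<version>

Rollbacks and side-by-side versions fall out of the same model.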


Damn straight! I don't understand why so few people know about Nix.

I'm hoping Nix(OS) will take off eventually. I try to play with it from time to time, but I haven't made the investment yet to really contribute.


Maybe if someone built a tool that pulls from the Ubuntu repository and recreates an equivalent Nix package.

Come to think of it, there is nothing in aptitude that prevents this kind of approach (including the unique hash paths). It would get you to 75% of Nix's functionality, which IMHO is good enough.

I don't think Red Hat or Ubuntu will want to do that, though. Their business model revolves around the walled garden.


Dang... I'm going to go download the NixOS live image, now.


Is there a Linux equivalent of PC-BSD[1]?

_Programs under PC-BSD are completely self-contained and self-installing, in a graphical format. A PBI file also ships with all the files and libraries necessary for the installed program to function, eliminating much of the hardship of dealing with broken dependencies and system incompatibilities._

[1]:http://www.pcbsd.org/index.php?option=com_zoo&view=item&...


but but but....

"you'd have duplication of files!" or "you'd have to update multiple libraries when there's a security patch!"

I didn't know about PC-BSD, but I espoused similar ideas to my Linux friends years ago and was generally met with one of the two objections listed above. I think they're both bad arguments, but it's what I encountered the most.


The problem isn't that you have to update multiple libraries - it's that you have to rely on each application developer to release an update when there's a patch for one of its libraries. That simply won't happen in many cases.


For static compilation, true. If required libraries were bundled with an app, you could replace the one in the location specific to that app and be done, assuming the app didn't need any extra work to support the new library.

Given that all this discussion largely revolves around open source projects anyway, if a developer didn't update for a new security patch in a library, someone would likely step up to the plate if it were a commonly used app. If it's a niche/minor app and there's, say, a new version of libssl and the author isn't making updates, there's no guarantee the app would work with an upgraded shared library anyway.


> For static compilation, true. If required libraries were bundled with an app, you could replace the one in the location specific to that app and be done, assuming the app didn't need any extra work to support the new library.

But it'd still be up to the developer to update the library, no? Otherwise, how is that better than the current situation?

> If it's a niche/minor app and there's, say, a new version of libssl and the author isn't making updates, there's no guarantee the app would work with an upgraded shared library anyway.

But you don't have to upgrade the version of the library to release security updates: the Debian security team backports all security fixes to the library versions in Stable even if upstream doesn't, precisely to prevent such breakage.


The library developer updates the library, and if the app doesn't need changes to work with the new library version, he doesn't have to do anything. You could even automate finding all copies of the library and updating them...and keep prior versions around if something breaks.

To avoid relying on app developers at all, put apps in sandboxes where it appears that the libraries are where they've always been.


> He basically wanted to upgrade just one (obscure) app, and the process triggered the automatic removal of Gnome2 and installation of Unity. Just _IMAGINE_ how nightmarish this must look to normal users. You simply don't sneakily remove somebody's installed desktop out from under them. You simply don't. That feels like the total loss of control over your computer.

Hmm, that's not exactly what happened, according to the link: "I upgraded to Ubuntu 11.04 a week or so back in order to get a more recent version of SCons."

Your overall point is well taken, but I wonder how much it affects what I think of as "normal users", who don't care so much about upgrading to the bleeding edge of scons. Consider a hypothetical user of Hardy, so they've had it for four years: what are they actually missing if what they do is web surfing, email, and maybe document editing?


Let's say they use instant messenger services, so they have Pidgin installed. I'm pretty sure there have been critical bugfixes in Pidgin, and the old Hardy version of Pidgin was so flaky for me 2 years ago (back before my workplace had moved to Lucid) that I installed my own from source.

Ubuntu's answer to that? Well, those aren't security fixes, so you can upgrade to $LATEST_RELEASE if you want the non-critical fixes. Ubuntu is trying to force a 2+ year bugfix cycle on software maintainers, and that's just not realistic for many small teams (both proprietary and open source).

This is a particular example, but I can think of other cases where this might be a problem. OpenOffice updates after a new MS Office release come to mind offhand.


From my perspective I don't have any trouble with Ubuntu using a combination of PPAs and Win/OSX-like self-contained apps. Things like Chrome and Mendeley come with auto-updating PPAs. Other apps that I really care about, like Eclipse, Matlab or Firefox, come in their own self-contained packages.


> Coupling the updates of single apps with the updates of the whole desktop or framework and libs is just plain wrong.

You don't have to do all that to upgrade a single app. In fact you're thinking of it backwards. Distros mean that when you upgrade the OS or libs, you get new versions of the apps for free.

You can still configure && make && make install, grab a statically-linked binary, or use any other method of getting Linux apps to run.
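For completeness, the classic from-source route that leaves the base system alone looks something like this; the tarball name is a placeholder, and many projects need extra build dependencies:

    # build and install into the home directory so no system package is touched
    tar xf someapp-2.0.tar.gz && cd someapp-2.0
    ./configure --prefix="$HOME/.local"
    make
    make install                         # no root needed with a home prefix
    export PATH="$HOME/.local/bin:$PATH"

Whether a normal user should ever have to do this is, of course, the whole argument.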


> You can still configure && make && make install

You aren't serious, are you?

> statically-linked binary

Nobody builds them.

> any other method of getting Linux apps to run.

There are no other methods.

> Distros mean that when you upgrade the OS or libs, you get new versions of the apps for free.

That in turn means that if I don't want to upgrade the OS and the libs, I can't get new app versions. The collective refusal to acknowledge that this is a problem is what is holding back (aka killing) desktop Linux.


You said you stopped pointing people toward Linux because of the distros. But distros only add ways to get software. You don't have to use their package management at all. If you want to install a new version of an app, with or without the help of the distro, what better way is there to avoid dependency hell?

Edit: Central-repository installation methods I've used: Python easy_install, Perl CPAN, Ubuntu PPAs (which are a way to add 3rd-party apps to your package manager).


> But distros only add ways to get software.

Without distros, the way to get software is unbearable for Windows converts, who are used to "click, click, done."

> what better way is there to avoid dependency hell?

All distros agreeing to ship one specific version of a lib, so that app devs can target that "standard" version instead of daily-changing upstream versions.

The dependency chaos is a consequence of no distribution being influential enough (or the major players not being able to agree) to slow down the interdependent moving target that is the library space. So app devs don't care what distros ship and only target the upstream, and the upstream lib devs don't care about the overall ecosystem and just ship whenever they feel like shipping.

Neither of them seems to care about the user experience of the end user, for whom getting onto Linux feels like building on shaky ground. And then they both pretend not to understand that a majority of end users would rather pay for Windows and have a decade of peace of mind and hassle-free app availability than move for free to an earthquake-prone area.


> Without distros, the way to get software is unbearable for Windows converts, who are used to "click, click, done."

Sorry, what? On Windows machines, due to the lack of package management, installing software is not a matter of "click, click, done." You have to google the program, navigate to the website's download area, find the right link, download it, execute it. It asks if you're sure it isn't a Trojan or something, which you promptly ignore (and learn to ignore every other "are you sure you want to execute foo.doc.exe?" popup).

Package management ('done right') is a matter of "click, click, done." Installing software on Windows is an absolutely terrible experience.


I think you're conflating two different issues. Poor UX for installing 3rd party software has been solved with things like Steam or the Mac App Store, for example.


Steam and the App Store are nothing more than proprietary package management systems.


Again, that's not the issue. Package managers are great. The problem is how the base system is maintained/updated/versioned.


And on Linux, you navigate to the website's download area and find out they don't offer a Linux version.

There is an open source alternative which is not in your distro's repository, so you have to navigate to the website's download area, find the right link, download it, get accepted to university, take a year of 100-level CS courses, learn the command line on your own because you didn't get it at school, run configure and make, and it does not build on your distro. Not that it matters since the latest version 0.0.3 would not do what you needed if you had gotten it to run.


Hehe, harsh.

I think one of the main problems is that there is no single agreed way to distribute a program.

At least with Windows, the InstallShield wizard kind of became the de facto standard.

Sometimes I go to a download page and they offer me a .deb file, which is great, but then I find it targets a specific version of Ubuntu that I am not running. Sometimes they offer an .rpm, or sometimes they just offer a zip file with a random .bin or .sh inside it that I may or may not have to run as root and that may or may not do weird things to my system.

Also, you get stuff that asks you to do a git clone and make, etc.


> All distros agreeing to ship one specific version of a lib, so that app devs can target that "standard" version instead of daily-changing upstream versions.

You talk as if lib versions only change to harass developers and end users. Sometimes there are security updates that need to be made. And how would lib devs add features if all the app developers and distros are frozen on an old version?


It doesn't work that way. "Linux", even GNU/Linux, is not really a single operating system. It's an anarchically-designed, anarchically-run OS-building kit revolving around a vision of what an ideal operating system would look like, with each part adjusting to what each other part is doing on the fly.

Evolution will never truly replicate the results possible with design.


It's just Ubuntu, not all of Linux. Unless there are other distros with Unity and 6-month cycles I don't know of.


ArchLinux suffers from the same problem.

I left my cousin with ArchLinux; he liked it a lot, so much that he was still using it 6 months later. At that point he wanted to install a new program, and, well, from then on he stopped using Arch, because he had to call me to help him fix his system. Issuing pacman -S programname failed, so he did pacman -Sy followed by pacman -S programname again, and this time his entire system was about to be updated. He answered yes to all the questions... and suddenly his entire desktop was different from what it was before. All the programs got updated! He didn't want that!

These are people who think a computer is broken if the taskbar on the desktop is accidentally moved to another position by children. Of course they thought they had broken their computer, and in a sense it was broken, since their settings and gadgets on the Plasma desktop stopped functioning, and some just changed. Without asking the user, just changed! That is a pure and simple WTF. That's when I realized Linux will never, ever succeed on the desktop if it doesn't change fundamentally.
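For context, a rough sketch of what went wrong there: refreshing the package database and then installing a single package is a "partial upgrade", which Arch explicitly does not support on its rolling release; the program name is just the placeholder from the story above:

    # what he did: database refresh plus a single install, against libs that
    # may no longer match the rest of the (now out-of-date) system
    pacman -Sy
    pacman -S programname

    # the supported path: sync the database and upgrade everything, then install
    pacman -Syu
    pacman -S programname

Which, of course, only reinforces your point: the "correct" behaviour is exactly the whole-system upgrade he didn't want.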


ArchLinux is a very poor distribution to leave with someone who doesn't have the knowledge to fix these problems.

When you do pacman -Syu you should be looking at the front page for update news, and you should be looking at the list of packages to be upgraded. It's not a distribution that will hold your hand, because there are some of us around who don't want our hands held.


The idea that there are interfaces for expert users (or that there should be) is anathema on HN these days.


It's a pity, I can't imagine I'm the only person in the world who shudders at the idea that my computer should be made 'easy' for me (at the expense of flexibility and power). I'm a type of customer, just as much as the so called 'average user' is.


You left your cousin with arch installed but didn't tell him that it's necessary to keep a rolling release distro up to date?

Hell, I have to keep up with the mailing list and the news page on the website just to keep my OS working from update to update.

It's worth it for me because of my needs but Arch is very very specifically NOT for the average user.

You basically set your cousin up to fail. And it's not like the branding and messaging isn't clear about any of these things. Why on earth did you pick Arch for this use case?


Because what else can I pick? Ubuntu!? Ugh.

He tried Fedora earlier; we couldn't get it to work with his wireless networking, SELinux was always getting in the way as soon as he took his laptop somewhere, and I got tired of pushing against a wall as I was becoming his support. Then he asked what I'm using and how it works for me. So Arch it was.


I suggest Linux Mint in that case. Ubuntu based, but no unity.

I totally understand where you are coming from; I love Arch and want to share how awesome it is too, but all those problems were entirely based on a very poor suggestion for his use case. Arch doesn't "suffer" from this problem at all.


Have you heard of Debian?


Fedora is having similar indigestion with the GNOME 3 upgrade, and longer-than-6-month cycles are even worse. How would you feel if all the apps on your system were 18 months old and the only way to upgrade them was to upgrade the whole OS? I believe that's called "enterprise Linux".


These kinds of package distribution schemes work well for clusters of homogeneous servers, but I agree that they just get in the way for a typical desktop user.


> He basically wanted to upgrade just one (obscure) app

No. He wanted a new version of the app and upgraded to a new version of the OS (with a new set of default packages). And he was surprised by getting a new GUI, which is rather odd because Unity was one of the most publicized features of Ubuntu.

> the process triggered the automatic removal of Gnome2 and installation of Unity

Not really. Gnome2 would still be there; just the default UI is Unity. I'm more than a little bit surprised ESR had trouble remembering that you can switch UIs at login. I've been doing it since my Solaris (2.5) days. I loved OpenWindows.

> having to upgrade the whole distribution every few months just to be able to get new app versions

That's not really true: you have to do so because the distro publisher won't support the newest Chrome on their 2006 OS. It's ridiculous to demand that they spend their resources on your particular needs. If you are not happy, you can ask to have your money back. And even when the distro publisher doesn't want to add newer versions to an old OS, you can always add private repos maintained by the makers of your favorite software.

And, remember, keeping stable versions of software (even when a newer, flashier version has been made public) is exactly what some people want. I want my servers stable.

> This "distribution" bullshit is not what is killing desktop Linux

It was never much alive. Linux is an OS that suits some users well, but not most of them.

> are imho just consequences of the distribution concept and the 6-month planned-obsolescence cycle.

It usually took much longer to get a new version of your favorite Linux distro; 6 months is the current standard. And, again, there is no planned obsolescence. There are many alternative places to get newer versions from.

> Windows installations, once installed or preinstalled, run for a decade.

I don't believe we've met, sir. What planet are you from?

> If anybody _ever_ really wants to see Linux succeed on the desktop (...) he will have to give up on the distribution concept first.

I don't think so. In fact, most people don't think so. And, let me say that not thinking so works quite well.

You do realize the incredibly arrogant position you are taking? You purport to be the savior of the Linux desktop (do we need one, BTW?) and to have realized what's wrong with it and, best of all, to have the solution! Just do everything the opposite of how it's been done for decades and all our problems will be solved.

Let me put it simply: when you think you are the dumbest person in a room, you are probably right. When you think you are the smartest person in a room, you are most probably wrong. And if you disagree with everybody else in the room, odds are you are really the dumbest person there.

Maintaining a distro is a lot of work, but until we can make software makers agree on a single package format, a single way to manage configurations, and a single way to organize the file hierarchy, the distro way will remain a very popular way to manage your computers.


It's wonderful how downvotes can replace disagreement and discussion. You've got to love the conciseness.



