
Ingo Molnar on what ails the Linux desktop - keyist
https://plus.google.com/109922199462633401279/posts/HgdeFDfRzNe
======
muuh-gnu
The reason I stopped recommending Linux to "normal users" is _because_ of the
concept of distributions.

Coupling the updates of single apps to the updates of the whole desktop,
framework, and libs is just plain wrong. Having to upgrade the whole distro
(including all the other installed apps you don't want to upgrade) just to
install a new version of one single app you _want_ to update is a nightmare.
Total bullshit. Users. Don't. Want. That. Users don't want one update to
trigger another update, or even to trigger the upgrade of the whole desktop.

The blog post by ESR is one prominent example:
<http://esr.ibiblio.org/?p=3822>

He basically wanted to upgrade just one (obscure) app, and the process
triggered the automatic removal of Gnome2 and installation of Unity. Just
_IMAGINE_ how nightmarish this must look to normal users. You simply don't
sneakily remove somebody's installed desktop from under their feet. You
simply don't. That feels like a total loss of control over your computer.

Over the last 10 years I have personally had people go from Linux (which I
talked them into trying) back to Windows _precisely_ for this reason: having
to upgrade the whole distribution every few months just to be able to get
new app versions. They don't have to put up with this insane bullshit on
Windows, so why should they put up with it on Linux?

This "distribution" bullshit is not what is killing desktop Linux, it is what
_already_ killed desktop Linux.

The other reasons why desktop Linux never made it (no games, no
preinstallations on hardware) are imho just consequences of the distribution
concept and the 6-month planned-obsolescence cycle. Nobody wants to bother
with something which will be obsolete half a year down the road. Nobody wants
to develop for a target that moves _that_ fast.

Windows installations, once installed or preinstalled, run for a decade.
Develop something, and it will run on the 10-year-old Windows your
grandparents use. Most people encounter new Windows installations only when
they buy a new computer. PC manufacturers know that customers will hate it
when their new computer's OS is obsolete within half a year and they won't be
able to install new apps, so they don't preinstall Linux. It's as simple as
that.

If anybody _ever_ really wants to see Linux succeed on the desktop (before the
desktop concept itself is gone), he will have to give up on the distribution
concept first.

~~~
DrCatbox
But what is the alternative to a distribution-based GNU/Linux operating
system?

How do you manage the 1000 packages and their libraries and dependencies?
Many of them have separate runtimes which may or may not depend on other
packages' runtimes. How would you design an application sandbox to cover
them all?

What you're saying, basically, is that Linux has failed. At least for me,
since I can't see another way to distribute and manage the massive amount of
packages that sit on gnu.org.

I agree with you, and it is my opinion that whoever solves these problems is
in for a lot of business opportunities.

~~~
sandGorgon
0install (<http://0install.net/>) or, even better, the Nix packaging system
(<http://nixos.org/nix/> - they claim to be a purely _functional_ package
manager)
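To make the "purely functional" claim concrete, here is an illustrative shell
sketch (not real Nix code; the package names and `demo-store` directory are
invented) of the core idea: every package version lives under a unique,
hash-derived store path, so two versions never collide and an upgrade never
mutates an existing install in place.

```shell
# Illustrative only: Nix derives the hash from *all* build inputs;
# here we hash just the name-version string to show the layout.
store=./demo-store
mkdir -p "$store"
for spec in "openssl-1.0.1" "openssl-1.0.2"; do
    hash=$(printf '%s' "$spec" | sha256sum | cut -c1-12)
    # Both versions coexist under distinct prefixes:
    mkdir -p "$store/$hash-$spec/lib"
done
ls "$store"
```

Since nothing is ever overwritten, "rollback" is just pointing a symlink back
at the old store path.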

~~~
nona
Damn straight! I don't understand why so few people know about Nix.

I'm hoping Nix(OS) will take off eventually. I try to play with it from time
to time, but I haven't made the investment yet to really contribute.

~~~
sandGorgon
Maybe someone could build a tool that pulls from the Ubuntu repositories and
recreates equivalent Nix packages.

Come to think of it, there is nothing in aptitude that prevents this kind of
approach (including the unique hash paths). It would get you to 75% of Nix's
functionality, which IMHO is good enough.

I don't think Red Hat or Ubuntu will want to do that, though. Their business
model revolves around the walled garden.

------
bergie
The solution for this is quite simple: split the distro into a core and a
separate 'apps' repository, and let app authors control (but with a
mandatory QA step) when their software gets released or updated. We did this
with Maemo (and later MeeGo), and it has worked great:
[http://bergie.iki.fi/blog/application_quality_assurance_in_l...](http://bergie.iki.fi/blog/application_quality_assurance_in_linux_distributions/)

Cross-distro app repositories are also a possibility, thanks to the Open Build
Service (<http://openbuildservice.org>). And since MeeGo's community apps
service is open source (<https://github.com/nemein/com_meego_packages>), all
software needed for this (including an app store client app) already exists.

What is needed is a major distribution to make the first move on this.

~~~
DrCatbox
You would still need to maintain and manage the core distro. The packages in
there would update and break the system; you'd have LVM issues and whatnot.

Anyway, care to give links to more on Maemo/MeeGo's architecture?

~~~
bergie
You still package and build an application separately for each distro version.
And then the community tests it before it goes to the stable apps repo. So
broken packages (either by packaging, or by app not functioning properly) are
unlikely to pass QA.

So if your package conflicts with something, either fix that issue or don't
release for that distro version.

~~~
joe_the_user
Off-topic: is active MeeGo development still happening? I'm a Qt developer
and might be interested in contributing a little...

~~~
bergie
The MeeGo community effort continues at <http://merproject.org>

It's already used as the base distro in the Spark tablet.

------
kfcm
The Linux desktop is dead. Face it. Accept it.

How do I know this? Go to a developers' or tech conference and see what the
prevailing OS is. 9 times out of 10 these days, it's OS X.

As much as Apple may be control freaks, they did desktop UNIX right and
developers have flocked to them.

Meanwhile, the Linux desktop community is still having the same debates and
the same difficulties it had back in the mid-90's. Dependency issues. Lack
of compelling use-case software, and what does exist is versions behind
current. Fragmentation.

The only areas in which the Linux desktop has moved forward are UI and user-
oriented management, and Ubuntu deserves much of the credit for the latter.

Add to this the gradual movement away from the desktop paradigm to mobile. The
desktop paradigm will still be around for several years (especially for
creators--developers, media folks, etc), but the end-user is moving more and
more towards smartphones and tablets for consumption. Extend those with
external keyboards and monitors (think docks) for light creation work
(documents, spreadsheets, etc), and you've the future.

Nope. The Linux desktop is dead.

~~~
javert
_The Linux desktop is dead_

This is really premature because there are major flaws with the other options.
Apple is a closed-down walled garden (and it's expensive), for example.

The Linux ecosystem is extremely diverse, so maybe somebody will come up with
something that can compete well with these other flawed models.

For example, maybe Ingo Molnar's post will inspire people to take things in
new directions.

 _The desktop paradigm will still be around for several years_

That's a ridiculous understatement. You can't do serious work on smartphones
and tablets and that's not going to change until we're going around plugging
them into projectors and keyboards, at which point they're just serving as
desktops anyway.

------
PaulHoule
I've been using Linux since 1993, and back then, the Linux OS and desktop were
far superior to Windows 3.1.

Then Win 95 came out and that had a decent desktop. I remember when the KDE
people started talking about a desktop for Unix and people didn't get it, but
when we saw the beta it was like... Wow!

Then Red Hat didn't like KDE's license, so they had to create Gnome. As a
result, rather than having one good desktop, the average Linux has two
half-baked desktops. This fork has wasted people's energy and been a
distraction from an excellent experience.

Another example of this is sound. I don't know how many incompatible sound
APIs exist for Linux now; I know it's more than the fingers on one hand. The
consequence of it all is that often sound doesn't work, and unless you're a
crazy enthusiast you might never get it to work.

I was a Linux zealot until 2003 or so when I had a job that had me using a
Windows machine a lot, and by that point there was Win XP which was a huge
improvement over Win 95.

I still use Linux on servers, but desktop Linux has largely disappeared from
my life. Every so often I try to install it here or there, but I typically
find the experience disappointing. I was a Fedora fan for a long time, but
Fedora became increasingly finicky about where it would install. I switched to
Ubuntu, but every installation ends up having some serious problem.

For instance, Ubuntu installed just fine on my PPC Mac Mini with the exception
that the fan runs full speed all the time and the machine sounds like a vacuum
cleaner.

Windows and Mac OS have been on a general trajectory of improvement --
sometimes there are changes you don't like, but the overall direction is
good. Linux did, after years of struggle, get a stable multiprocessor kernel
(2.6), but other than that I get the feeling Linux has been going backwards,
not forwards.

~~~
bryanlarsen
What you say used to be true, but does not appear to be any longer.

When was the last time we had useful improvements to the OS X user interface?
10.4 (2005) or 10.5 (2007), in my opinion. They've certainly improved under
the hood, but the improvements to the UI have been mostly gimmicks like
Expose.

When was the last time we had useful improvements to the Windows UI? That
would be Windows Vista, 2006/2007 (W7 was basically just a stable version of
Vista). Windows is certainly attempting to improve the UI with Metro, so
it's a 5-year timeframe to wait for improvements.

OTOH, in Linux land we've had KDE4, Gnome3 and Unity all land in that time
period. Every 6 months we receive useful new improvements to our UI. Sure,
the initial receptions of KDE4, Gnome3 and Unity were all negative, but the
haters are always the loudest. I haven't tried Gnome 3, but Unity 12.04 and
KDE 4.8 are both really nice, much better than the OS X or Windows 7 UIs, in
my opinion.

And it's not just the UI. It takes about 10 seconds for my computer to leave
the BIOS and have both Firefox & Emacs open in Ubuntu. It takes the same
machine over a minute to have Steam open in Windows.

~~~
kfcm
There's just one problem with your post: you're thinking like a technologist,
and not like an end-user.

End-users don't want to have to learn an entirely new UI (read, a different
way of doing things; or, "Where's my Start button? Everything I know how to do
is under that.") every couple of years. Not because they're (all) dumb,
stupid, or lazy.

It's because end-users view a computer as a tool to do what they need/want to
do--quickly and efficiently. Anything that distracts from that (like having to
re-learn where everything is, and how to do the task they've done the same way
for several years) is a negative and annoyance.

Unfortunately, the technology community has forgotten that.

~~~
javert
_There's just one problem with your post: you're thinking like a technologist,
and not like an end-user._

Well, a lot of people who use and hack on Linux distros aren't targeting the
"end-user," they're targeting the technologist. So that mindset is not
necessarily a problem.

~~~
PakG1
If we're going in that direction, then everyone's participating in a
different conversation. The original conversation was not about whether this
is a non-problem because end-users have issues. The original conversation
was about why Linux hasn't been adopted by end-users at the same rate as
other OSes, if it's superior in so many ways.

------
programminggeek
The author is mostly right; the problem is friction and market dynamics.
Adding software in a given Linux distro is too nerdy. If you don't have an
"app store"-like GUI, you've already lost most users. Command line =
friction for most users, even for a lot of developers who grew up with
Windows.

As a Linux nerd, I'm fine with the command line + Synaptic, but look at how
well people have used the iTunes Store, Amazon store, Google Play, etc. All
of those have much less friction for finding and downloading the right
software than most Linux distros have. Ubuntu's market is close, but...

WHERE IS THE NON FREE SOFTWARE?!!

If Linux wants to do well for humans, paid, proprietary software NEEDS to
exist on the platform. Ubuntu Software Center comes close, but it still kind
of sucks.

Also, as a dev, it wasn't until VERY recently that you could even sign up to
publish an app that was commercial in nature. It is hard to build a real
marketplace when you're asking developers to give away all their work for free
so that you can sell more operating systems (or support contracts) without the
software dev seeing a dime.

As a software developer I can't feed my kids with free downloads on an open
source operating system used by people who don't like paying for software.

Make it easy for devs to build software that people will pay for, then get
operating system users who will buy that software for real money and you'll
have fixed the Linux Desktop problem.

~~~
downx3
App stores can suck too; think BlackBerry. You don't have to resort to the
command line to install packages. Synaptic just needs a little freshening
up, that's all. I far prefer it to Ubuntu's software centre. A load of crap
icons and screenshots just feels like clutter and doesn't tell me anything.

------
antirez
Ten years ago I was trying to persuade people that two things were very
important for Linux to succeed in the desktop:

1) Distribution of software as a cross-distribution package that has
everything it needs inside a single directory: libs and so forth. If you
said this N years ago you were an asshole because "duplication of files
blabla" and so forth. The typical example was "a user should simply go to
some application's web site, download a file, and click on it to execute the
program".

2) Device drivers with a well specified interface between the OS and the
hardware, so that different versions of the kernel could use the same driver
without issues.
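Point 1 can be sketched in a few lines of shell (all names here are invented
for illustration, and this is a rough sketch of the idea, not any particular
project's format): the app ships as one directory containing its binary and
every library it needs, plus a launcher that points the dynamic loader at the
bundled libs before anything system-wide.

```shell
# Hypothetical bundle layout:
#   MyApp/bin/myapp   the actual binary
#   MyApp/lib/        bundled .so files
#   MyApp/run.sh      what the user actually clicks
mkdir -p MyApp/bin MyApp/lib
cat > MyApp/run.sh <<'EOF'
#!/bin/sh
# Resolve the bundle's own directory, wherever it was unpacked.
HERE="$(cd "$(dirname "$0")" && pwd)"
# Prefer the bundled libraries over whatever the distro installed.
export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
exec "$HERE/bin/myapp" "$@"
EOF
chmod +x MyApp/run.sh
```

This is roughly the approach that bundle-style projects (klik, and Mac OS X
.app bundles before them) took: download one thing, click it, run it.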

People complained a lot, with technical arguments about why a different
approach is better than "1" or "2" by some kind of nerd metric. So the
reality, for me, is that Linux does not succeed on the desktop because it is
"run" by people with a square-shaped engineering mind. There is no fix for
this.

------
cs702
Molnar's point about the _political_ and _procedural_ difficulties of adding
new applications in the official repositories of most distributions is true
(although this is changing -- witness Canonical's Ubuntu Software Center,
PPAs, and "universe" repositories).

But his reasoning breaks down when he says the relative dearth of _commercial_
applications for the Linux Desktop is due to this issue. That's not true.

The main reason why OSX/iOS, Android, and Windows attract more _commercial_
developers is because those platforms have a much greater installed base!

~~~
muuh-gnu
> The main reason why OSX/iOS, Android, and Windows attract more commercial
> developers is because those platforms have a much greater installed base!

A few years ago, Linux had a bigger installed base than both iOS and
Android. Both of them outran Linux with ease.

~~~
tikhonj
Yeah, but they're in a brand new market on brand new devices. People who
already have a proprietary PC OS aren't going to switch to Linux out of the
blue, but they are going to buy a shiny smart phone.

So the two situations are really not comparable. I still think the main
difference is that you have to install Linux yourself. It's not even that
installing is _difficult_ -- it isn't! -- it's that normal people don't even
realize it's an option. Your average random laptop buyer who just spent $600
on a laptop from Staples would be able to use Linux perfectly well if that's
what his laptop came with -- I suspect some wouldn't even realize it wasn't
just a different version of Windows. But since his laptop invariably came
with Windows, that's what he's going to use, for no reason but inertia.

~~~
muuh-gnu
> but they're in a brand new market on brand new devices.

But they still had no problem starting with an ecosystem with zero apps. Now
they have hundreds of thousands.

> I still think the main difference is that you have to install Linux
> yourself.

Over the years, I've had several people, whom I talked into checking out
Linux, give it up and go back to Windows because they refused to accept that
they have to upgrade the whole distribution just to be able to install new
versions of single apps. Did you ever try to explain to a Windows user what
a "backport" is, what it's good for, why there are none on Windows, and why
he can install whatever app and whatever version of an app he wants on
Windows, but can't on Linux?

> Your average random laptop buyer who just spent $600 on a laptop from
> Staples would be able to use Linux perfectly well if that's what his laptop
> came with

For 6 months; then he wouldn't be able to update his apps any more.

------
diminish
I have been running the Linux desktop for 5 years already, together with 50
friends, peers, etc. It does not quite seem right to compare 90-cent mobile
applications -- which are mostly a few screens interfacing to a service
otherwise provided by a web site, or simple 80s-era arcade games -- to
packages in a Linux distro. The author is totally making a terrible mistake
here. Mobile apps lack size and complexity, and who said they are great?
They are mostly consumables which perish in a few days or months (I exclude
some utilities).

Ubuntu is already trying to be more flexible with the Software Center and
applications; however, open source is not a single walled garden which can
adhere to the tightly controlled, monolithic architectures found in mobile.

In a Linux distro the apps are diverse, and programmed in a multitude of
ways in all possible programming languages (from Python to Lisp to C) and
environments. That is reality and life, and what must be done must be done
with awareness of that fact. And no, no one can force the broad, diverse
open source world into tight control a la Apple.

~~~
moonchrome
> they are mostly consumables which perish in a few days or months (I
> exclude some utilities).

This is simply false: there are plenty of complex apps with varying degrees
of usefulness, and they certainly don't reduce to 80's game clones and
few-screen apps. And even if it were true, it's not a result of anything
intrinsic to the app model. Also, "tight control a la Apple" indicates that
you missed his point.

How many times did you make and sudo-install shit on your system to get a
basic app that should, for all intents and purposes, be walled off in a
sandbox? Even without sudo, why should an install script have access to my
personal files without explicit permission? Did you ever run into a
situation where you wanted to install the latest version of an app, but it
wasn't in your distro repo, so you decided to build it, only to find out
that your repo's GTK+ library was out of date and your options were to
rebuild all GTK+ packages or update the OS to an alpha? Even I gave up at
that point -- imagine the average user wanting to get the latest feature
advertised on his favorite app's site.

I use the Linux desktop daily, but it can be hell, and you need to know how
to wrestle with the system. It's certainly not idiot-proof -- and it needs
to be for mass adoption. But then again, "mass adoption" (IMO) shouldn't
even be considered a realistic goal; being useful to techies/developers and
available on servers is a good objective too.

~~~
icebraining
I agree that having to make and sudo-install is essentially impossible for a
common PC user, but how does the sandboxing have any impact on user
adoption?

~~~
barrkel
State is the big evil in software. Minimize shared state, and many, many
problems go away.

Centralized package management with a web of specific versioned dependencies
is a huge hairball of state as it is. And when installed applications poke
around inside your system, it only takes one or two oversights here and there
to mess up this state.

Sandboxing, where feasible, minimizes shared state; usually at the cost of
other things, but with modern software and modern resource constraints, our
ability to deal with complexity is usually the tightest constraint. Sandboxing
reduces the probability that things will break horribly (e.g. installing one
app, and finding you need to install a new version of a library, which in turn
means you need to reinstall 100+ more apps, which may bring in a new desktop
or some other abomination).

~~~
icebraining
That has nothing to do with sandboxing: it's a matter of distributing the
libraries with the package itself, either as .so files or as a big
statically compiled binary, and using those instead. Nothing forces the
developer to use the distro-provided libraries.

And frankly, I think this is one of those things where you can't please
everyone; personally, I'd rather have dependencies and know that a security
fix in a library will fix all applications that use it than have to hope
each developer releases an updated version, like on Windows.

~~~
barrkel
That's true to the degree that the implementation of those libraries does not
require other things from the rest of the system. If they are just libraries
of code, attached to nothing, sure.

But when they need integration with things beyond that? That's when it
breaks down. That's when you need interfacing libraries that adapt one
version of a library's interface to the next, and very careful API and ABI
design. I can see that there's very little reason to take that approach for
open source everywhere, but that kind of environment is essential for the
stability commercial development needs.

Static libraries aren't the answer. A careful API and ABI interface to the
core system, with managed compatibility from release to release, is needed.
This is hard work; incredible amounts of effort went into making Windows 95
work as smoothly as it does, and some of the best war stories from Raymond
Chen's blog date from this effort.

Things like preserving bugs in the moral equivalent of malloc to accommodate
software that used memory after freeing it. This very concept -- bug
compatibility -- is something that I see the OSS community in general as
being fairly hostile to, thinking that the software with the bug should be
fixed rather than making things work for the user. Of course the buggy
software should be fixed; but making things work for the user is more
important still.

~~~
bryanlarsen
Actually, your post is making me optimistic.

On Linux, the 4 most common API layers encountered by an application are:

1) system calls
2) standard (and not-so-standard) libraries
3) X11
4) D-Bus

#1 is notoriously stable. Linus often posts quite a vehement response to
anybody who proposes breaking #1.

#3 was, for a long time, TOO stable; it wasn't taking advantage of new
developments in hardware and design. The breakup of XFree86 freed that up.

#4 is relatively stable because it's a cross-distribution partnership. The
interfaces are designed to be the same across multiple distributions, which
by necessity makes them stable.

The problem area for Linux is #2. Most standard libraries are actually very
stable at the micro level. The problem is the vast area they expose: if a
single backwards-incompatible change can break your app, it's a problem when
there are a million things that can potentially change.

Windows & OS X applications "solve" this problem by shipping their libraries
with their applications.

I think Linux has a potentially even better solution: simply allow multiple
versions of applications and libraries to be installed simultaneously, a la
rubygems & bundler. You'll get all the disk-space savings of shared
libraries at the point of initial install, savings which will slowly erode
as users install new apps and upgrade some apps but not others. But who
cares? Disk space is cheap; more importantly, if the user cares, he can do
something about it by only upgrading an application's libraries when the
whole application upgrades.
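A minimal sketch of that side-by-side scheme (the directory layout and
`VERSION` files are invented for illustration, loosely echoing how rubygems
keeps multiple gem versions under one tree): every installed version gets
its own directory, and each app resolves the version it was built against
instead of sharing one mutable system-wide copy.

```shell
# Two versions of the same library coexist:
mkdir -p libs/libfoo-1.2 libs/libfoo-1.3
echo "1.2" > libs/libfoo-1.2/VERSION
echo "1.3" > libs/libfoo-1.3/VERSION

# An app's metadata pins the version it was actually tested with,
# so installing libfoo-1.3 for a new app never disturbs this one:
pinned="1.2"
cat "libs/libfoo-$pinned/VERSION"
```

Upgrading one app then means dropping in a new library directory, never
rewriting the one another app depends on.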

------
may
Ingo's point about distros trying to "own" 20K packages is well-taken; it's
simply not possible.

In my own life the 'solution' I have found is to use FreeBSD; I get a stable,
well-maintained core with a sharp distinction between core, userland and
third-party (the ports system).

I have found the ports system to be a lightweight, agile alternative to
GNU/Linux package managers:

When you install FreeBSD you are left with a kernel, standard UNIX command-
line utilities and everything you need to hammer the system into a finely-
honed tool.

Right now I'm using it exclusively on my servers, because on the desktop I'm
willing to accept the trade-offs of Ubuntu (beta 12.04 on my dev box,
Xubuntu 11.10 on my netbook): a little instability and fully-automated
updates are OK in exchange for not having to fiddle with graphics drivers,
sound, Flash, etc.

------
drdaeman
I'm just an ordinary user, but I'll put in my 2¢.

Nowadays, any developer can create packages for his application (and any of
its dependencies) and publish them in a self-hosted repository. Users can
easily add such a repository to their system's sources list, and the
bureaucracy problem's over. Some developers are doing this already; the only
thing that keeps the rest of them from doing it is either ignorance or the
complexity of the packaging process.
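Concretely, on an APT system "adding such a repository" amounts to a single
line in a source list, along these lines (the URL and filename here are
placeholders, not a real repo):

```
# /etc/apt/sources.list.d/example-app.list -- one developer-hosted repo
deb https://example.org/apt stable main
```

After that, the developer's packages update through the normal system update
mechanism like everything else.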

I believe there's no need for an Android Market clone (which is yet another
centralized repository). What users may need is just a directory pointing to
external repos. Ubuntu's market seems somewhat promising (at least I
remember seeing dialogs like "you need to enable this source to install that
package").

Content duplication is not a problem. A real problem is keeping the system
up to date when you _want_ to update some library because of an important
feature or bugfix. And that's nearly impossible with every application
bundling its own copy of that library, with some copies being actually
incompatible forks (and a lot of copies being just different builds -- think
different compiler versions -- of exactly the same sources).

~~~
Aqueous
What you think is easy is actually not easy. Any time you've added one, two,
three more steps after clicking the link in the browser, you've already
lost. Adding repositories? Already too late. Touching the command line?
Sorry, as much as we love it, for most users it's already too late. Going
into Aptitude or Synaptic and pasting in the repository URL? Yep. Too late.

This is all compounded by the fact that there is no app bundle. Mac OS X has
the bundle and a terrific way to install it: Drag and drop it into the
Applications folder, just like we did in the days of the Mac Classic and
MacPaint. It hasn't changed. (Well, it did for a while, but thankfully they
went back.)

If you say, "Well, that means you end up installing 5 different versions of
the same library on the same system" - Who cares? Disk space isn't a priority
any more, and from a developer standpoint it makes a lot more sense to target
a dependency whose version number I know, instead of a dependency whose
version only the package maintainers know for sure. I don't want to be forced
to target v1.3 of a library if it's only been tested (by me) on v1.2. It makes
for much better application stability for the developer to be in control of
dependencies and not these package maintainers.

I want to ship a self-contained bundle of awesomeness, not a dysfunctional
shard among ten thousand other shards.

~~~
drdaeman
> Adding repositories? Already too late. Touching the command line?

Nope. As simple as clicking a link with special URL scheme, like
`apt+hXXp://archive.canonical.com?package=acroread?dist=feisty?section=commercial`

> This is all compounded by the fact that there is no app bundle.

I'm all for bundles (which single-app repositories actually are!), but I
want them to be non-monolithic (i.e. to contain multiple separate packages).

I don't care about disk space -- if I'm _that_ constrained on disk space,
that's another story, one that'll probably never happen to most ordinary
users, who have terabytes of storage. But I certainly care about bugs, and
if libXYZ 1.2 has a critical one, I want my system to be free of that
version ASAP.

And I don't care that you've never tested your awesome app with 1.3 — it's
better to be _possibly_ unstable than _certainly_ unstable or, far worse,
vulnerable.

~~~
Aqueous
What makes package maintainers uniquely qualified to patch dependencies and
upgrade them? Either we say Canonical or Red Hat hires the best possible
people to watch over their package repositories or we say that a qualified
application developer could do just as well. Either way we end up having to
trust somebody.

Both package maintainers and developers have an interest in making sure
their programs don't introduce vulnerabilities into the system. Therefore,
if there's a serious problem with one of their dependencies, vulnerability
patching will happen either way.

The distribution maintainers should be in charge of maintaining a core set of
low-level dependencies that are needed by many applications. Beyond that they
should leave the dependency management to the application developers.
Seriously. That would free up so many millions of man-hours of work for,
say, Canonical that they could actually make the core system usable for the
average user.

------
spiralpolitik
UNIX in general never transitioned well from single binary applications that
could just be placed in /bin /usr/bin or /usr/local/bin as appropriate to
multi-file applications that needed an array of libraries, resource files etc.
That coupled with a file system layout that nobody could ever agree on made it
a mess once UNIX moved outside the curated environments it was typically found
in prior to Linux.

NeXTSTEP made a good stab at the problem with bundles (.app, .service,
.framework, etc.), but once you moved outside of the abstraction layer you
were right back in the mess. Most Linux distributions seem to want to
emulate SunOS circa 1992, and any attempts to "improve" on the solution are
drowned out by fundamentalism.

I actually thought the author nailed it in the last paragraph of part 2 of
his post. The free software movement needs to start looking forward and not
try to emulate what worked 20 years ago. There is definitely potential for a
brave organization that is willing to try to tackle the challenge.

------
jamesu
My impression is that the problem is more fundamental: a lot of open source
and free software which finds its way onto linux is made by people who don't
seem to care about the end-user experience. If it works for them, there is no
need to improve it.

Not all open software projects are receptive to changes, improvements or bug
reports from strangers, so nothing is really resolved without forking, which
adds its own complications.

Of course there is still good open source / free software. It's just hard to
come by.

~~~
tikhonj
Plenty of proprietary software that never makes it to Linux is also written
by people who don't seem to care about the end-user experience. I've spent
enough time battling horrible programs on both OS X and various versions of
Windows to know that this isn't unique to open source software.

Of course, on the proprietary platforms the _core_ programs -- browsers,
office, media, etc. -- are all good. But that is true of Linux programs as
well. And open source projects -- even ones that are not terribly responsive
-- are still _more_ responsive than most proprietary programs.

------
icebraining
So, how does the fact that any person or company can host their own packages,
or even full repositories, that can be added _with a single click_ ¹ and are
not dependent on hierarchical organizations fit into that?

GNU/Linux distros, at least the APT-based ones, are perfectly distributed if
the person wants them to be.

¹ If you have apt-url installed, which Ubuntu has _by default_

~~~
moe
I agree that distributed repositories are the way forward.

The only thing holding that back is the _atrocious_ user-interface. Debian
urgently needs to fix that and push their apt-infrastructure out of the 1990s.

In short: /etc/apt/sources.list must die.

This is how it must work:

    
    
       apt-get install https://foobar.com/debian/squeeze/widget-1.0
    

A package installed like that must add itself automatically to a proverbial
sources.list for future updates. Don't bother me with the housekeeping.

Then add central indexing ('apt-get install widget' is nicer) and a web of
trust ("1234 users have installed packages by this author"); package-signing
looks about alright already (except: no, don't make me run gpg).

This can (will and does) co-exist happily alongside the centralized
repositories. Someone just needs to implement it and push it through the
glacial Debian processes.

And while we're at it, there's no reason 'apt-get install github://foobar'
can't be made to work.

tl;dr: apt needs to absorb Homebrew.
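As a sketch of the bookkeeping such a command would have to do: here is how
the sources.list entry could be derived from the package URL in the example
above. The `<base>/<suite>/<package>-<version>` URL layout is an assumption
taken from that example; nothing in apt works this way today.

```shell
# Hypothetical: derive the sources.list line that
# "apt-get install https://foobar.com/debian/squeeze/widget-1.0"
# would need to register for future updates.
url="https://foobar.com/debian/squeeze/widget-1.0"
base="${url%/*/*}"       # strip "/<suite>/<package>-<version>"
rest="${url#"$base"/}"   # "squeeze/widget-1.0"
suite="${rest%%/*}"      # "squeeze"
entry="deb $base $suite main"
echo "$entry"
```

The housekeeping the comment asks for is then just appending that line to a
file the user never has to see.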

~~~
icebraining
But what's the problem with sources.list, as long as the user doesn't have to
manage or even know that it exists?

Provide a .deb from your website, use the install script to add your
repository to the sources.list. There, the user doesn't have to know or care
about that file. And this is all possible - no, easy - to do right now.
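The "install script" part can be as small as a maintainer script inside the
.deb that drops a file into /etc/apt/sources.list.d/. A minimal sketch (the
repository URL and file names are made up for illustration; a real vendor
package should also install a signing key so apt can verify the repo):

```shell
# Write out a hypothetical postinst script such as a vendor .deb might ship;
# it registers the vendor's repository for future updates.
cat > example-postinst <<'EOF'
#!/bin/sh
set -e
echo "deb https://foobar.com/debian stable main" \
    > /etc/apt/sources.list.d/foobar.list
EOF
chmod +x example-postinst
cat example-postinst
```

This is exactly the pattern several third-party vendors already use: the first
install is a .deb download, and every update after that flows through apt.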

 _This is how it must work:_

I disagree; having to run apt-get is too cumbersome for a regular user. But a
better way _already exists_ in the form of apt-url. Click a link, and the
package is downloaded and installed automatically.

 _This can (will and does) co-exist happily alongside the centralized
repositories. Someone just needs to implement it and push it through the
glacial Debian processes._

But my point is that most of the infrastructure (support for multiple
repositories, one-click installation of third-party packages) already exists.
That's why I don't agree that this is the problem with Linux.

 _And while we're at it, there's no reason 'apt-get install github://foobar'
can't be made to work._

I don't see how - there's no standard on GH projects for installation; some
projects are installed by simple make/make install, others with
easy_install/gems/npm, etc.

~~~
moe
_But what's the problem with sources.list, as long as the user doesn't have to
manage or even know that it exists?_

You are right. My point was that I (the user) shouldn't need to know that a
file by this name exists. Of course, Debian is free to maintain the
bookkeeping in any way appropriate.

 _I don't see how - there's no standard on GH projects for installation; some
projects are installed by simple make/make install, others with
easy_install/gems/npm, etc._

People would check in either the .deb or the debian/ meta-files. The latter
would be preferable, but the deb package-building process needs to be
simplified before developers will consider participating.

The current deb building procedure is a trainwreck, which is in fact a related
problem. Homebrew recipes and gentoo ebuilds are trivial in comparison,
there's no reason the deb process needs to be as convoluted as it is.
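As a point of comparison for how simple the mechanics could be: a bare .deb
can be produced with nothing more than `dpkg-deb`. This is only a sketch (the
package name and contents are invented) and it skips everything Debian policy
actually requires, which is precisely the part that makes the official
process heavy:

```shell
# Build a minimal throwaway .deb without debhelper/pbuilder at all.
mkdir -p widget-1.0/DEBIAN widget-1.0/usr/bin
cat > widget-1.0/DEBIAN/control <<'EOF'
Package: widget
Version: 1.0
Architecture: all
Maintainer: Example Dev <dev@example.com>
Description: hypothetical demo package
EOF
printf '#!/bin/sh\necho "widget 1.0"\n' > widget-1.0/usr/bin/widget
chmod +x widget-1.0/usr/bin/widget
# Only attempt the build where dpkg-deb exists (i.e. Debian-family systems).
command -v dpkg-deb >/dev/null \
    && dpkg-deb --build widget-1.0 widget_1.0_all.deb \
    || echo "dpkg-deb not available; skipping the actual build"
```

The gap between this five-line control file and a policy-compliant source
package (debian/rules, changelog, copyright, lintian, pbuilder chroots) is the
trainwreck being complained about.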

------
Edootjuh
I disagree. The author seems to forget that packages are only a layer of
added convenience for installing new software. Installing new software from
source, or from binaries distributed by the author, is still as free as ever.

I admit that I do install about 90% of my programs as packages, but the
problem of central authorities being responsible for patching and distributing
software, and taking too long to do it, isn't present in every distribution.
I've used Arch Linux for years now, and it solves this problem by separating
packages into an 'official' channel, where reliable maintainers test and
release new versions through the package system, and a user repository where
anyone can add packages.

To me, this seems like the optimal solution. Community-maintained packages can
be promoted to official ones, and from what I can see new versions move out of
testing within days. And if you're not satisfied with how others maintain the
packages, building them yourself from the newest versions is almost as easy as
installing binaries from the repository, because anyone can use the build and
packaging scripts used by the maintainers themselves.

------
wazoox
There are "core" Linux distros, like Slackware. And there are many attempts at
autonomous app distribution, like Zero Install [1] and OpenPKG [2], that
mostly work and would probably work fine with a reasonable effort backing
them.

[1]: <http://0install.net/> [2]: <http://www.openpkg.org/>

Of course the problem is that the big distros (redhat, debian) can't be
bothered to care.

~~~
sirclueless
And in fact, with their sysadmin focus, they are explicitly opposed to this
method. If your goal isn't a desktop experience but rather protection from
every adversary at all times, then vetting only a small core and letting
people run various and sundry packages from third parties is flat-out
irresponsible. But of course that is exactly what desktop users want: to run
any software they like, with no hoops to jump through or update schedules to
keep on top of. So the two camps will always be at odds.

------
keithpeter
Yes, I see the point being made in the original article. I tend to use LTS
Ubuntu and conservative distributions (currently PUIAS) and so see a slower
rate of change.

Another Hacker News thread is discussing the new release of Audacity.

<http://news.ycombinator.com/item?id=3714766>

and it occurred to me that I would like to try to compile a _statically
linked_ build of Audacity that could work on any version of GNU/Linux from
Ubuntu 12.04 down to (say) CentOS 5.7. Just a big binary blob that I could
copy and run.

How would I find out how to do this? I've compiled little things before (the
dwm window manager, qalculate).

------
alexchamberlain
It really frustrates me that authors are not responsible for compiling and
distributing their own software; only when they are will we have software
upgraded regularly and quickly.

~~~
kklimonda
Which distribution should they compile and distribute (sigh) for? Not to
mention basic problems like "should I integrate with GNOME, Unity, KDE or
nothing at all?"

~~~
drdaeman
There are only two popular package formats: deb and rpm.

I don't have any experience with RPM, but to build a package for a reasonably
big part of the Debian-based world, you have to set up a build system
(pbuilder/cowbuilder) and tell it something like `for DIST in lenny squeeze
wheezy sid lucid maverick natty oneiric pangolin; do git-buildpackage ...;
done`

The problem is, to get it right one has to find and read _TONS_ of
documentation.

~~~
Shank
It'd be a lot more popular if it were as easy as the Android APK export wizard.

My 2c: Improve the tools and the developers will follow.

------
__alexs
This is basically why Ubuntu has PPAs and what they are trying to turn
Software Center into right?

------
richardk
I've been happily using the stable branch of Debian for about 4 years now.
Whilst I agree that bureaucracy and politics should be avoided where possible,
it seems to me that handing everything off to a third party in order to focus
on the "core packages" harms overall system quality.

Time and again I hear people in the lab complain about all these bugs in their
applications; running Debian, I can honestly say I don't have this problem.
Of course, I don't have the latest software either, but for me that's the
price I pay for a stable system.

------
freshhawk
Most of the complaints here don't even come with suggestions; they boil down
to "dependencies in software are hard to manage". Well, no shit. How about
some interesting ideas around this from the "hackers"?

The ideas that are proposed are mostly things that have already been tried and
have failed, either for social/manpower reasons (they would take enormous
amounts of effort and time from all involved for little benefit) or technical
ones (they don't work).

A large percentage complain that the distro in question updates _too_
frequently, when there are clearly distros that cater to stability (they just
aren't the "cool" ones).

Some of the complaints are demanding some mythical OS that allows you to
install it once, never have it update or change, but still have access to all
the newest software. That would be wonderful. No one has figured that out yet:
not Windows, not Mac, not any *nix.

I know bitchy comments aren't helpful, or likely to be well received, but
there must be some other people of my ilk still on HN. Now that it's Product
Guy News, where should I be going? Where is this story posted with people
actually talking about interesting ways to improve things who actually
understand the problem and could be called hackers without the technically
skilled people laughing?

------
zackmorris
Part 2, where he discusses his solutions, is more enlightening I think.

He's dead-on about the intractable problem of package updates affecting other
packages. Having everything sandboxed, with a general permissions system for
directories instead of per-file, is also better (this is how MacOS wanted to
work before OS X). A free and open mesh network with reputation-based security
is also the future.

But hey, Linux is wide open; if these are the changes that are needed, we will
see them.

------
pavanky
Isn't the problem that the distributions are downstream of the apps and
libraries they ship?

Quoting the Android and iOS ecosystems is all well and good, but in those
ecosystems the OS comes first and the apps are developed downstream. You
simply can't have that in a Linux ecosystem.

------
cturner
Why do Linux distros keep trying to recreate the gaudy, confusing experiences
of the proprietary systems? I want my drivers to work, and a window manager
that wraps the file system and allows interaction with the files. In an ideal
world, someone would restructure the four bin directories so that system stuff
was in one area and user stuff somewhere else, so we could easily open what we
want by navigating the tree.

------
danbmil99
Just for another POV -- I love the fact that I can apt-get a version of
practically any FOSS project of note, and within a minute or two, I have
something that works with the rest of my system. If I need bleeding-edge, I go
to the project page and download a later binary or source, but 90% of the time
that's not necessary.

For me, it's a perfect combination of a vetted ecosystem (ok, somewhat closed
but closed in the way I like -- no crapware, all legit source distros and
mostly mature projects) with the ability to go outside that system at any
time, at my own risk.

Anyone can set up a repository to add to, or compete with, Canonical's, and of
course they do. So with my Willow Garage repos, I can keep up with their
concept of what's stable, etc. It works nearly perfectly, IMSHO.

------
jiggy2011
Isn't the real issue simply that the desktop distributors lack the manpower
required to put together a stable desktop release, partly due to the massive
fragmentation?

I assume MS and Apple have huge teams dedicated simply to making sure that all
of this software works together nicely.

If there were a desktop distro that cost $100 a throw, and that money were
re-invested in testing the desktop platform more thoroughly and making it
backwards- and future-proof, wouldn't that solve many problems?

------
Abomonog
Sounds like he wants a distro agnostic Ports system, one that hides the dirty
work of watching compile time crap fly across the screen and just gives the
user a suitable package for their distro.

Either that or he's essentially suggesting we move to statically compiled
packages, which, while tremendously inefficient from a space and security
standpoint, would alleviate at least some of the headaches of trying to do
cross-distro binary offerings.

------
VMG
Maybe Android will conquer the desktop then?

All it needs is mouse and keyboard support in the interface and higher-
resolution apps, which will come for tablets anyway.

~~~
bergie
Android has pretty good mouse and keyboard support already. Try one of the
Asus Transformers in laptop mode, for instance. Keyboard shortcuts, mouse
cursor, two-finger scrolling, all works as you would expect on a desktop.

High-resolution apps are still somewhat lacking because the larger tablets
haven't sold that well, but I hope this will improve.

------
yaix
Good point. But most of these apps are written because they can make money for
the author. There just isn't a large enough user base on the Linux desktop to
make that money, and hence far fewer people are willing to invest the time to
write apps. And the much smaller user base is split up between GNOME and KDE,
and now Unity and Xfce and so on. So there is even less incentive to write
apps.

~~~
tikhonj
A program that works in KDE will work in GNOME and Unity and Xfce, so that
divide is not strictly relevant. In KDE, at the very least, even GTK programs
look very good. And, moreover, those programs would also work on Windows and
OS X.

------
jebblue
Linux is being used by more developers than I've ever seen, and even regular
people are finding out about it and trying it out.

------
snambi
The author mostly talks about maintaining packages. Some people say that it is
the missing games and other desktop applications. However, we all know that
Linux is the best OS for development. It has the best set of libraries,
languages and utilities. Yet why are there not many cool desktop applications?

I believe the answer is that Linux doesn't have an IDE for desktop
development. Look at Windows: they have the .NET platform and an IDE. Look at
the Mac: they have Xcode. And Android uses Eclipse as its IDE.

Most application developers want an IDE, which is what is lacking on Linux. I
hope someone makes an IDE for Linux desktop development.

~~~
rurounijones
Qt Creator for cross-platform work. KDevelop for KDE-focused stuff. Not sure
what for GNOME.

What is wrong with the above?

~~~
snambi
I didn't know about Qt Creator; it looks really good for application
development. A lot of people are using Ubuntu for their desktop, including me
and my friends. But Ubuntu's landing page doesn't have a "develop" tab leading
users toward developing applications for Ubuntu. So my impression was that
Linux is a great OS with the libraries and frameworks necessary for system
development, but that it didn't have the tools to build applications.
Apparently it does. Linux distributions should encourage users to write
applications, and the best way to do that would be to have the information
right on their landing page.

------
netvarun
Part II of Ingo Molnar's blog post:
<http://news.ycombinator.com/item?id=3719719>

------
agentgt
I have to wonder if people's expectations are just a little too high for
something that is free. I think a large number of the people complaining about
Linux did not use it when it really, really sucked (back when Slackware and
Red Hat were the only options).

Linux is a hacker's operating system. When people try to make it into a
desktop operating system, it starts to suck (GNOME 3).

------
mark_integerdsv
Honestly, Linux is a lot like the tiptronic setting on my car's gearbox. I
have always thought it's a cool feature and I'm glad it's there, but really...
I never use it 'cos it's kind of fiddly and dumb.

------
ObnoxiousJul
What I like about these posts is the interesting discussion that comes with
the article. It seems attention whores and trolls have not yet reached G+; it
looks pretty sane for now.

Religious mode on: please, G+, somehow try to build a good karma system to
keep the NSR low.

