What killed the Linux desktop (tirania.org)
344 points by foolano on Aug 29, 2012 | 378 comments

This is the money quote:

> The second dimension to the problem is that no two Linux distributions agreed on which core components the system should use.

Linux on the desktop suffered from a lack of coherent, strategic vision, consistency and philosophy. Every engineer I know likes to do things a particular way. They also have a distorted view of the level of customization that people want and need.

I like OSX. Out of the box it's fine. That's what I want. I don't want to dick around with window managers or the like. Some do, and that's fine, but almost no one really does.

Whereas Windows and OSX can (and do) dictate a top-down vision for the desktop experience, Linux can't do this. Or maybe there's been no one with the drive, conviction and gravitas to pull it off? Who knows? Whatever the case, this really matters for a desktop experience.

I have two monitors on my Linux desktop. A month ago full-screen video stopped working. Or I guess I should say it moved to the center of the two screens, so it's unusable. I have no idea why. It could be an update gone awry. It could be corp-specific modifications. It could be anything. But the point is: I don't care what the problem is, I just want it to work. In this regard, both Windows and OSX just work. In many others too.

I can't describe to you how much torture it always seems to be to get anything desktop-related to work on Linux. I loathe it with a passion. I've long since given up any idea that Linux will ever get anywhere on the desktop. It won't. That takes a top-down approach, the kind of thing anarchies can't deliver.

> I have two monitors on my Linux desktop. A month ago full screen on video stopped working. [...] In this regard, both Windows and OSX just work.

Just because it's never happened to you on OS X or Windows doesn't mean it doesn't happen. OS X 10.6.7 broke the output on my 13" Macbook for either of the two external displays I own. Both worked fine previously, when booted from the install CDs, or from Linux on the same machine.

Plugging in my Firewire audio interface on the same machine spun the CPU up to 100% and kept it pegged there. A lot of good having a nice mic pre-amp does when you get a high pitched fan whir added gratis to all of your recordings.

It's silly to pretend that Mac is somehow perfect in these matters. In my experience it's only been marginally better than Linux, if at all. And with Linux you have some hope of finding a solution, whereas for OS X you're pretty much hosed.

> It's silly to pretend that Mac is somehow perfect in these matters.

Straw man. Nobody said OS X was perfect.

> In my experience it's only been marginally better than Linux, if at all.

I used Linux on my desktop for several years and have now used Macs as well for several years. I won't say this is bullshit because I don't think you are lying about your experience, but I think you are extrapolating way too far.

> And with Linux you have some hope of finding a solution, whereas for OS X you're pretty much hosed.

Forums and mailing lists are "hope for a solution" while the genius bar is "pretty much hosed"? How on earth did you arrive at this conclusion? That just doesn't seem reasonable.

I don't see any material difference between "hardware compatibility just works" and "hardware compatibility is perfect".

As for fixing the issue, both of my issues were kernel issues, as are most hardware compatibility issues. On OS X reverting to a previously working kernel would have meant backing up all of my data, reinstalling the OS from the DVDs, installing the combo update to the last working version of the complete OS (10.6.6), restoring my data (but not using Time Machine) and then never allowing another OS upgrade, including not updating the apps that come with the OS.

On Linux it would have been a matter of downloading the previous kernel package and installing it. That's it. The package managers are usually smart enough to figure out that you've forced a previous version of the package and won't replace it in future updates.

Neither solution would have been doable for a novice, but the Linux solution is far easier for an expert and a generally more palatable solution. And note, this was Apple's OS breaking Apple's hardware.
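The Linux-side fix described above can be sketched roughly as follows. This is an illustrative Debian/Ubuntu-style example only; the kernel version string is hypothetical, so list what your system actually has before copying anything:

```shell
# List the kernel packages actually installed on this machine:
dpkg -l 'linux-image-*'

# Install (or reinstall) the previous, known-good kernel package.
# Kernels install side by side, so the old one stays bootable from GRUB:
sudo apt-get install linux-image-3.2.0-29-generic

# Hold the package so future updates never remove or replace it:
sudo apt-mark hold linux-image-3.2.0-29-generic
```

These commands require root and touch the boot configuration, so treat them as a sketch of the workflow rather than a recipe; the exact package names vary by distribution.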

I still hold that Linux has a lot of "usability." It is extremely useful, extending itself into every possible use I can think of. It is an extremely useful tool.

That said, maybe Linux in its current stage isn't destined for the pick-up-and-use approach of Windows or OS X. I believe that Linux is not following the same path as Windows or OS X, and because of that it may not end up at the same destination.

I agree with the original article; Linux is produced differently, often with different goals than Windows or OS X. Trying to shoehorn it onto the same path may not prove fruitful until someone comes along with the drive to steer everything onto the design path that Windows and OS X are taking.

Since you mentioned using Time Machine, why not revert to a good state using that (instead of a re-install)?

I really don't want to have to think about the kernel when I'm plugging in a new monitor.

Neither does anyone else, but when faced with a kernel issue that's what you're stuck with, regardless of which OS you use.

The straw-man charge doesn't hold: Mac being only marginally better is a valid point, because the GP builds his argument on Mac being better.

Genius bar solution: Wipe device.

Every time.

That's the genius of it.

Really? That reminds me of Microsoft technical support circa 1993:

Step 1: have you tried rebooting the machine?

Step 2: have you tried reinstalling Windows?

I thought those days were over.

To stress it again: OS X has the poorest third-party hardware support, even compared to Linux. Just "think different" and buy some hardware and you'll see what I mean.

Oh well, now that you've said it the argument is settled. You can't just say OS X has the poorest hardware support ever and not back it up with anything except "go buy some hardware and see what works and what doesn't". Is there maybe a table online that lists these incompatibilities you can link to? Please don't say "do some research", because the onus is on you to prove yourself right, not on me to prove you right.

You also said "even compared to Linux". That's interesting. That statement there gives the impression that you're more interested in defending your own beliefs and/or choices rather than being truly interested in explaining to us what OS has objectively terrible hardware support. "even compared to Linux" sounds like something an apologist would say. Then when you added in the "think different" line you made it seem even more like your comment was based off some kind of blind loyalty to Linux rather than loyalty to facts.

Me? I've used Mac, a handful of Linux distros, and Windows for a long time. I don't know which has the best or worse hardware support but I do know when someone says something based on what camp they're in rather than what the facts are.

> Then when you added in the "think different" line you made it seem even more like your comment was based off some kind of blind loyalty to Linux rather than loyalty to facts.

Don't read too far into things, and "...loyalty to facts" eye roll.

Anyway, it's pretty common knowledge to Linux users that hardware/driver support on Linux can be a challenge. Sometimes things work right out of the box; other times you have to do a lot of work and a lot of Googling.

That being said, I don't agree with the parent. For basic hardware needs, Windows and OSX have a fairly high success rate of plug-and-play functionality. For me on Linux, it's about 50/50.

On Windows, a fairly high success rate? This must be a joke. Try installing a graphics card without a driver and you'll see how well it's supported under Windows. I consider that basic functionality. The same goes for wifi dongles: they usually need drivers to be installed on Windows, while this is all taken care of at the kernel level in Linux.

In Linux you very rarely have to install any driver. True, some hardware remains unsupported, but the list of compatible hardware that requires no installation is pretty long. On Windows, you'll need to install something for most new hardware you plug in, no matter what.

The fact that you have to install some driver is irrelevant. I'd rather install drivers for Linux than be unable to use the hardware at all.

I've installed both GTX 580s and 570s with absolutely no problems.

OSX was the first system where a printer worked without even a popup saying that "drivers were being installed" (and I'm not even the biggest Mac fan; I'm typing this on Windows 7...)

The real difference is that on OSX, if something doesn't work, it doesn't work. Under Linux it may work, or work badly, or you can probably hack it to make it work well. I used to think I liked the latter, but there's something appealing about the binary clarity of OSX.

Thank you, that's exactly how I think; it's even this way with Windows. On other systems I would waste day after day of my life setting things up 'just the way I want', only to find that it's never gonna be quite that way. Macs just let me accept things as they are and get back to work. Very often, after a while, I'd come to see why things work the way they do (say, window management).

Anyway, this was almost 100% true from 10.3 through 10.6.x. Recent developments on both Apple's and MS's side have closed the gap between the two systems quite a bit. I still prefer OSX, but it's gotten close. The real reason I continue to buy Macs nowadays is the hardware quality.

>On other systems I would waste day after day of my life setting things up 'just the way I want', only to find that it's never gonna be quite that way. Macs just let me accept things as they are and get back to work.

This. It's almost as if Linux is a collection of parts and tools from which a sufficiently good designer can build a usable desktop whereas OS X is a usable desktop. That's an exaggeration, but for me there is some truth to it.

Note that Linux was my only desktop from 1997 through 2009.

OSX seems to have had this design paradigm: 'Make everything work in the most streamlined way possible for the default use case (80%). Give some limited options for a few more specific use cases (18%). Cover everything else with CLI integration / defaults system, such that the final 2% can get around with a bit of googling'

This has been highly successful; only I'm afraid that the Unix people inside Apple are losing influence. The sandboxing in ML looks like a mess to me. Not that it's a bad idea per se, but look what it did to our beautiful '~/Library/[ApplicationSupport|Preferences]'! It's tacked on, and it's obvious that this was a political decision from management rather than engineering. Makes me angry.

On a different note: I've never really used it extensively because I switched to Macs in '05, but wasn't Ubuntu pretty close before they switched to Unity? I remember that the last time I installed it, for the first time I had a Linux that worked out of the box, including network and graphics drivers.

I've actually switched to doing my work on Ubuntu (using xmonad - I can't stand Unity) and iOS (iPad as an SSH client), as OSX only cares about the 80% use case now.

Which is fine, it's Apple's decision to take, but it's no longer for me.

(As for Ubuntu - I'm using pretty standard hardware and never had a problem)

Part of it is you won't spend as much time thinking about it. Say you have a wireless keyboard that is a little dodgy on Linux, but works most of the time. But every now and then it fucks up and really pisses you off. If it never worked on OS X you wouldn't even remember it, because you would have sent it back or resold it.

Printer drivers seem fine. Scanner drivers seem fine. USB mice and keyboards work fine. Most Apple products can't really be expanded in many other ways.

I'd say the hardware support is fine for what little customisability Apple allows on the hardware that they sell. Perhaps the Mac Pro might have some issues with internal cards (I honestly wouldn't know ...), but those devices are not affordable for 99% of the people.

Wait what? Every scanner, printer, tablet, mouse, and even color correction device I've bought has worked on my Mac.

Then you've been selective about what you've bought. It's trivial to find devices that are Windows-specific and/or have only gotten Linux and/or OS X support after extensive rounds of reverse engineering.

Low end printers and scanners in particular are notorious for skimping on the logic in the device itself and depending on a large blob of Windows driver code, with manufacturers that couldn't care less about supporting anything other than Windows.

I have no idea where this sentiment is coming from. I'm typing this from an MBP with a das keyboard, navigating with a razer mouse and viewing with a samsung monitor, everything works fine.

All keyboards are the same, all mice are the same, spectral analyzers are not the same, software-defined radios are not the same, USB-Serial adapters are not the same, FPGA interfaces are not the same, sensor boards are not the same, laser cutters are not the same, need I go on?

Hardware? What hardware? Frankly, I haven't bought anything over the last 10 years. I have a laptop, you see...

I bought a Samsung Galaxy S3 recently, plugged it in via USB and was not able to browse it because, as far as I understood, Mac OS X has no support for "media devices", whatever that is. I had to download some obscure Samsung "Kies" software, through which I was able to get to the filesystem and upload some files. Wasted hours.

Starting with Honeycomb, the storage partition is exposed through MTP (or PTP); this way it can be accessed concurrently by both Android and your computer. Before Honeycomb, said partition could be accessed as a USB mass-storage drive, but that required it to be unmounted, making all data unavailable to Android while it was connected to a computer.

In my experience only Windows 7 has passable MTP support. On Linux I had mixed results with gvfs and mtpfs (slow and crashy). jmtpfs [1] is a nice replacement, though. Google also released an application for OS X [2] but I can't try it since I don't own an Apple computer.

My solution to the problem was to install a WebDAV server (such as [3]) on my device (Galaxy Nexus) and access it wirelessly.

  [1]: http://research.jacquette.com/jmtpfs-exchanging-files-between-android-devices-and-linux/
  [2]: http://www.android.com/filetransfer/
  [3]: https://play.google.com/store/apps/details?id=com.theolivetree.webdavserver
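For the jmtpfs route mentioned above, the typical usage is a plain FUSE mount; a hedged sketch (the mount point is an arbitrary empty directory you choose, and the device must be plugged in and in MTP mode):

```shell
# Mount the device's MTP storage via FUSE with jmtpfs [1]:
mkdir -p ~/android
jmtpfs ~/android

# Browse and copy files with ordinary tools:
ls ~/android

# Unmount when finished:
fusermount -u ~/android
```

This obviously depends on having jmtpfs and FUSE installed and a device attached, so take it as a sketch of the workflow rather than something guaranteed to run unmodified.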

This is not an issue with Apple or Samsung, it's an Android decision.

Support for the USB Mass Storage protocol has been deprecated in Android in favor of MTP (Media Transfer Protocol) starting with Ice Cream Sandwich.

The experience in Windows and on Linux is just as poor with this phone. (I love the phone, though.)

Why can't Apple support MTP? Linux gets cited for lack of hardware or protocol support (MTP to an Android device works fine in Linux, btw), but it's not Apple's fault when they can't or don't provide drivers for common, standardized consumer electronics?

It seems fairly obvious that the reason is that facts which do not fit the narrative are not talked about. Confirmation bias in action.

MTP does not work fine in Linux (specifically, Ubuntu 12.04). You can read the files but not write arbitrary files into arbitrary folders.

I haven't had any trouble using a Nexus 7 with Windows, it's worked like a charm. What's supposed to be the huge problem with MTP, exactly?

I take that back, my Galaxy S III does work as expected using MTP with Windows 7. Previously my workplace was using an older Windows.

Oh my

This is 100% to blame on Samsung, sorry.

My Samsung phone is seen as a USB device (and it works on Mac OS X), but maybe in the newer models they removed this functionality (and called it a feature).

People may complain that iPhones need iTunes, but then again it's iTunes, not the gigantic pile of crap that is Kies.

No, it has nothing to do with Samsung. In ICS Google changed how the partitions are handled: http://forum.xda-developers.com/showpost.php?p=11714077&...

It was a pure engineering tradeoff to improve things in the long term. If anything, OEMs would have preferred the old 'just works', sub-optimal, gives-an-excuse-to-obsolete-a-phone, fewer-support-issues solution. The ball is now in the operating systems' court to support this standard in a way it wasn't envisioned to work a few years ago.

Even Ubuntu has been slow on this front.

Nice explanation, if it's to facilitate upgrades I'm all for it!

That's the thing about Samsung. They copied the bad stuff as well: weird connectors, iTunes/Kies.

This is Samsung not following USB standards.

OS X has perfectly fine support for media devices, and has worked fine with every digital camera, SD card or other card reader, etc., that I've attached via USB in the past decade!

Actually it's lack of MTP support in OS X.


Wrong. It's Google's decision to abandon direct mounting Android drives in ICS in favor of the MTP standard.

Which is the right move in the long term, although it's certainly causing some pain in the short term.

Everything under the sun supports UMS; the fact that OSX does is nothing special. That's not what his device was using, though.

Laptops are not hardware now?


I've been using OS X going back to the betas. I've never seen OS X have problems with third party hardware that conformed to standards.

I have seen problems with third party hardware that required its own driver (poorly written generally) or that was half assed crap that was designed to work with windows but marketed as supporting standards like "USB".

Meanwhile, Linux has trouble dealing with the computer itself, let alone getting hardware attached to it. Constant pain in the ass.

Currently I'm in a country in a different hemisphere from the USA. I walked into a store and bought an off-the-shelf no-name-brand DisplayPort-to-HDMI adapter. Worked no problem.

I've gotten used to being able to do that... and haven't had much trouble getting things to work with OS X in years.

Course, I also stopped buying things that require custom drivers-- that's a clear sign you're not going to have a good time.

To claim that Linux is better in this regard is, quite frankly, asinine.

In my experience, Linux works great with standards-compliant hardware. Linux is typically the first platform with support for a new standard. The problem is there is so much hardware that is not standards-compliant.

Your comment looks about as asinine to me as the parent comment does to you.

Being the first platform to support it in the kernel doesn't help much when you then have to wait another few months (or, in the case of Debian, years) until the driver finally lands in your distribution. The alternative is compiling kernel modules yourself, which then get broken again whenever the distro updates the kernel without warning. And even to compile, you have to follow hints on several websites before you figure out how to build a module for the exact kernel version you have. (This is all from experience trying to get a Wacom tablet running. I gave up on it when it broke again after a distro update and the compile mechanism also stopped working for some reason. Now waiting for Debian Wheezy to save me.)
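For what it's worth, DKMS is designed for exactly this pain: it rebuilds a registered out-of-tree module automatically whenever the distro ships a new kernel. A hedged sketch, with a made-up module name, version, and source path (the source tree must contain a suitable dkms.conf):

```shell
# Install the DKMS framework (Debian/Ubuntu-style):
sudo apt-get install dkms

# Copy the driver source where DKMS expects it (path/name are illustrative):
sudo cp -r ~/src/wacom-0.16 /usr/src/wacom-0.16

# Register, build, and install the module for the running kernel;
# DKMS will rebuild it on future kernel updates:
sudo dkms add -m wacom -v 0.16
sudo dkms build -m wacom -v 0.16
sudo dkms install -m wacom -v 0.16
```

Whether a given driver ships a dkms.conf varies, so this is a sketch of the mechanism, not a claim that any particular Wacom package supports it.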

"Currently I'm in a country in a different hemisphere from the USA. I walked into a store and bought an off-the-shelf no-name-brand DisplayPort-to-HDMI adapter. Worked no problem."

That really has nothing to do with the OS you're running. The OS isn't driving the DisplayPort<->HDMI protocol conversion in any way (your 'no-name' brand adapter is probably relying on DP++ and just passing through the HDMI signal anyway).

This is like saying your VGA cable is compatible with Macs with a VGA port. Of course it is.

> I've never seen OS X have problems with third party hardware that conformed to standards.

The problem is that hardware often doesn't really conform to standards, even those it claims to.

One of the reasons device support in Linux (and other OSes which target a wide range of platforms) is difficult, is that it's necessary in practice to cope with and have workarounds for buggy and out-of-spec hardware and firmware. Just "coding to the standard" isn't good enough.

Such buggy hardware/firmware is rarely documented as such, and finding these problems and the appropriate way to handle them is painful and difficult work. In some cases the only practical way to figure out what actually works is to reverse-engineer what Windows does (the hardware manufacturers generally make sure that Windows works with their hardware, but rarely make such information public).

Apple's main goal is their own hardware, over which they obviously have a lot of control and information, so they really don't need to worry so much about this.

If the hardware conforms to standards, Linux is the most likely to be able to support it of all three - and that's even without having to install a driver.

>that was half assed crap that was designed to work with windows but marketed as supporting standards like "USB".

A device can follow the USB standards (and be labeled as such) while still requiring a special driver. The two are not mutually exclusive.

See, in my view, you don't get the point. Things DO break in OSX as well, just a bit less often. Apple has the liberty to control both the (primary) hardware and the OS which runs on it. Therefore it has an edge over Linux in terms of compatibility.

Having said that, barring a few "recent" desktop distros, as they call them, like Ubuntu, Linux has always been a bottom-up approach. You install the base, then the userland, and so on. For someone who is looking for ultimate customization, Linux makes sense.

But when I am running a notebook, I seriously want everything to work, albeit at a lower performance/customizability.

I always run either FreeBSD or Gentoo on my servers. But my notebook/desktop continues to be OSX.

I had a similar 100% CPU issue with an ExpressCard on a MBP... it was even labelled Mac Compatible.

Among non-ultra-geeky Linux users I know (mostly part-time users), "Linux" does seem to be reasonably standardized as far as desktop components go, because by "Linux" they mean "Ubuntu". It's possible the Ubuntu desktop dominance came a bit too late, though, and still isn't entirely complete.

My own experiences aren't great with Windows, though they're smoother on OSX than Linux. The problem I have with Windows is that stuff does break, but it's inscrutable how and what to do about it, whereas at least on Linux there is usually some way to do something about it if you're tech-savvy. If anything gets borked in Windows, it's wipe-and-reinstall time; back when I used it full-time I probably did that once a year on average.

I find OSX to be poor for package management, though. I've gotten so tired of headaches with installing and managing updates of stuff, and badly interacting upgrades of the OS and separately installed packages and system versions of python/ruby/etc., that I do most of my dev work in Debian in VirtualBox now. In that aspect, it's Debian that's centrally managed with real release-engineering and integration tests, whereas OSX is an anarchic mess of 3rd-party software that nobody's responsible for integrating and testing.

Windows is horrible to use; everything is poorly designed. Linux (Ubuntu in my case) and OSX mostly just work sanely, but OSX is a horrible dev environment, so I moved back to Linux.

I thought Windows was a joy to develop on until I moved to Linux... then I realised MS had to develop all these wonderful dev tools to disguise the fact that the APIs are disastrous and the command line is useless...

Amen. That's the thing. You don't realize how many hoops you've been jumping through to do the same things that devs on Linux or OS X do effortlessly until you make the switch. But people don't make the switch because they have this blind loyalty to whatever they're using. So whenever they see someone else loving a different OS, they dole out ad hominem attacks, call the products overpriced, or too complicated for normal users, or elitist. It all comes down to justifying one's decision to use X despite evidence that Y may be better or just as good, and possibly some jealousy in some cases.

In any case it's all dumb blind loyalty the vast majority of the time. And that applies to Linux, Mac, and Windows. You can't have a discussion about operating systems with 90%+ of people because everyone just defends their favorite and facts are distorted to suit those preferences. It really is just about useless to have a discussion about operating systems as "substantive" or "meaningful" discussions about the subject are about as rare as unicorns.

Actually, the OSX development environment is only horrible for cross-platform / open-source application developers.

But if you are an Apple pet developer, then you'd just use Xcode, Xcode projects, etc. for all your needs, and you'd be very happy with that.

As a cross-platform open-source developer who uses Mac OS X and Linux about equally (Mac OS X on the desktop, Linux on the laptop), I don't see what's so horrible about Mac OS X in this regard. At worst, it's the same. I don't prefer either for programming (I do prefer Mac OS X for other things, though).

Well, developing using the native toolchain is horrible, because of the legacy gcc 4.2 and the unstable and often broken clang. At least that was the case about a year back. No, scratch that: it is still the case. I just remembered helping a coworker hack some open-source package to make it compile with clang. We lost an hour or so, completely unnecessarily. It would have taken him 30 seconds to apt-get it on Linux.

And well, not using the native toolchain and using something like MacPorts feels like I'm under Cygwin all over again. No, thank you. I love the 13" Air, but as a development environment Macs are quite broken.

Plus that Home key on the full desktop keyboard... If you are a Linux+Mac user, you know what I'm talking about :)

Curious, what package management system are you using to install libraries/packages? Ports? Fink? Homebrew?

Okay, our use cases are very, very different. I mostly do Go now, and I used to do lots of operating system development and embedded work. For Go, your compiler complaint does not apply, and for embedded work you need to build or install a cross compiler anyway. There are some cross compilers you can get with apt-get, but in practice, for the work that I do, there are no distribution-provided tools, and you need to maintain your own toolkit.

The same applies to libraries: you can get everything you want with apt-get, as long as you only want things for your architecture. If you need them for another architecture, you are out of luck.

For operating systems you use virtual machines. I use the host just for running my text editor and for my Unix environment needs. All operating systems I've worked on (Solaris, Windows, BSD, QNX, some other real time embedded stuff) have their own tools and compilers used to build the system.

MacPorts, Fink and even Homebrew are FUBAR, I'm glad I don't have to use them.

>Windows is horrible to use, everything is poorly designed

Without qualifiers, that statement is meaningless. Perhaps you mean it's horrible for some devs? (Millions of others are doing just fine with Visual Studio).

For regular users, dropping them into Linux would be like replacing your mom's car's controls with the controls found in an airplane cockpit (the command line), or playing "who moved my cheese?" with the Linux desktop. Granted, everything is moving to the web these days, so why not just give them an iPad or a Chromebook?

Ubuntu works just fine for two kinds of users:

1.) Grandmothers and the like who just want to read their mail and check the news.

2.) People like me who can and will spend 4 + 2 + 1 + 0.5 hours customizing it over the course of 6-12 months to have an almost perfect working environment, instead of living with a slower OS without virtual desktops.

The only thing I can think of that needs tinkering is some online banks that require applets to work, but then again even Windows doesn't come with Java preinstalled.

I think you mean Debian.

I know lots of people who successfully installed Ubuntu that don't know what a boot sector is but are still happy with it. (After all most people use web apps anyway.)

I wasn't talking about Visual Studio; I mean the UX as a whole. It is hugely inconsistent, slow, and it's hard to get sane information about what's going on. Once you are in an application it's generally OK; it's just up to that application, other than things like where new windows are created, which are also broken. Oh, and focus and switching, as things are always getting hidden.

Application UX inconsistency is a Mac thing AFAIK. Yes, they all look 'pretty' and 'likable', but not consistent. Most decent Windows apps are modeled one after another and are very consistent. They have such things as status bars, sidebars, menus and list views, usually with standard controls, though times are changing with the advent of XAML and web-like interfaces.

And man, tell me: where should I have gone to get any information about why my _wired_ Ethernet connection wasn't working on a Mac when it worked everywhere else?

[UPD: I'd give iOS apps 2/5 on consistency scale, Android, Linux and OSX 3/5, Windows 4/5].

Lots of programmers were doing just fine with punched cards and teletypes. That doesn't mean that punched cards and teletypes were not horrible. People just learn to be comfortable with their tools.

I really haven't found that to be the case at all with package management on OS X. Homebrew is almost always a joy to use, and on the few occasions I've had build errors using it, I've found the Homebrew community connected to the official GitHub repo issues page to be one of the most responsive, friendly, and genuinely helpful open-source communities I've ever encountered. I agree things were a lot rockier back when the only package management choices were MacPorts or Fink, but Homebrew's a monumental step forward.

I can't say I've really touched Python at all, but as a relative newcomer to the Ruby world I found getting RVM set up and functioning properly on my Mac to be a complete non-issue.

Discovering Homebrew was wonderful after suffering the broken-by-design ports systems of Fink and MacPorts (a retrograde step from DarwinPorts). But it is not really strong competition for Debian's APT. For example, a day after first using Homebrew, I discovered that the Qt port simply didn't work. I fairly frequently have to install software by hand on OSX that has stable packages in Debian.

It's also true that Homebrew (and MacPorts, but not Fink) are doing something fundamentally less ambitious than the main Linux package managers: they offer a simple dependency system and scripting for compiling stuff, rather than directly installing the end product.

One significant difference between Linux and Windows/Mac is the planning around features and the outreach. I work for a large organisation. The number of times a Linux group has come round to our office to discuss what features are up-and-coming, or should be developed? None. The number of times Microsoft staff have done the same? Some.

Now I don't really expect linux volunteers to do this, but equally I don't really expect any linux distro to provide a coherent business desktop because they are not talking to their "customers" in the same way that their competition is. And the business desktop is a large market, if a bit unsexy.

There was always a dominant player, it seems. Most people used Red Hat (before Fedora) over Ubuntu.

>badly interacting upgrades of the OS and separately installed packages and system versions of python/ruby/etc., that I do most of my dev work in Debian in VirtualBox now

Well, sure, but (unless I am mistaken) that is no longer on the topic of the Linux desktop.

I'm experiencing similar frustrations after having returned to using Linux (Ubuntu) at work recently. I've spent a fair bit of time having to customise a window manager (awesomewm), just to get a somewhat usable desktop environment. I work at a university with centralised mail handled by Exchange and struggle every day with flaky, buggy mail clients Evolution/DavMail+Thunderbird. Under linux my macbook air only sleeps reliably half the time, and occasionally I find that unbeknownst to me some process has woken the laptop (with lid closed) in my bag, with the battery near critical. The latest apt-get update broke my sound. Empathy hangs for a good 2-3 minutes on load, but eventually recovers. Something mysteriously reverts my xmodmap keybinds mid-session. Other little things such as LightDM not starting until I hit enter at a black screen post boot contribute to an overall experience that feels frustratingly incomplete.

I know that many of these are trivial complaints and that all of these things can be fixed. I no longer want to have to do this however. In my teens I loved tinkering with the linux desktop, designing 'awesome' Enlightenment themes and enjoyed tracking down and fixing these little problems. Now in my 30s I really just want the OS to get out of the way and let me code. OSX does this well, linux really does not. I say this with a fair bit of sadness as there are many things that I like about linux a great deal - the cli, the excellent package management, the ethos behind the OS and the open source community.

I had nothing but trouble with Ubuntu when trying to use anything not tested by Canonical. I suggest using something like Arch Linux, where you're not forced to stick with buggy software for a 6-month cycle.

And I've been really, really impressed with the latest Thunderbird/Lightning releases (14-15). Haven't had any problems at all, like I had with the older releases from several years ago. I used to be a Mail guy, but Thunderbird blows it out of the water these days.

> Linux can't do this. Or maybe there's been no one with the drive, conviction and gravitas to pull it off? Who knows?

I say Mark Shuttleworth is doing this, and no one is paying attention. Unity was a unified vision; it was just different, and no one wanted different. And that is fine for the new target audience he wanted for the OS, which was everyone else in the world. Look at what they are doing now (Ubuntu on tablets, Ubuntu TV): they are pushing a unified vision, and nobody in the Linux community is getting behind it en masse because it isn't everything they wanted. They want Cinnamon, they don't like Gnome 3, where is my Gnome 2, etc. The inherent strife is the problem, not a lack of visionaries.

Yes but most of the people who are "leaving Ubuntu" are just switching the desktop environment or using a system downstream of Ubuntu. It makes for a lot of drama but it doesn't much hurt Ubuntu community support (which is their greatest strength, and why I use Ubuntu) because so many of the packages remain the same.

Two years ago, when it was experiencing a considerable upswing in use, Canonical was in the position to at least partly dictate some standards. And an Ubuntu install also would "just work" (and still does in my experience).

Unfortunately, I think they kind of snatched defeat from the jaws of victory by attempting to go to a "modern user interface". The big problem is that neither designers nor programmers really enjoy incremental progress (designers want a big canvas and programmers want clean code). Gnome 2 really was/is good enough for most people to use easily, AND it was/is simple and powerful enough not to stand in the way of the existing users (programmers, Linux geeks, Linus himself, etc.). Whether Unity and Gnome 3 work for average people or not, they certainly alienated the existing Linux desktop users, and that is an important segment.

Yes, yes, that's it right there: Unity and Gnome 3 are both terrible compared to Gnome 2. I wish the Gnome 2 team would re-materialize and come out with Gnome 4: just Gnome 2 with a higher version number.

Coming from the perspective of a developer:

I used linux as my only OS sans VMs from roughly 2008 till 2011. I finally gave up because it handled monitor switching, multiple monitors so poorly (using Nvidia GPU). I begrudgingly switched to windows because I owned the hardware already, and it was the only option. Windows sucks, the lack of a usable shell kills it, full stop, nothing else windows does matters. And no, cygwin doesn't cut it.

Fast forward to now, the only computer I manually interact with is a macbook air. The old desktop hardware became my file server(running FreeNAS) despite being many times faster than the air.

Through this experience I realized what I actually cared about... I run four 'apps' 99% of the time: Chrome, Sublime Text (everything non-Java), iTerm and IntelliJ (Java). That is really all I care about; anything that prevents interacting with these applications as quickly as possible fails. The support stuff (git, a database, a message queue, etc.) runs anywhere. I _rarely_ interact with the 'OS' in UI terms. I rarely use Finder, and I only use the Dock to empty the trash (I launch everything through Quicksilver). At the end of the day I love OS X not because it has 'awesome' UI paradigms, but because of all the options I see it the least; it may as well not be there.

OS X is the only OS I've ever used that I just don't think or care about. It sleeps, it wakes up, it deals with new screens, it does all the OS shit so I can just use the apps I want to. That's why I'm hooked on using it as a dev machine.

As for the "lack of a usable shell" part, have you looked at Powershell? It comes bundled with Windows, and has a lot of the features that are missing from the command prompt.

It does have some rough edges that may cause you to dismiss it as unusable, but it offers tab completion, colored text output, and a real scripting environment (not DOS batch scripting). Every serious Windows developer I've talked to nowadays uses it exclusively.

> As for the "lack of a usable shell" part, have you looked at Powershell?

No, and I have zero reason to do so. I don't deploy on Windows, and I have no reason to ever deploy on Windows, so why would I bother learning another syntax just to develop on Windows? It really doesn't make any sense. PowerShell only makes sense for IT admins who have to manage Windows networks; it makes no sense at all for developers deploying on *nix.

Your needs are strikingly similar to mine, down to the choice of apps and launcher. What I noticed after a week of full time Gnome3 Shell use was that it was similar enough to OSX (but with the niceties of a proper window manager) that I'm finding myself hooked. Makes my ThinkPad a reasonable facsimile of my MacBook from a keyboard-nav centric, "keep out of my way" perspective. I love OSX on the MacBook, but hate it when docked. Gnome Shell gives me a reasonable OSX-like Linux experience on the laptop, but with a great docked experience to match.

Can't speak to the seamless suspend/resume support, etc on your choice of hardware, and I acknowledge that if that kind of stuff doesn't work, it'd be a dealbreaker.

Just saying, Gnome3/Shell is at least worth checking out if you like the OSX workflow and use just a few apps day to day for development.

Thanks for the writeup of your experience!

"using Nvidia GPU"

Yup, still no xrandr support which makes any on-the-fly monitor config changes very difficult or impossible.

Sometimes, if you use twinview and nvidia-settings from the start and if the moon is aligned just right, you can dynamically configure monitors / video ports. Sometimes.

I usually have firefox, a terminal, sublime text, ftp client, irc client and a file browser open. All I use is openbox for a windows manager, tint2 for a task manager and dmenu for a launcher. I very rarely have problems with my linux computer.

Linux on the desktop suffered from weak developer and industry support. Until just a few years ago, Linux sound-system latency was as "powerful" as Windows 3.1's. And using proprietary drivers for X, or even worse for the kernel, is a mess.

So obvious... Linus promised Linux on the desktop around 2000. The reality was different: influential groups just ignored the desktop. Now it's too late; the web is becoming the predominant platform, and operating systems are just a commodity. For me there's no big difference between using Linux and OS X (with coreutils etc. installed). In fact I wouldn't even mind working on Windows; unfortunately I don't have the patience to set up a reasonable Unix-like dev environment there.

The irony is: it has never been less painful to switch to Linux.

> The irony is: it has never been less painful to switch to Linux.

This is so true. Linux has a bad rap. I gave up on it a couple of years ago due to the infamous "hidden cost of linux". I switched back a couple months ago and have been pleasantly surprised how smooth things are now. A lot has improved in the last couple years in Linux land.

>have been pleasantly surprised how smooth things are now

Which version of which distro?

I have two monitors on my Linux desktop. A month ago full screen on video stopped working. Or I guess I should say it moved to the center of the two screens so is unusable. I have no idea why.

Is this in Chrome on sites like YouTube? If so, I believe it started with Chrome 20 when Chrome switched to its native Flash driver (see http://productforums.google.com/forum/#!topic/chrome/Mi-YgjN...). This happened to me too, and I still haven't resolved it.

If that were the only problem, though, we would have expected FreeBSD to overtake the desktop. The larger issue, unfortunately, is just that people are afraid to switch from Windows to anything other than Mac. Android is helping there, but it will take time.

Never underestimate the fear of the consumer that they will break stuff on a computer they are unfamiliar with. The fact that Windows is actually getting better may actually be a good thing regarding other possibilities of desktop software.

As a FreeBSD user since 2007 I found the desktop options to be just as clunky as Linux's, if not more so. It was the only part of the OS that didn't just work when I used it in my travels, so it's stayed on server duty. If there were an open-source window-management system on par with the Mac's... whole new world. (There might be; I haven't looked since 2009 or so.)

"Every engineer I know... They also have a distorted view..."

cletus you are one of the few people with enough karma to always grab and hold a top post spot who _actually says things worth saying_.

Right on.

As for "someone with the gravitas", never say never. But they might not use the Linux kernel and GNU as their "clay" or "base material".

I've thought about such a person, and one thing I believe is that they would be foolish to try to "sell" such a simple-to-use system to Linux users (and many of those users would, alas, include engineers like the ones you describe: set in their ways), or to those engineers who worship Apple, with their belief in some mythical "user experience" [translation: _their_ experience].

In my opinion, a person with the gravitas would also need the vision to see that the target user base that is open to change lies elsewhere. The users need to come with an open mind.

I don't know what will happen to Linux. But the binary blob problem keeps getting worse. Too many Linux users are happy to accept the blobs. They don't want to read source code. They just want working devices. That's understandable. But in my opinion it's not a small problem in the long run.

Apple, on the other hand, is flat out abusing its power. Not only over end users but over developers (who are of course just a particular class of end users). They are standing in the way of people's general education about computers. Keeping everyone dumb may give some developers a warm, fuzzy feeling as they watch their bank accounts grow to new levels, or dream of it, but I do think Apple's conduct, seen for what it is, will trickle down past just us fanatics on HN and elsewhere on the web. People are going to figure it out.

If I was a monopoly-lover, if I loved to see "winner-take-all" in IT, as if that improved the lot for any of us (I am not and it doesn't), then my money would go on Amazon. They seem to be making the fewest mistakes. Great things will come from AWS. It is the democratisation of hosting - a sharing of power, risks be damned (unfortunately, it may also mean the monopolisation of hosting as we've never before seen). In "Dropbox", we are only seeing the very beginning of what's possible.

In my view, Apple is a "disabler" (everything they introduce is restricted) while Amazon is an "enabler" (generally: they open many more doors than they close).

> I just want it to work...

We must be doing different things, because I've been on Linux desktops with zero-to-minimal effort since 1999.

Agreed that things should just work. In case you're still looking for a fix for the dual monitor full screen issue. I had the same problem on Chrome. It was when they introduced built-in flash. You need to disable Chrome's built-in flash and everything is back to normal.

Maybe by designing a set of base interfaces, with extensions (à la OpenGL, W3C)?

relevant (and funny cause it's true): http://xkcd.com/963/

Nvidia card?

I think some of this is perceptive. It's true that the attempt by both Canonical (Unity) and Red Hat (Gnome 3) to sort-of-incompatibly break away from the so-close-to-standard-that-it-hurts-to-type-this Gnome 2 environment did a lot more harm than good, at least as far as platform adoption goes.

And clearly OS X is an extremely polished Unix and is going to appeal to the more UI-focused of the hacker set. And Miguel is definitely among the most UI-focused of the hacker set. He's also an inveterate "platform fan". Much of his early work was chasing Microsoft products and technologies, of course; now he's an iPhone nut apparently, and that doesn't really surprise me.

But at the same time the Linux desktop was never really in the game. I use it (Gnome 3 currently) and prefer it. Lots of others do. For many, it really does just work better. But in a world where super-polished products are the norm, a hacker-focused suite of software isn't ever going to amount to more than a curiosity. (And again, I say this as someone who will likely never work in a Windows or OS X desktop.)

So in that light, I think the idea that the Linux desktop got "killed" is sort of missing the point. It's no more moribund now than it was before. It's more fractured in a sense, as the "Gnome" side of the previous desktop war has split into 3+ camps (Unity, Gnome 3, and Gnome 2/Xfce, though there are other splinter camps like Mint/Cinnamon too). But it's here and it works, and it's not going anywhere. Try it!

So in that light, I think the idea that the Linux desktop got "killed" is sort of missing the point. It's no more moribund now than it was before.

I strongly disagree. It is losing exactly the sort of person that the author is: developers who, all else equal, would rather use Linux. But who eventually get tired of the BS and just want something that works and you can actually get software for. I have a lot of sympathy for that.

I write this from a Linux laptop, but that's more out of mulish stubbornness and 15 years of accumulated irritation with Steve Jobs and his dickish business practices.

The last time I got a new laptop I knew I didn't have time to screw around for days with X configuration files. And so I paid a vendor several hundred dollars over list to give me a laptop that JFW. And despite that the sound is still way too quiet. After a few time-boxed 2-hour excursions into whatever sound system they're using this week, I still can't fix it. I've given up.

The only legitimate reason I have for staying on desktop Linux is that I code for Linux servers, and I think it's impossible to really understand system performance if you're not running the same OS. But even that seems shaky to me; hardware keeps getting cheaper and developers keep getting more expensive, so it just doesn't matter as much.

One day some bright Linux spark is going to "innovate" again in a way that I'm expected to put up with their rough edges for 6 months (hello, Unity!) and I'm going to say fuck it and buy a Mac because I just don't have time to screw around right then. Or maybe I'll just want to watch a Netflix movie without hassle, or play the video game my pals are all talking about. And maybe by then it will be a fancy dock for my 8-core Android phone.

Overall, I agree with his point, except that I don't think Linux on the desktop is so much dying as cutting its own throat.

When you write "whatever sound system they're using this week", I know what you mean. There's terrible fragmentation, and buggy libraries get rewritten and replaced by some other buggy thing instead of fixed. That's definitely a pattern. From 2007-9 I even fled to OpenBSD on the desktop to get a more stable OS.

But for the past two and a half years I've used Ubuntu with Xfce, and those problems have become a distant memory. Nothing breaks, nothing gets changed on me (Xfce has moved a few things around, but nothing too terrible). No one forced me to switch to Unity, so I didn't. For me, Linux is more usable and stable than it's ever been. I also seem to see more people running it on the desktop than ever before, and in fact stats from Net Applications show a 50% rise in market share in 2011: http://www.zdnet.com/blog/open-source/is-the-linux-desktop-a...

Inside the Apple-centric echo chamber of HN, it's easy to believe that all the developers have moved to OS X and desktop Linux is dead, but I disagree. Despite its problems, desktop Linux is secure in its (small) niche.

Honestly, I like Unity. Or more accurately, I like where they're going with Unity. And I would like the thing itself if it actually worked reliably. But here I am, running the stock version of the biggest Linux distro on popular hardware, and I have to log out and in or reboot roughly daily.

I don't really believe developers are moving to Apple gear because of HN. I believe it because last time I went on a hiring spree, a lot of people said, "Oh, you're running Linux? Do we have to?"

> I like where they're going with Unity

Fair enough, but why not wait to use it until they get there? Daily reliability problems seem like a lot to put up with just for some cool ideas.

Also, I'm not sure how your interviewing experience supports the conclusion "Linux is dead/dying". What it does seem to suggest is "Linux is unpopular", but that has never not been true (on the desktop).

I'm using it because that's part of the default install. At one point I had the time and inclination to spend futzing with stuff to get it to work, but that's not where I'm at. If something isn't ready to use on a desktop OS, they shouldn't ship it.

My previous interviewing experience wasn't like that say, five years ago.

Then you haven't heard of Xubuntu? It's an alternate flavor of Ubuntu that has Xfce as the default install. Here's an iso that will get you basically the exact setup I have, no futzing required: http://xubuntu.org/getxubuntu/

Already installed stock Ubuntu? Install the xubuntu-desktop metapackage (in Synaptic or 'sudo apt-get install xubuntu-desktop'), log out and select "Xubuntu Session" from the login screen. That's it. I did this on my laptop after realizing I didn't want to switch to Unity. (For bonus points you can remove the Unity desktop apps, but that's hardly necessary unless space is at a premium.)

I don't think I'd put up with Linux myself if it was as much of a pain as you seem to think. For me, Xubuntu has been no trouble at all.

What are you using for sound? I use Pulseaudio on my desktop (custom built with a M-Audio Delta 2496 Pro soundcard) and on my laptop (crappy Lenovo G575) and I don't have any problems at all.

All I do is have this line in my .xinitrc:

   [[ -x /usr/bin/start-pulseaudio-x11 ]] && start-pulseaudio-x11 &
And it works perfectly! If I want to change my volume or manage which device it outputs on, I just run pavucontrol

What distro are you using that you are having to start pulseaudio manually? I swear I thought it was default everywhere.

He might be using Arch or Gentoo, or another of the more "DIY" oriented distros. Last time I used Gentoo you still had to really go out of your way to install pulseaudio, though that was four years ago so I don't know if that has changed.

It's Arch, and it's really rather easy to install PA

  # pacman -S pulseaudio pulseaudio-alsa
And if you're ok with having GConf installed (ugh)

  # pacman -S pavucontrol paprefs

I'm using Arch, but this applies to any distro if you don't use a DM and instead use xinit.

I much much prefer the simplicity of only using what I at least partially understand - the more magic (ie a DM) the more shit to go wrong that you don't understand.

It is the default in all major DEs, but he might have a custom X session, using a standalone window manager and extra tools. Or he could work in the console.

Many of the problems people have with Linux have nothing to do with Linux itself.

When I first tried to use Linux about 7 years ago, wireless drivers were a huge problem. Manufacturers didn't provide assistance--not much Linux distros could do about it at the time.

These problems are simply inherent to Linux being a minority platform.

> Many of the problems people have with Linux have nothing to do with Linux itself.

I disagree.

What the OP is complaining about is the same thing Miguel was complaining about, and it's the exact same thing JWZ called out 9 years ago in his CADT rant[1]. It has nothing whatsoever to do with manufacturers failing to provide drivers and everything to do with attention-deficit devs never wanting to knuckle down and do the hard, unglamorous work of long-term maintenance and bug fixing.

Working systems (with known bugs) are thrown out and re-written as new, incompatible systems with even more bugs. Everything breaks every time some idiot decides that they'll rewrite the audio/desktop inter-op/init/logging/whatever subsystem because This Time It'll Be Done Right™. This perpetual treadmill of half-working betas never ends.

It gets old.

[1]: http://www.jwz.org/doc/cadt.html

OpenBSD actually made a huge contribution here by reverse-engineering binary blobs and writing open-source drivers that could be maintained and debugged, which eventually made their way into Linux. At one time, the wireless support on OpenBSD was vastly superior to Linux. (This is a bit of a tangent, but I think props are due.)

> These problems are simply inherent to Linux being a minority platform.

I suspect part of the reason people spend so much time with advocacy is that popularity does pay off, long term: popular platforms get more support, more applications, and more other people who can help you with your own problems.

>So in that light, I think the idea that the Linux desktop got "killed" is sort of missing the point. It's no more moribund now than it was before.

Oh, but it is. Because it has lost a lot of momentum that it had, momentum that was coming from the "we're gonna overtake MS and win over the Desktop" feeling prevalent at the time.

Heck, the guy behind GTK complained recently that he is just one man taking care of the project. The full GUI foundation for Gnome, and one that is far from feature complete at that, and it only has one guy working on it.

It has also lost a lot of people and companies associated with it at the time betting on this possibility [of it winning the desktop]. Most companies nowadays support Linux development only for the server stuff, but it wasn't always so. Miguel moved on, Ximian moved on, Hazel moved on, Rasterman moved on, etc etc. Even Adobe quit developing Flash for it.

And it also lost a lot of "alpha geeks" to OS X, which wasn't even commercially available at the time (1997-2001). This "desktop UNIX" came out of nowhere, and it was the one that did what Linux was supposed (and expected) to do, i.e. eat into MS's market share. Well, even OS X didn't eat that much, but 15% is still a lot.

>He's also an inveterate "platform fan". Much of his early work was chasing Microsoft products and technologies, of course; now he's an iPhone nut apparently, and that doesn't really surprise me.

You're saying it like it's a bad thing. Gnome (or Linux for that matter) are also platforms.

And to be frank, his early work was not "chasing Microsoft products and technologies". His early world was Midnight Commander, Gnome, Gnumeric, and Evolution.

He did indeed like the component model (not the Windows platform itself) when it was presented to him during a Microsoft visit, though (the story of that visit should be up there somewhere). But what's not to like about it? A good, clean component model of sorts was also needed for FOSS; maybe it still is.

The phase you describe, IIRC, was several years _later_, when he saw the .NET platform and got hooked.

I really didn't intend it as a swipe against Miguel, but I think it does inform his perspective. And I meant "platform fan" in the advocacy sense: he has a tendency to "fall in love" with favorite products. That's not uncommon in the general population (it's pretty much the norm at HN!), but it certainly is among core Linux people who tend to prefer doing new things in different ways.

> His early world was Midnight Commander, Gnome, Gnumeric, and Evolution.

Clones of Norton Commander, Windows, Excel, and Outlook. To be fair, Gnome 1 wasn't really a "clone" (though it did mimic more than innovate), and mc was chasing a Symantec product, not a Microsoft one.

But to claim that these were innovative new projects is silly. Miguel's career has been one of seeing something he loves in an existing product and duplicating it in his preferred free software environment. There's no shame there. But it's absolutely the same thinking that drove the Mono project.

Pedantically speaking, I wrote Gnumeric not because I loved Excel, or enjoyed using it or because I was trying to be innovative. In fact, I did not even know how to use a spreadsheet at the time.

Gnumeric was the product of the mood in the early days of Gnome: we need to provide this to have a complete desktop offering and we were talented hackers that could get it done. So I did.

I forgot to mention.

Once I started, I enjoyed every second of writing a spreadsheet. It was both a very educational process and the one that made me grow fonder of strict compiler warnings and errors. And from this experience, most Gnome software after it was compiled with warn-as-error.

Well, I'm glad you did write Gnumeric. The graphs module with that tree-like properties box is really nice to use.

Jody Goldberg and the Gnumeric community deserve the credit for the graphics module, not me.

But I am glad you enjoy Gnumeric.

Whoever did the design of the graphics module was cooking on gas. The whole app is light and nice.

>And I meant "platform fan" in the advocacy sense: he has a tendency to "fall in love" with favorite products.

Well, for one, C#/CLR were not favorite products when he fell in love with them, were they? At the time lots of people thought MS was foolish to try to overtake Java, and it took a couple of years for people to accept and like .NET.

>But to claim that these were innovative new projects is silly.

Sure, they weren't, but it's not like Linux in general has given us much innovative things in the desktop space.

> Sure, they weren't, but it's not like Linux in general has given us much innovative things in the desktop space.


Two off the top of my head: workspaces, app stores.

Workspaces? You mean virtual desktops? Those date back to 1986, implemented at Xerox PARC (and patented!).

As for appstores, I assume you're talking about the packaging systems, which are similar only if viewed from 50,000 feet.

The packaging systems were not stores. You couldn't buy packages. They didn't serve as an agent to sell other people's personally submitted packages. They had "apps" but no "store". The applications weren't sandboxed to make it relatively safe for them to sell 3rd-party submitted packages. The applications were expected to be open-source.

The only similarity to the app stores is that they had a repository of software and an automated installation system used to install it.

You got me on workspaces, I wasn't aware of the Xerox implementation.

I didn't mean packaging, I meant app stores, see Lindows and CNR.

Wasn't CNR just a subscription-based package repository service?

No, it was an app store like we know them today. Here's an article from the founder (also the founder of MP3.com):


Plus the dependencies. With app stores you (usually) get one program and that's it. It works. With package systems you have to install several dependencies for each, sometimes an absurd number (say, 100 packages to get Gnome). And then there are dependency problems (I had my share in Debian in the old days).

At the core, dependencies are a good thing, although there's downsides to that as well.

I have no experience with desktop app stores, but you wouldn't get Gnome and similar large and complex software from one, would you? Which makes your comparison pretty unfair.

Well, I got Mountain Lion from the Mac App Store, which is even bigger than Gnome (kernel + userland + stuff).

You can't name innovative things amongst a crowd that rejects that innovation can happen. They'll tell you someone else did it earlier, even when they didn't.

For instance, cars were not an innovation according to hacker news, because Horses!

>Two off the top of my head: workspaces

Much older than Linux.

>app stores

You mean package managers. Similar in functionality (getting programs) but not the same thing. Both have nice things going for themselves though.

I've used Linux for years and have never had these problems. I think the issue is that getting everything working requires a deep understanding of each component and the system as a whole. If you just follow advice on forums, you will make things worse because you're doing things you don't understand to a system that you don't understand. That's not going to lead to success. You need to be able to think critically about what's wrong and what needs to change, and then execute those changes. No, that's probably not worth doing if you already like Windows or OS X. If you don't, though...

(And, there are of course Linux-based systems that were built by someone controlling the whole experience, and those work really well. Android and ChromeOS come to mind, though those aren't really desktops per se.)

The other day, someone here was complaining about udev. It has ruined Linux forever, or something. I have a different experience: udev has made my life very easy. I have a rule for each device I care about, and that device is automatically made available at a fixed location when it is plugged in. For example, I have a rule that detects a microcontroller that is waiting to be programmed with avrdude in avr109 mode and symlinks the raw device (/dev/ttyUSB<whatever>) to /dev/avr109. I then have a script that waits for inotify to detect the symlink, and then calls avrdude to program the microcontroller. A few lines of shell scripting (actually, it's in my Makefile), and I can just plug in a microcontroller, press the programming button on it, and everything just works. No screwing around with figuring out which device address it's assigned to. How do you do that in Windows?
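For anyone wanting to replicate that setup, such a rule is a single line in /etc/udev/rules.d/. A rough sketch (the file name and the USB vendor/product IDs below are made-up placeholders; find your own device's IDs with `udevadm info` or `lsusb`):

```
# /etc/udev/rules.d/99-avr109.rules  (sketch; the USB IDs are placeholders)
# When a tty device with these vendor/product IDs appears,
# create a stable symlink at /dev/avr109 pointing to it.
SUBSYSTEM=="tty", ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="204b", SYMLINK+="avr109"
```

A Makefile target or shell script can then wait for /dev/avr109 to exist (e.g. with inotifywait from inotify-tools) and invoke avrdude against that stable path instead of guessing at /dev/ttyUSB numbers.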

I have used Linux as my main desktop and laptop OS for 17 years, and see these problems frequently. It used to be a system I could keep completely in my head, and understand what was happening and why. But from maybe 2005 onwards there has been a persistent and accelerating trend of replacing the old working and transparent (though possibly a bit baroque) infrastructure with fancy new components that are completely inscrutable. And the lack of transparency gets worse and worse the thicker these layers of garbage get. It's mostly fine as long as things work. But things never seem to work if you have a configuration that differs even a bit from the default.

Want to log in from a terminal and start X? Well, too bad there's some new infrastructure this month that makes sure you'll only get access to the sound device when you log in from a properly configured gdm. Want to modify the keyboard layout? Whoops, the xmodmap format that had been stable for a couple of decades now changed for the third time in a year. Want to add a tmpfs on /tmp to fstab? Well, too bad. Some implicit and undebuggable circular dependencies in systemd will make the system unbootable.
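(For reference, the tmpfs entry in question is a single fstab line; the mount options and size limit here are arbitrary examples, not the commenter's actual configuration:)

```
# /etc/fstab
tmpfs  /tmp  tmpfs  defaults,noatime,size=2G  0  0
```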

And the sad part is that the problems this new infrastructure is supposed to solve never actually existed. "Great, now audio doesn't work at all. But if it worked, it would have full support for network transparency." It's easy to understand why the problem exists - creating something new tends to be more rewarding than working on the old stuff. It's much harder to see how to fix this madness.

Move to a simpler distribution that won't bother you. I've been on Arch for the last few years because once I set it up, new updates won't add new "functionality". It isn't perfect, but I can keep the entire system in my head. Nothing opaque, nothing inscrutable -- the design philosophy is KISS. It won't hold your hand, but it also won't get in your way.

All the problems with Linux over the last five years can be summed up in one word: Pottering.

PulseAudio doesn't depend on GDM; you just need to create a config for it. What I do (on Fedora) is move the default.pa from /var/lib/gdm/.pulse/default.pa to ~/.pulse...

You can't expect everything to work when you tear out components then fail to configure things properly...

I wasn't tearing out components. I was doing everything exactly the way I'd been doing forever, and it no longer worked. Which is bad in itself, but maybe it's understandable that niche usecases break every now and then. It's just that it's happened so often and for so many parts of the system that it's hard for at least me to think it's isolated incidents rather than a cultural issue.

The truly toxic part is that every single transition adds complexity and reduces transparency, making it harder and harder to understand the system. It's just not the breakage alone, or the complexity alone, or even the lack of transparency. It's the combination of all of those.

At the start of this thread jrockway proudly says that all you need is a deep understanding of all the components and the system as a whole. Back in the day this was not actually an unreasonable thing. But it's been getting less and less reasonable for a long time.

(Incidentally the audio example in my original message wasn't even directly related to pulseaudio. It was a few years back, but IIRC it was some daemon tweaking the device permissions, and something else adding users to a special group in the GDM login path but not the console one.)

Well, GDM tends to handle a lot of initialization that it shouldn't be the one doing; using a distribution that doesn't assume you're using GNOME may help with this.

As for the transitions supposedly adding complexity: I don't know about you, but systemd, for example, has greatly simplified the configuration and management of services, mountpoints, timers, and all sorts of things.
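For instance, a whole service definition can now be a handful of declarative lines. This unit is a hypothetical example for illustration, not something from the thread:

```ini
# /etc/systemd/system/exampled.service -- hypothetical unit file
[Unit]
Description=Example daemon
After=network.target

[Service]
ExecStart=/usr/local/bin/exampled
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enabled with `systemctl enable --now exampled`, versus the pidfile-and-shell boilerplate of a classic init script.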

I use awesome on Fedora 17 on some early-2010 entry-level consumer Intel hardware; everything is zippy, easy to configure, and doesn't break on me all the time. I feel bad for you, man.

It sounds like you're agreeing with the article - things work on Linux when you have a deep understanding of them. While it's good to encourage a deep understanding, the group of people using any given OS will always follow a curve of some sort. There will always be beginners; there will always be people who know just enough to be dangerous; there will always be experts.

Requiring everyone to be an expert will prevent Linux from becoming a dominant OS, because most people do not care enough to gain a deep understanding of any OS; they'll pick whichever one makes the easy bits easier to learn.

You'll hit the same problem with any OS: OS X will just get slower and slower, and Windows will become infected with malware. General purpose computers are not easy to use yet; with the infinite flexibility they provide, there's infinite opportunity to fubar them. The average Linux distribution exposes that flexibility by default, making it seem hard to use. But if you switch to a less-flexible Linux, like ChromeOS, many of the problems go away. So I don't really understand what the article is trying to achieve other than trolling by stating the obvious in an inflammatory manner.

"I've used Linux for years and have never had these problems."

Agreed. That's two desktops' and four laptops' worth in a decade.

"I think the issue is that getting everything working requires a deep understanding of each component and the system as a whole. "

Not so sure. I'm an end user and tend to just shove the CD-ROM in and cross my fingers. Most 'stuff' works. My stuff is simpler than yours however (audio interface, cameras, keyboard (musical) controller)

IMHO, Android and to a lesser degree ChromeOS (as well as other similar platforms) don't quite count as "Linux" for purposes of "the Linux desktop", as they've replaced nearly everything above the kernel with their own stack[0].

[0] Last time I used ChromeOS it was still mostly in line with ordinary Linux (though it might have been partially Gentoo-based), but a rather impractical one, as it's less than simple to run [things that aren't Chrome]. I've heard about Xorg getting replaced at some point; I'm unclear on whether that actually happened.

So... one of his key problems is that when (not if) things break, you need to learn more than you ever wanted to know about the low-level internals of an OS to fix it. And you say you've never had that problem, and that you should have a deep understanding of each component and the system as a whole in order to be able to use Linux.

But that's exactly one of the major problems he's complaining about!

It's a different philosophy. Some of us come at it from the perspective that we want to know how our machine works. And when something goes wrong, we want to dig into it and fix it ourselves. The monolithic and opaque OSs that "just work" aren't for us. Oddly, the only place I've seen this attitude described/explained is in Zen and the Art of Motorcycle Maintenance.

CLOSED: Works on my computer

JWZ identified the issue Miguel discusses in this post ten years ago; he even gave it a name: CADT.


Also, part of what killed the Linux desktop was Miguel and his total lack of understanding of the unix philosophy, which drove him to create abominations like Bonobo. D-Bus is not much better either.

That he fell in love with an iPhone goes to show he didn't fully appreciate the value of open source either.

We were just yesterday commenting with some friends in #cat-v how Evolution is one of the worst pieces of software ever created, and Evolution is supposedly considered by Miguel and co to be the epitome of the Linux desktop.

How does falling in love with an iPhone show that you don't fully appreciate the value of open source? Tell me about any good open source handset that came out before the iPhone. OpenMoko? I mean, you really had to love open source to put up with it.

Yes, there is still Android, but, given how modified the usual handset is, you can't buy a free and open Android on the market either.

So, where is the problem in using a closed device when "open" just doesn't deliver?

There is a big difference between using and loving.

Which leaves your point unexplained. So, let me reply in an equally enigmatic fashion:

There are many reasons for which a thing can be loved.

That you can't understand a love affair with the iPhone just means you don't appreciate good design.

I'm not a fanboy but I did envy my wife's iPhone for years. I like my Android better today, but what choices did you really have when the iPhone came out? It was the only game in town. Being chained to a BlackBerry is the only reason I didn't have one.

Thanks for the link!

Speaking as someone running a Linux desktop (and writing this on one), there's not much to say other than I agree. I run Linux because work gave me a PC and there's no way I can write software on Windows. Of course we all have servers managed off in the corporate cloud somewhere that run ssh/vnc etc., but there was no way I wanted to install PuTTY again or miss out on the unix commands that make (work) life more enjoyable, so I installed Linux. I write server software, and client software sometimes, but browsers make the operating system pretty much moot; there's more variation between browsers than between operating systems, mobile aside. And when I need to try something on Wintel I spin up a cloud instance and use VNC.

When I'm not on Linux I run OS X everywhere else (and iOS) because it's unix-like (actually, it is unix) and because it works so well. I am sure Windows 7 and 8 are great, but I doubt they have gotten rid of C: or \ as the path delimiter or any of the other nonsense that DOS introduced (copied from PIP) back in the dark ages. Why should they? MSFT still runs DOS apps, so they aren't going to change. And choosing between OS X and Linux on a non-work desktop is a no-brainer: Netflix, Photoshop, etc. etc. etc.

/ works as a path delimiter on Windows. And Linux runs DOS apps just fine too, in essentially the same manner that modern Windows does. (Emulation.)

Why is / inherently better as a path delimiter than \ ? DOS was built to be compatible with VMS and other DEC OS's, which used / as a switch character.

If you could arbitrarily pick any character, both would be as good as each other. However, '\' is currently the escaping character in nearly every program that needs such a function, so as things currently stand, '/' is better. This raises the question of when '\' as an escape character became mainstream, though.

Because nearly every programming language uses \ as an escape character, which means that in string literals you have to double up each \, which is error-prone and ugly.
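A quick illustration of the double-escaping problem (Python here, but the same applies in C, Java, and most other languages; the path is an arbitrary example):

```python
# The backslash version needs every \ doubled in the source code;
# the forward-slash version needs no escaping at all.
windows_style = "C:\\Users\\alice\\notes.txt"
forward_style = "C:/Users/alice/notes.txt"

print(windows_style)   # the runtime string contains single backslashes
# Both strings have the same length at runtime -- the doubling exists
# only in the source text, which is exactly what makes it easy to get wrong.
print(len(windows_style) == len(forward_style))
```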

you took the words right out of my mouth - ahh putty what fun :)

I think the article does a great job of explaining the problem, but doesn't explore the ramifications far enough.

Let me give an example: a few months ago, a new version of Skype was announced for Linux. I was excited, since I used Skype 2 for Linux but then it stopped working for me and I couldn't be bothered to fix it. But if you go to the Skype for Linux download page, you will find a few downloads for specific distros, then some tar files which are, statistically speaking, guaranteed not to work.

Long story short(er), I still don't have Skype working on my desktop, because my distro isn't in the list, I can't get one of the other distro packages to work on my system, and of course none of the statically-linked binaries work.

(I could almost certainly get it to work if I was willing to install 32-bit binary support. But it's 2012. If your app requires me to install 32-bit binary support, I don't need your app that badly.)

Steam for Linux, recently announced by Valve, will run into the same problem. I suspect it will actually be Steam for Ubuntu and Debian, possibly with a version for Fedora, assuming you have the proper libraries installed and are using the right sound daemon and graphical environment.

But if big-name software comes out for Linux, hopefully distros will get in line. Do you want to be that distro which can't run Steam? Doesn't really matter if you think that OSSv4 is superior to ALSA and PulseAudio...if Steam requires the latter, you will toe the freaking line, or disappear into obsolescence.

What distro are you running? It must be sort of out there that you're having such difficulties. I'm running Arch which is not widely supported, but user submitted build recipes (think brew on OSX but with way more options) work 95% of the time (e.g., no trouble with Skype, Spotify, etc.)

I'm running Arch as well, though I don't use the AUR. Stuff tends to crustify too much and too easily in there.

Once you install yaourt or packer though it's really easy to at least see if the AUR package works (Skype works fine, e.g.). If not then maybe you're on your own, but if you're having trouble it can still be useful to see how someone else built it in the past.

I'd agree with a whole lot of what's said here, but also add:

One of the big thrusts of the Linux desktop wasn't dominance itself, but for it to simply not matter what you were using on the desktop. The Linux desktop fought to produce the first cracks in the Windows hegemony a decade ago, but the final push came from the rebirth of Apple and the rise of the smartphone.

Today people happily do their normal productive or recreational tasks from a variety of computing environments: Windows, GNOME, Unity, KDE, OS X, iOS, Android, et al. Probably the majority of (Western) web users use at least one non-Windows internet device.

During the golden age of the Linux desktop everything seemed predicated on reaching exactly this point -- that you wouldn't need Windows, and then, by virtue of competing on a leveler playing field, the Linux desktop would ascend.

But the Linux desktops didn't "skate to where the puck is going" -- or their attempts at such missed the mark. By the time we reached the post-Windows-dominance era, the Linux desktops weren't positioned to take advantage of the new playing field dynamics. The rest of the industry isn't even all that concerned with the desktop wars anymore. It stopped mattering very much -- and ironically, that came around to bite the projects that first got the ball rolling.

I never understood this. Why would market share of Linux on the desktop matter? I've always viewed Linux on the desktop as something for power users and developers, and thousands of said power users and developers are continually developing and maintaining multiple distros and thousands of applications. It's not like it's a stale and abandoned paradigm that's left to die.

Was gonna say exactly the same. People think that if it's not popular among the masses it's "dead". Totally wrong, since it covers a completely different niche.

The author seems to forget that some people actually enjoy configuring and hacking their systems in detail. There are also the people who hate using the mouse and want to do everything with the console and keyboard.

I think it's because even power users enjoy doing "regular user" stuff sometimes and it's nice to be able to do it all on one machine without resorting to dual boots or VMs.

Having a bigger market share means that hardware/software vendors are more likely to consider supporting Linux to at least some degree.

For example I like to listen to music while programming, so it's nice that Spotify is available on my dev machine.

Linux by no means should be limited to power users. It should be (and is) a universal OS.

> (b) incompatibility across Linux distributions.

This is completely missing the point - a statically compiled end-user binary should be compatible across all distributions of Linux, using the same version of the kernel or any newer version.

The only caveats to that are (a) hardware and (b) poorly-packaged software.

(A) is the fault of hardware manufacturers and is increasingly not an issue these days anyway; driver issues are becoming increasingly rare.

(B) is easy to solve for any open-source software, as it is the responsibility of the community for that distribution to provide the appropriate packaging. They prefer to do it themselves. And they're good at it - it gets done!

If you want to ship a closed-source binary on Linux, just make sure you don't dynamically link it against any libraries that you don't also ship with the binary. Problem solved.

Honestly, I can't remember one single instance ever where I have run into end-user software that will run on one distribution of Linux and not another, as long as that principle was followed.

Many sophisticated libraries on Linux use dynamic modules or require components that are configured as part of the system.

Consider D-Bus: if you statically link, but the system changes the format or location of D-Bus configuration files, all of a sudden your app no longer works.

So in theory, yes, this could solve some of the problems. But it requires a massive effort to make the Linux desktop libraries static-linking friendly, which they are not.

Like libc, or libX11 and apparently ld-linux.

Why not just ship a chroot jail to run it in, in case some of those statically linked system libraries read config files which might be under a different path or in a different format?

Software compatibility in OS X?

A lot of applications break on newer versions of Mac OS X. That's why there are websites like http://roaringapps.com/apps:table

Also, there are a lot of "transitions" that Apple loves doing: PowerPC -> Intel. Java -> Objective-C. Carbon -> Cocoa. 32-bit -> 64-bit. Access everything -> Sandbox.

See also Cocoa docs: "method X introduced in 10.5. Deprecated in 10.6".

I have a few devices that don't work in 10.8.

Basically, what I'm saying is that OS X is a bad example for backward compatibility. Windows is much better at this. Open source software is much better at this.

What killed it is that it didn't have a huge, multi-billion-dollar company betting on it (on the desktop) like Microsoft and Apple had. Even Apple, with its billions, is still at around 5% market share worldwide, so having 1% is still a great accomplishment when you consider that it had no support from huge corporations.

Now take the mobile world for example: Linux on mobile had been around for a decade, but it never really took off until a huge company like Google decided to throw its billions of dollars and its great ingenuity at the task. Getting an OS to be popular is just incredibly difficult, and it needs way more than just good driver support and/or good software. It needs marketing, talking to manufacturers, dedicated and well-paid devs, designers, UI and UX professionals, sales, R&D, and so on and so forth.

Focusing on the technicalities of drivers and APIs is typical of us devs, but it has little to do with why Linux didn't take off on the desktop. Sure, Linux failed because it couldn't do some or all of that well, but why couldn't it? Because it didn't have a huge and focused company pushing for it. How many popular desktop OSes are there? Only two, and I think that's enough to show that it's incredibly hard to get into that market and that only a huge company can make it. Also, let's not forget that Windows was good enough and there was not much Linux could do to attract users; in fact this is still true, and it's probably why even OS X is still at 5%. Windows is good enough and it's the de facto standard used by 90%+. Having the best UI and UX in the world like OS X doesn't help that much either.

Redhat and Canonical are big companies that did bet the house on Linux. A few years back I was confident Canonical was what desktop Linux needed to get its shit together. And today I'm typing this on a Zareason laptop that's supposed to provide first-class support for Ubuntu (it even has an Ubuntu key), and the mic doesn't work, the Fn keys don't work, YouTube videos appear in monochrome blue, and it likes to boot itself up randomly from time to time until the battery drains...

> Redhat and Canonical are big companies that did bet the house on Linux.

No, RedHat bet the house on server Linux, never on desktop. Canonical did bet the house on desktop Linux, but it's a tiny company that has yet to turn a profit (or a significant one).

To fix the blue youtube videos: Create a file /etc/adobe/mms.cfg and put the following 2 lines in it:



Some people will probably tell you all kinds of reasons why this is bad, but it works.

You're one user. I've been using the same System76 laptop at work and at home for 3+ years. Still works flawlessly (fn keys and all).

I'm typing this on my work laptop running a Linux desktop (Ubuntu, FWIW). Our engineering servers at work run Linux and, as a convenience, have the desktop installed. As many of my co-workers run Linux desktops as OS X desktops (and the engineers running OS X or Windows have VMs running Linux... desktops).

When I go home, I'll be using my personal laptop running linux. My wife and kids run a netbook with a linux desktop.

The linux desktop may be dead to Miguel, but it works just fine for me, a lot of other people in my life, and a lot of people in the world.


Optimistically speaking: I moved from Mac/PC to Linux on the desktop 9 years ago, dragging dozens of people with me. My experience so far is that everything gets better year by year. Yes, Apple has taken some shine off Linux over the last couple of years, but I see this as a gain, because those people will be unixified and better adapted to Linux setups in the coming decades.

I never considered OS X despite some fanciness: it is limited in choice of hardware, it is totally dead on the server side, and its unix-ness was/is crappy, despite being developed by the most valuable cathedral in the world. Its support for open source dev tools is miserable, despite some recent improvements; its game support is miserable (compared to Windows); and it gets worse in openness day by day. A crowd of ordinary users waiting to be sold trivial app store apps may appeal to some developers, but to the DIY/Linux desktop user, OS X looks like a toy Casio personal organizer OS from the 80s.

Yeah, right, because OS X cares so much about backwards compatibility. They care so much that they actively go out and intentionally break APIs - like, say, when CGDisplayBaseAddress() stopped working in Lion, breaking fullscreen in every single SDL-based game (and by "breaking", I mean the game will actually crash when attempting to enter fullscreen).

Arguing about the niceties of the UI is all well and good, but the actual problem is far more fundamental.

What killed the linux desktop? Drivers. Mostly graphics drivers, but some others as well. Who cares if the UI isn't ideal if the damn thing can't sleep and wake up properly, or if it spazzes out every time I plug in an external monitor.

Hmm.. Drivers seem to work just fine here. Actually up until now I didn't even notice the Linux desktop's supposed to be dead. Looking at it right now it seems very much alive.

Stupid jokes aside, I always had a feeling that the Linux desktop was actually gaining users in the last couple of years, especially since Ubuntu came along. Truth be told, it's still the best operating system I've been using, but that question is largely open to taste anyway.

For the last couple o' weeks I had to use Windows 7 at work and oh my god, what a pain in the bottom it is! I've had a couple of looks at OSX and never liked it either. Then again my linux desktop isn't exactly standard either - but that's what I like about it: customizability. I'm the kind of guy who likes to tinker with the system until it's just about right for me. The last time I set up my laptop it took me 3 or 4 days to finish but it absolutely paid off. The result is something I find aesthetically pleasing and extremely usable.

Linux has a couple of show-off features and facts 'n' figures, like running most of the Top 500 and second-to-none package management, but its real power is that it can very well be all things to all people. It's been said a million times, but Linux gives the user a choice, and I honestly value that a million times more than any nice OSX-esque UX or Windows-esque games support.

The catch is that I'm definitely not the average user who wants things to "just work" (although they rarely do).

That's exactly what TFA says! Indeed, it points to something even more fundamental: the APIs (both userland and kernel driver space) move too quickly and are fragmented. That's why both drivers are difficult and why desktop apps are difficult to support.

>What killed the linux desktop? Drivers.

That seems a weird way to phrase it. This is not a new thing, and generally drivers have gotten better over the years. Saying that drivers inhibited the growth of Linux makes sense, saying that they killed it -- not so much.

Absolutely agree. I love Linux, it's a great OS, but drivers are a nightmare. Honestly, I've never had any Linux distro where I could comfortably buy an I/O device and not have to worry about compatibility issues. This to me is by far the biggest problem with the OSes in their current state.

What do you mean by 'an IO device'? I can count on one hand the number of driver issues I've had, and none of them qualify as IO devices.

What other devices you got?

Monitors/graphics cards, printers, modems, network cards, joysticks, etc.

Are you using really old hardware? I've never had any problems with any of the above. Monitors should be plug-and-play if you're using a modern DE (and if you're not, you probably know how to set them up). If you have separate graphics card, you may have issues with battery life, but that depends on the model, and they still work.

I can sympathise with you about external monitor issues. In my office, we all have laptops with external monitors. When we have meetings, all the Windows/OSX users simply unplug the external monitor and take their laptops into the meeting room. The linux users normally just take a pad and pen, as it's too risky to attempt unplugging the external monitor while linux is running.

When have you last used Linux? Drivers haven't been an issue anymore for years now.

I don't actually see any problems with Linux. It's just not meant to be.

As a desktop OS? Tried it again 3 weeks ago. Fairly standard setup: Asus motherboard, Core i7 3770, 32GB of RAM, one Dell display at 2560x1440, one at 1920x1200 in portrait, base AMD 7xxx series GPU, newest Ubuntu with AMD binary drivers. First the graphics driver locked up on rotating a screen; that required a reboot, and even switching to a non-graphical tty didn't work. Then the rotation took, but v-sync was completely disabled on the portrait display, and by completely I mean it was unusable, with 2-3 inch tears on moving content. This of course is a known issue with Xorg being archaic.

I'm sure you'll blame this on AMD's drivers or shoot out some other random technical reason. While i understand the technical issues, I just don't care, I have far better things to do than care about this shit. It flat out doesn't work and I don't care why.

If you don't care and aren't willing to listen, you cannot have a well-informed opinion and cannot pretend you do.

I think you've completely missed my point.

I spent 5 years using Linux as my only OS, and I am very well aware of the issues it has, specifically in the area of display management. The reason I stopped using it is that I don't care to mess with it any longer. I've spent hours mucking with the internals of Xorg and writing sleep/hibernate scripts in an attempt to duct-tape something together that works. I've spent hours trying different combinations of kernel + Xorg + driver to figure out which one won't randomly peg a CPU core at 100%. I've spent hours researching parts for compatibility, and I've wasted money on parts just to work around issues with Linux.

What I realized and why I have no interest in linux as a usable desktop anymore is that while this time spent may scratch a nerd itch, it is actually a complete waste of my time and detracts from actually doing something productive, like using the computer to produce, the very reason I own it.

So I don't give a shit at all why it doesn't work anymore. If it doesn't work, it doesn't work, I don't need to do massive research to say "it doesn't work". I don't need to be "well-informed" to say it doesn't work, not working is pretty obvious. In fact doing research goes completely against my goal, which is to spend the least possible amount of time thinking about my computer and the most possible time producing.

I used to be really into the whole free software thing, but have mellowed with age.

However, no way in hell anyone will get me to switch to Mac OS. I am simply too enamored with having an environment that I can hack on if it strikes my fancy, as well as an environment that I can customize how I want it. Despite all its flaws, it still does focus follows mouse pretty well, and not having that would drive me batty.

Also, Apple is an 800 pound gorilla that has always been about Being In Control. The Samsung lawsuit wasn't anything new:


I just don't want to be part of that kind of walled garden.

I'm writing this from my laptop which is running Ubuntu as its desktop.

I don't really see how the Linux desktop is dead. I've been running the same OS on this same laptop since 2007. The only upgrade I've added is an SSD and an extra gig of memory. It's still pretty speedy and I've never had any problems.

I use a Macbook Pro with OS X at work because that's just what I was issued by default. I hate it. I hate the over-reliance on the mouse, on gestures, the abundant and tedious animations; I hate the crappy ecosystem of package repositories and how most of the packages are broken or completely mess with the system; I hate never being able to find where any of the configuration files are or where something is installed; I hate the plethora of ways you can start and stop services; the confusing GUI; the masochistic meta-key layout; the awful full-screen support; and the complete lack of customization options.

I've had much better experiences with the Linux desktop for 95% of the things I do.

Now before some OS X fan-person decides to point out how woefully misguided and ignorant I am, my point is that there are different folks out there who want different things from their desktop experience. Apple gets to decide top-down what that experience is, all the way down to the hardware. I prefer a little more flexibility. I like being able to swap out my own battery or add a new memory module when I need one. I like being able to switch from a GUI desktop to a tiled window manager. Some folks don't -- there are Linux distros that hide as much of that as possible. Either way there are plenty of options and I think that's a good thing. Competition breeds innovation, and even though I don't particularly like Unity I am glad to see people trying new things.

The Linux desktop isn't dead. It may just smell funny. You may switch to OS X and wonder why anyone could possibly want anything else. I just gave you a bunch of answers.

There's room for many approaches, of course. While the perfectionism (or is it lack of pragmatism?) of Linux and its developers may well have held back its wider adoption on the desktop, there's a lot to be said for its development community's single-minded pursuit of quality and correctness.

As well as Linux's presence in the data centre, witness the success of 'embedded' Linux: many TVs, routers, set top boxes and other bits of sealed-box electronics all run on it. It's broad in its scope because of the large team of divergent interests working on it, and it's able to support those systems because it's been well made as a direct result of that team's philosophy. Is it really so bad that the average Facebooker doesn't want to use it?

It really is very, very hard indeed to be all things to all men and no single system around today can make that claim. Linux has its place in the world of computing, just like Android, Windows, OSX and everything else.

There never was a "Linux desktop". Linux is a kernel. GNU is a set of utilities. And X11 is a mess.

Did you know that X11 is why we have shared libs (the UNIX version of "dll hell")? If not for having to run X11, shared libs really would not have been needed.

There are many window managers. Maybe too many. Too much choice for a noob. That selection, or the pre-selections Linux distribution people make, does not equate to "the" Linux desktop. It equates to someone else's configuration and choice of applications. It equates to having to fiddle with X11, whether you are just configuring it or developing programs to run in it. And that has always been extremely frustrating for too many people -- constant tweaking; it never ends. This is like a brick wall to people coming from Windows who might want to try Linux. You are inheriting a system that's been configured to someone else's preferences. (The same is true with Apple, but they have a knack for making things easy.)

I skipped Linux altogether and went from using Windows to using BSD. I've also been a Mac user. And BSD is way better than OSX, or any of the previous Mac OSes, for doing most everyday things: email, internet and secure web (ramdisk). Moreover it's flexible - you can shape it into what you want - without this being an overwhelming task of undoing someone else's settings.

If you want a citation for the shared libs thing I will track it down, but honestly anyone can do it on their own. The historical research will do you good. Educate yourself.

An interesting observation is that tablets are becoming the new desktop, and in that space Linux, through Android, is becoming a dominant player. In a way, the Linux desktop is finally here, and it's winning against both Microsoft and Apple put together.

All of the article's criticism of mainstream workstation distributions is accurate, of course. But it's important to note that those represent nowhere near the sum total of the linux user experience these days.

In no way is that "the Linux desktop" "winning" at anything, because (at long last) the navel-gazing of the Linux community has been pushed aside in favor of one group saying "this is how it will be" (and then some OEMs scribbling a little on the walls, but not much).

Android is only "Linux" when it's convenient for Linux advocates, but it's never "the Linux desktop".

> In my opinion, the problem with Linux on the Desktop is rooted in the developer culture that was created around it.

This developer culture DEFINES Linux. A fruit is either an apple or an orange. I couldn't have an OS with wonderful package management, developer tools, endless configurability AND a desktop Miguel de Icaza dreams of.

But I can have an OS with wonderful package management, developer tools, endless configurability AND a desktop Aaron Seigo and friends created.

The desktop wouldn't be configurable. Or not much more than changing the background image.

This flame ignites periodically, and I'm always left wondering when exactly the Linux desktop died? Some have noted similar aspects already, but here's my 2 cents:

I'm on Linux now (GNU/Linux, maybe lump BSD in there too, I'm using "Linux"). I know plenty of users on Linux. I know plenty of users of Windows and OS X who run virtual Linux Desktop distributions for testing/development/security. I'm sure some of HN are running Linux.

Does Linux have the potential to enter the market as a third core option for desktop usage - not really. But why does it matter?

The problem with Linux is that there are too many choices. People who like technical choices and options trend toward Linux (needs citation).

John Q. ComputerUser isn't going to use Linux unless his geeky son or nephew installs it for him AND provides support. He can't get support anywhere else - because there are too many possibilities for it to be fiscally effective.

If/When something gets confusing or broken on Windows/OS X, you call JoeBob's SuperDuperPuter, and say it's broken. JoeBob asks, "What Windows version?" While he might need to poke and pry a bit to get the user to tell him he's running Millennium Edition, once he gets that data, it's a pretty straightforward troubleshooting effort and fix.

If you call some mythical Computer Service group that actually supports Linux, and say your machine is broken, they would need to know a LOT more about your system just to figure what they need to do to start.

Distribution? Parent Distribution? Shell? Window Manager? Hardware? ...

I find generic computer service companies to be extremely expensive. To be able to provide even basic service for Linux in general, your techs need to be very familiar with more package managers (emerge, apt, yum, zypper, pacman) and more core applications. Each service effort inherently takes longer. These factors pile up and everything becomes necessarily more expensive. It's downright impractical to support Linux generically. The support costs for one or two issues on Linux would far outweigh the cost of an upfront OS license and cheaper support for the end user.
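To make the point concrete: even the single task "install a package" has a different incantation per distribution family. A minimal sketch -- the helper function and package names are hypothetical, just to illustrate the mapping a generic support tech has to carry in their head:

```shell
#!/bin/sh
# Hypothetical helper: print the "install a package" command for a given
# distribution family. Each branch is a different tool the tech must know
# before even starting to troubleshoot.
install_cmd() {
    case "$1" in
        debian|ubuntu)  echo "apt-get install $2" ;;
        fedora|rhel)    echo "yum install $2" ;;
        opensuse)       echo "zypper install $2" ;;
        arch)           echo "pacman -S $2" ;;
        gentoo)         echo "emerge $2" ;;
        *)              echo "unknown distro: $1" >&2; return 1 ;;
    esac
}

install_cmd ubuntu firefox   # prints: apt-get install firefox
install_cmd gentoo firefox   # prints: emerge firefox
```

And that's just installation; configuration file locations, service management, and log paths diverge at least as much.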

Linux has (and will likely continue to have) a comfortable hold on the technically-capable DIY market. It may not be on track to step beyond that market in the desktop arena - but that certainly doesn't indicate it's time for a toe tag.

As someone who previously worked doing Windows support, actual "troubleshooting" on desktops is pretty rare apart from a handful of common cases.

Most of the time all that happens is virus scan -> backup -> reinstall.

Exactly - that's much more succinct than I put it. Comparatively, it's not often that a Linux issue is solved by those same steps (or any given set of steps - there's just more variation).

>And you can still run your old OSX apps on Mountain Lion.

Having been a small-scale Mac developer for many years, that really made me chuckle. Not since OS X 10.2 did Apple release a major upgrade that didn't break my apps and make me struggle to push an update out as quickly as possible to fix all the things that Apple broke. Apple has heard of deprecation, but they don't seem to really grok the concept.

If I had been developing for Linux, I could have simply tested on pre-release versions of the distros I wanted to support and would have been ready when the new versions were released. On OS X I would have had to pay a prohibitive fee for that privilege.

In any case, this article made me happy. You see, for so many years, I used a Mac, and everybody said "Apple is on its last legs; the Mac will be dead in a few years". Apple had to scramble to compete, and that drove them to provide such a good product. But I knew that situation might not last forever, and I was right. After seeing the turn that Apple had taken over the last few years, I switched to an Ubuntu laptop six months ago.

It's refreshing, once again, to be using an OS that people are calling "dead".

Linux is too hard to configure; if the distro gets it right out of the box it's fine, but not otherwise. I started with Windows 3.1 in 1995, then mostly used Slackware (and some Windows 95) from 1996 to 2000, and Slackware and Windows 98 from 2000 to 2004. But from the time I got on the Internet in 2004 to the present I have mostly used Windows (98, XP, and Vista), because I have not managed to get any version of Linux that I have tried to connect through a dial-up modem. I have to admit I have only tried sporadically, since Windows just works, and my efforts to get some Linux distro to work have been so frustrating. (Note that though a frequent user, I am not a programmer or professional sys-admin.)

ADDED: jrockaway's comment, added while I was writing this, hits it just right: "I think the issue is that getting everything working requires a deep understanding of each component and the system as a whole." Which is what makes it so frustrating, even to very intelligent people who have other interests than computers in and of themselves.

Please ignore Icaza. As for the famous death of Linux on the desktop let me tell you something: IT NEVER HAPPENED. What are those people smoking?

I've been using Linux for the last decade and every year it gets better, more polished, more integrated, featuring a better design; I hear more & more people talking about it and using it. Linux is more alive than ever on the desktop!

Depending on your needs, Linux can make an exceptional desktop. Yes, true, it is not for _everyone_, but then again neither are Windows or Mac OS X.

The Linux desktop never did any better -- RELATIVE to the rest of the market -- than Ximian Desktop on Red Hat 6.2. If de Icaza wants to blame someone for the failure of the Linux desktop, I would point the finger back at him. He's the one that dropped out of the effort, became a Microsoft shill, and continues to push (the speciously patented) Mono to this day. Also, I don't know why he's lamenting a strong leader casting a clear vision on the desktop these days. Canonical is doing EXACTLY that.

Nothing "killed the Linux desktop"; it still thrives for those that want it, and it's steadily improving. It never came to dominate the market, and one can argue about the reasons it never displaced Macintosh (still less Windows). It probably has a lot to do with lack of a single, unified vision, and the market fragmentation caused by the different distros, and the lack of market pressure to ship, as it relies on volunteer labor, but I'm not going to presume.

Personally, I've been primarily a Mac user since the Mississippian superperiod, but I used an X-11 Windows(™) environment (on top of FreeBSD) for years at work. I don't miss it, even one iota, but I know plenty of smart people who prefer that sort of thing. De gustibus non disputandum est and all that.

When the iPhone 3GS came out, battery life tanked on my 3G. They fixed that, after some period, only to break the reporting in the firmware. Now the device thinks it's dead after a few hours. Replace the battery, same lifetime. At this point, I'll never expect an Apple device to last longer than a year and therefore will not buy one.

Additionally, OSX is no Linux replacement. The shell environment is completely different: beyond basics like cd, rm, and ls, the BSD userland behaves nothing like GNU's.

Funny; I am using a Linux desktop right now. Not dead yet.

I am using it too, because I like it. I didn't like Unity nor Gnome Shell 3 though; and as Gnome Classic lost support it became uglier. But now I found Cinnamon to be a better alternative to anything else. I am very happy with Linux on the desktop right now; I don't think I'd be happier with a Mac or with Windows, not at all.

by choice though, or out of work necessity?

I use Linux as a desktop OS out of choice. Originally, no - because I couldn't afford a MacBook, but I've learned to love it and I don't have any need to switch.

Linux desktop is an OS for people who are willing to invest time to get an utterly fantastic, fascinating experience. It doesn't suit modern instant-gratification culture, no, and yes, it takes experience and expertise to get the most out of it, but once you get there, it's bloody amazing.

I'd recommend having it as a small hobby as well as tool, so you can spend time learning and figuring things out and broadening your knowledge.

Disclaimer: Using Arch Linux with the i3 window manager. No, it's not for everyone, but I absolutely love it.

By choice here and have been using it for about 6 years now on the desktop and I have no plans of migrating anywhere else.

I have a Linux desktop at work and one at home. I actually see the modern Linux desktop as absolutely amazing compared to where it was a decade ago.

I also have a Mountain Lion MBP, but AFAICT, I will be leaving the Apple cart after it dies and I'll be 100% Linux.

Do you really think there are no people that use Linux by choice?

Choice here too - have been running Linux at home for almost 10 years and I never regretted it. There are almost no downsides, except carefully choosing what hardware to buy and the lack of Photoshop (though Wine helps with that now).

I use Windows out of necessity at home to play my video games. I use Linux by choice as my IDE at work.


I need it for work, and I wouldn't have it any other way.

Actually both, home and work. My children's friends think I am a hacker since my console looks like a hacker's computer.

Not this post again. Those who thought Linux could compete with heavily subsidized Windows on laptops, or with OSX and Apple's flashy interfaces, were dreamers.

Linux has been for those that like to get dirty, and it is doing that job quite well. Canonical came a bit late to the party and wasn't large enough to matter. RHEL just went after the servers. To make a fair comparison, Linux should have had a big player backing it strongly on desktops/laptops 10-15 years ago (like Google is doing now with Android). HP and IBM made their half-assed attempts, but they were never really behind it completely.

I love Linux, and as a developer, use it as my main OS (Ubuntu). It is so easy to develop on, and its package management is superb. I don't use the desktop per se that much, and am usually command-line driven.

I have a Mac, and use it for some things, at times. It's nice, for sure, but I love the openness of Linux, even though, of course, there can be many very painful hardware issues (video, sound, etc), all of which I have experienced at one time or another.

I am wondering - I hear Google is working on an "Android desktop". Would that perhaps change things regarding the "Linux desktop" a bit?

An Android desktop might be excellent. However, a lot of the Android stack is not open source or free software.

An Android desktop will be a proprietary desktop built on top of an open source kernel.

It might be terrific, but it won't fulfil the dream of a free software OS and desktop.

Back in the day, before setting up Linux was a breeze, I got tired of mucking around with configuration and such just to get a usable Unixy desktop and environment. So the day OS X Jaguar was released I purchased a Mac.

Now if I need to fire up Linux for a project (usually for a microcontroller or such hardware that needs C), a virtual machine or appliance that I can launch from Windows 7 does the job. This is also how I keep Windows 8 contained, safely in a virtualized box, so that I don't have to deal with it unless I need to... ;)

The Linux desktop never got killed! It never was really living. As long as Linux is not sold on computers, it will never spread. Maybe things will change in the future; Canonical is doing an insanely great job bringing Linux to the masses. But personally I think Linux will take off in new markets (China, Brazil, etc.), not in already established ones.

By the way, in my opinion only a small fraction is buying Macs because of OS X; it's the hardware. The design and usability of Ubuntu is a lot better than OS X's at the moment.

First of all, the Linux desktop is not dead.

As I wrote on my blog recently:

"In the [past three years], Linux has grown — albeit slowly — in desktop usage. After nearly 2 years of no growth (2008-2010, lingering around 1% of market), in 2011 Linux saw a significant uptick in desktop adoption (+64% from May 2011 to January 2012). However, Linux’s desktop share still about 1/5 of the share of Apple OS X and 1/50 the share of Microsoft Windows. This despite the fact that Linux continues to dominate Microsoft in the server market."

It may be in third place in a desktop market with primarily three OSes, but usage has never been higher.

As I discussed in this article, most of the original reasons that stopped Windows / Mac users from using Linux years ago are no longer valid. However, the irony is that it's easier than ever to get by with a Free Software desktop, but harder than ever to avoid proprietary software and lock-in, thanks to the rise of SaaS and the personal data cloud.

I agree with de Icaza that the "Open Web" is more important these days than a Free Desktop. But the linked Wired article's conception of Open Web refers to things like HTML5, JavaScript and CSS. These aren't the problem. They are an open delivery mechanism, yes, but usually for proprietary software.

Modern SaaS applications accessible through the web browsers using open web standards are the modern equivalent of an open source Perl script wrapping calls to a closed-source, statically-compiled binary.

You can read more about my thoughts on this in "Cloud GNU: where are you?" http://www.pixelmonkey.org/2012/08/18/cloud-gnu

In a Q&A session at Aalto University in Finland, Linus addressed the question of why Linux never took off on the desktop: the lack of being a pre-installed OS. He mentions that without preinstalled operating systems there's no way to gain a significant market share in the desktop segment.

The whole talk is well worth watching: http://youtu.be/MShbP3OpASA?t=23m45s

That's just bullshit. The Linux desktop maybe isn't broadly accepted or mainstream, but I don't see the problem in that -- after all, Linux remains a system for power users, even if some distros want to change that. And there really is no better desktop environment for those people than the Linux desktop. Windows is shit incarnate, so let's not even begin to talk about it. What remains? Mac OS. Sure, it has a more accessible GUI, but not a more efficient one. I can't think of anything more elegant than a tiling WM, be it awesome, wmii or xmonad. Everything based on moving a cursor just feels awkward in comparison to the simplicity of ~5-10 keyboard shortcuts. And tiling also means that I always have everything in front of me. Fumbling around to find some window is HORROR.

I think the Linux desktop simply has more options for experienced users. I simply see no way I could be more productive with a GUI designed to cater to lusers.

It's still very much alive if you don't give a shit about normal UX conventions or popularity, and there are hundreds of thousands of excellent 3rd party applications that run perfectly.

It's getting really irritating when someone who's jumped ship to OSX declares it "dead" because they have a shiny iDevice and an expensive laptop.

Largely the same issues that killed UNIX as a viable desktop alternative are killing Linux as one now: fragmentation and lack of consistency across different distributions.

This is compounded by most distributions' lack of a centralized vision of how everything fits together. They are merely collections of individual parts, rather than collections of parts designed to work well together, and they lack polish as a result. While the lack of centralized vision was fine for SunOS circa 1992, it simply doesn't cut the mustard in 2012.

Ubuntu seems to be trying to push such a centralized vision with Unity, but I fear they lack the clinical editorial willpower to make the hard decisions required to see it through to its ultimate conclusion.

Comparing Linux to OSX makes no sense to me. Linux has always been missing features when compared to alternatives; the GNU system was actually written to emulate the alternative.

But the GNU/Linux project had a very different objective: fighting for freedom. If freedom is still the driving force, then we should encourage the enthusiasts and get back to work on improving Linux, as has been done for the past years. By doing so, Linux has already reached excellence in some fields.

If you're just competing on features, you'll be missing some great benefits and enjoyment. And to be honest, in terms of features OSX isn't that good either as Windows is still used by the majority for one reason or another.

Miguel's affection for his iPhone is a bit unconsidered and superficial.

But anyway, a more interesting question could be: what does it take to bring an ex-Linux user, now a happy OSX user, back to Linux?

I used Windows for 3 years, then Linux for 2 years. During that time I did a lot of installations (mostly Ubuntu and Debian) on a lot of different devices, and while fighting with drivers, minor display problems, and spoiled Windows users I lost my faith in Linux as a desktop OS and switched to OSX.

I can just speak for myself, but these few points would bring me back to Linux in no time.

Presenting Distribution "Utopia"

1. No X11-based display stack; it is replaced with something conceptually simpler (like Cocoa).

2. (Multiple) monitor recognition 100% accurate. (Probably connected to Pt. 1)

3. The audio setup is not much worse than the one in OSX.

4. Throwing Gnome and everything that is based on GLib out. It's 2012; there are alternatives to faking OO in C. Qt isn't allowed either.

5. Throwing APT out. No more dependency management for a desktop OS, please. Then kill Perl as a requirement for running an OS.

Ahhhhh, I feel better now :-). This is the opposite of what Miguel demanded; he cares about backward compatibility.

When I think about it, "Utopia" would be similar to Android. No fear of throwing old stuff out.

Android as a foundation for a new desktop Linux?

1. If this is true, and it seems right to me, maybe some of the massive effort put into designing new GUIs for Gnome/KDE/etc should be put into hacking the look and feel of the OS X desktop?

Unsanity ShapeShifter hasn't worked since OS 10.4

and I know about

http://magnifique.en.softonic.com/mac - 10.5 only

http://www.marsthemes.com/crystalclear/ 10.7 support claimed, but it's not very radical. I'd love xfce's window look controls or a Stardock windowblinds.

I know Apple don't want anybody to do this. I know they will deliberately introduce changes that break hacks. But as I said, how can it be more effort than Linux?


2. To try to prevent OSx86 hacks, DSMOS.kext uses crypto to prevent the system running essential UI elements like Finder, SystemUIServer, etc. Can't we build our own versions of those parts? http://en.wikipedia.org/wiki/Apple%E2%80%93Intel_architectur...


3. Is this true?:

Linux desktop - dying, dead

Windows 8 - trying so hard to copy OSX/iPad/Springboard/Launchpad that everybody is gonna hate its TWO UI's! (dying?)

Mac - winning, won (by default?)

I use both KDE and OS X. Some details are different, but the ideas behind them are mostly the same.

This article hits so many sore spots right on the pustulent scar tissue.

I had run a Linux desktop (a Debian build, mostly with KDE) for a while and kept getting hammered with random stuff breaking for random, and often poorly considered, reasons. I gave up and went back to running a Windows desktop with an X server to pull up windows on my Linux box.

Then I went to work for Google, and they did a really good job of running Ubuntu as an engineering desktop (calling their distro Gubuntu, of course). I thought, "Wow, this has come quite a ways; perhaps Linux has matured to the point where there is at least one way to run it reliably." And so I installed Ubuntu on my desktop and tried that for a while.

For "using" it, it was for the most part ok if once every few days I did an apt-get update/upgrade cycle. For developing it was a real challenge. Pull in the latest gstreamer? Blam blam blam things fall over dead, update their packages (sometimes pulling git repos and rebuilding from source) to get back working, and now apt-get update/upgrade falls over the next time because you've got a package conflict. It is enough to drive you insane.

Proud Ubuntu user here. Ubuntu 12.04 is not bad at all. Supports the fancy font he used on his blog. Flash is working. WebGL is working. LibreOffice opens Word docs when I need to. Audio is working.

I have Windows 7 on the other partition mainly to play games.

There was a minor issue with Ubuntu trying to melt the CPU in my laptop the other day, but it's not so bad since I upgraded, and I found this powertop thing that also helps.

I guess if you don't mind that OSX uses 5% active CPU just for flashing bubbly buttons, that's alright.

I like OSX; I think it does have a good ecosystem for GUI apps, but at pretty much everything else it fails. It's a performance nightmare, and the filesystem makes me want to punch a kid in the face every time it kills the CPU (yes, sorry, I also don't think you should be doing OpenGL in JavaScript, but hey).

Now, with all that mentioned above, I do wish there was a better ecosystem for app development. I mean something like Xcode 3, not 4. Yes, we have Qt, yes, we have Glade, but building an app with an interface designer, bindings, and MVC concepts just helps a lot.

You can do most of it with Vala, granted; it's just documented worse and not as "round", there are no standard concepts to follow, etc. And yes, I do like my Linux customizability, but we have things like the CERT best practices for secure C coding. Why can't we get something like that for Linux GUI programming?

PS: GNOME 3 can go right back where it came from.

In like 2006 I switched from Windows XP to Linux. This was before Ubuntu was what it is today. I learned using Slackware and eventually switched to Gentoo. It was cool and gave me nerd cred when I went to college.

I switched to OSX for exactly the reasons the author mentioned. The fact that I have an awesome UI + ability to use the shell all day is a huge win for me.

The machine that ran Windows XP can run Linux (http://wiki.debian.org/Hardware). So you can switch to Linux any time.

However, people cannot switch to OSX. The machine that ran Windows XP and Linux WILL NOT run OSX. So, no, you didn't switch to OSX, you purchased new hardware that is strictly controlled (motherboard, video card, etc).

Then, in an incredible blunder, de Icaza said, "Many hackers moved to OSX... working audio, working video drivers, codecs for watching movies..."

Uhhhh, yeah, if Linux gave up on supporting scores of hardware platforms and hundreds or thousands of hardware components, OF COURSE it would have working audio, video drivers, and codecs.

I installed this thing called "EEE Ubuntu" on my little EEE PC a few years ago. That "EEE" in the name meant, or so the website very strongly implied, "specially designed for the EEE PC". I think there were just 2 fundamental models of it at the time, both very similar in terms of underlying hardware.

Needless to say, the sound didn't work. And the wireless didn't work. When I clicked "suspend", it said it was out of swap space. When I closed the lid, it crashed.

I don't get all that OSX "awesome UI" everyone talks about. With my previous experience on Windows/Linux, everything seems wrong and backwards, and there is no way to configure or change it. Also, the shell looks like a crippled version of Unix.

As someone who uses the shell all day, I found that there is no alternative for portage and Konsole.

And since most things are web-based now, a vanilla KDE setup + Icon-Only Task Manager is awesome enough for me.

For what it's worth, GNU/Linux never really was about some desktop conquest, so this whole discussion "What killed the Linux desktop" is quite absurd.

That aside, what we have here is a thread apparently devoted to shitting on the work of people who built something for fun and gave it away for free.

Good job folks!

I would have to say Apple killed Linux. As many others have noted here, OSX has improved to the point where many Unix admins run OSX and it runs the tools they have for their work. Also, Mac hardware is better than PC hardware, so you buy a MacBook with OSX and you are happy.

