What killed the Linux desktop (tirania.org)
344 points by foolano on Aug 29, 2012 | 378 comments



This is the money quote:

> The second dimension to the problem is that no two Linux distributions agreed on which core components the system should use.

Linux on the desktop suffered from a lack of coherent, strategic vision, consistency and philosophy. Every engineer I know likes to do things a particular way. They also have a distorted view of the level of customization that people want and need.

I like OSX. Out of the box it's fine. That's what I want. I don't want to dick around with window managers or the like. Some do and that's fine, but almost no one really does.

Whereas Windows and OSX can (and do) dictate a top-down vision for the desktop experience, Linux can't do this. Or maybe there's been no one with the drive, conviction and gravitas to pull it off? Who knows? Whatever the case, this really matters for a desktop experience.

I have two monitors on my Linux desktop. A month ago full-screen video stopped working. Or I guess I should say it moved to the center of the two screens, so it's unusable. I have no idea why. It could be an update gone awry. It could be corp-specific modifications. It could be anything. But the point is: I don't care what the problem is, I just want it to work. In this regard, both Windows and OSX just work. In many others too.

I can't describe to you how much torture it always seems to be to get anything desktop-related to work on Linux. I loathe it with a passion. I've long since given up any idea that Linux will ever get anywhere on the desktop. It won't. That takes a top-down approach, the kind of thing anarchies can't deliver.


> I have two monitors on my Linux desktop. A month ago full-screen video stopped working. [...] In this regard, both Windows and OSX just work.

Just because it's never happened to you on OS X or Windows doesn't mean it doesn't happen. OS X 10.6.7 broke the output on my 13" Macbook for either of the two external displays I own. Both worked fine previously, when booted from the install CDs, or from Linux on the same machine.

Plugging in my Firewire audio interface on the same machine spun the CPU up to 100% and kept it pegged there. A lot of good having a nice mic pre-amp does when you get a high pitched fan whir added gratis to all of your recordings.

It's silly to pretend that Mac is somehow perfect in these matters. In my experience it's only been marginally better than Linux, if at all. And with Linux you have some hope of finding a solution, whereas for OS X you're pretty much hosed.


> It's silly to pretend that Mac is somehow perfect in these matters.

Straw man. Nobody said OS X was perfect.

> In my experience it's only been marginally better than Linux, if at all.

I used Linux on my desktop for several years and have now used Macs as well for several years. I won't say this is bullshit because I don't think you are lying about your experience, but I think you are extrapolating way too far.

> And with Linux you have some hope of finding a solution, whereas for OS X you're pretty much hosed.

Forums and mailing lists are "hope for a solution" while the genius bar is "pretty much hosed"? How on earth did you arrive at this conclusion? That just doesn't seem reasonable.


I don't see any material difference between "hardware compatibility just works" and "hardware compatibility is perfect".

As for fixing the issue, both of my issues were kernel issues, as are most hardware compatibility issues. On OS X reverting to a previously working kernel would have meant backing up all of my data, reinstalling the OS from the DVDs, installing the combo update to the last working version of the complete OS (10.6.6), restoring my data (but not using Time Machine) and then never allowing another OS upgrade, including not updating the apps that come with the OS.

On Linux it would have been a matter of downloading the previous kernel package and installing it. That's it. The package managers are usually smart enough to figure out that you've forced a previous version of the package and won't replace it in future updates.
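To make that concrete, on a Debian-style distro the whole dance is something like this (a sketch; the exact kernel package name here is made up):

  # install the last known-good kernel, then mark it "hold" so
  # future updates won't silently replace it
  $ sudo apt-get install linux-image-3.2.0-30-generic
  $ echo "linux-image-3.2.0-30-generic hold" | sudo dpkg --set-selections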

Neither fix would have been doable for a novice, but the Linux one is far easier for an expert and generally more palatable. And note, this was Apple's OS breaking Apple's hardware.


I still hold that Linux has a lot of "usability." It is extremely useful, extending itself into every possible use I can think of. It is an extremely useful tool.

That said, maybe Linux in its current stage isn't destined for the pick-up-and-use approach of Windows or OS X. I believe that Linux is not following the same path as Windows or OS X, and because of that it may not end up at the same destination.

I agree with the original article; Linux is produced differently, often with different goals than Windows or OS X. Trying to shoehorn it onto the same path as Windows or OS X may not prove fruitful until someone arrives with the drive to move everything onto the design path that Windows and OS X are taking.


Since you mentioned using Time Machine, why not revert to a good state using that (instead of a re-install)?


I really don't want to have to think about the kernel when I'm plugging in a new monitor.


Neither does anyone else, but when faced with a kernel issue, that's what you're stuck with regardless of which OS you use.


The strawman charge doesn't hold: "Mac is only marginally better" is a valid point, because the GP builds his argument on Mac being better.


Genius bar solution: Wipe device.

Every time.


That's the genius of it.


Really? That reminds me of Microsoft technical support circa 1993:

Step 1: have you tried rebooting the machine?

Step 2: have you tried reinstalling Windows?

I thought those days were over.


To stress it again: OS X has the poorest hardware support for 3rd parties, even compared to Linux. Just "think different" and buy some hardware and you'll see what I mean...


Oh well, now that you've said it the argument is settled. You can't just say OS X has the poorest hardware support ever and not back it up with anything except "go buy some hardware and see what works and what doesn't". Is there maybe a table online that lists these incompatibilities you can link to? Please don't say "do some research", because the onus is on you to prove yourself right, not on me to prove you right.

You also said "even compared to Linux". That's interesting. That statement gives the impression that you're more interested in defending your own beliefs and/or choices than in truly explaining to us which OS has objectively terrible hardware support. "Even compared to Linux" sounds like something an apologist would say. Then when you added in the "think different" line you made it seem even more like your comment was based on some kind of blind loyalty to Linux rather than loyalty to facts.

Me? I've used Mac, a handful of Linux distros, and Windows for a long time. I don't know which has the best or worst hardware support, but I do know when someone says something based on what camp they're in rather than what the facts are.


> Then when you added in the "think different" line you made it seem even more like your comment was based on some kind of blind loyalty to Linux rather than loyalty to facts.

Don't read too far into things, and "...loyalty to facts" eye roll.

Anyway, it's pretty common knowledge to Linux users that hardware/driver support on Linux can be a challenge. Sometimes things work right out of the box, other times you have to do a lot of work and a lot of Googling.

That being said, I don't agree with the parent. For basic hardware needs, Windows and OSX have a fairly high success rate of plug-and-play functionality. For me on Linux, it's about 50/50.


On Windows, a fairly high success rate? This must be a joke. Try installing a graphics card without a driver and you'll see how well it's supported under Windows. I consider that basic functionality. Same for wifi dongles: they usually need drivers installed on Windows, while this is all taken care of at the kernel level in Linux.

In Linux you very rarely have to install any driver. True, some hardware remains unsupported, but the list of compatible hardware that requires no installation is pretty long. On Windows, you'll need to install stuff for most new hardware you try to plug in, no matter what.


The fact that you have to install some driver is irrelevant. I'd rather install drivers for Linux than be unable to use the hardware at all.


I've installed both GTX 580s and 570s with absolutely no problems.


OSX was the first system where a printer worked without even a popup saying that "drivers were being installed" (and I'm not even the biggest Mac fan, I'm typing this on Windows 7...)


The real difference is that on OSX, if something doesn't work, it doesn't work. Under Linux it may work, or work badly, or you can probably hack it to make it work well. I used to think I liked the latter, but there's something appealing about the binary clarity of OSX.


Thank you, that's exactly how I think; it's even this way with Windows. On other systems I would waste day after day of my life setting things up 'just the way I want', only to find it's never gonna be quite that way. Macs just let me accept things as they are and get back to work. Very often I'd get to see, after a while, why things work the way they do (say, window management).

Anyway, this has been almost 100% true from 10.3 through 10.6.x. Recent developments both on Apple's as well as on MS's side have closed the gap between the two systems quite a bit. I still prefer OSX but it's gotten close. The real reason I continue to buy Macs nowadays is the hardware quality.


>On other systems I would waste day after day of my life setting things up 'just the way I want', only to find it's never gonna be quite that way. Macs just let me accept things as they are and get back to work.

This. It's almost as if Linux is a collection of parts and tools from which a sufficiently good designer can build a usable desktop whereas OS X is a usable desktop. That's an exaggeration, but for me there is some truth to it.

Note that Linux was my only desktop from 1997 through 2009.


OSX seems to have had this design paradigm: 'Make everything work in the most streamlined way possible for the default use case (80%). Give some limited options for a few more specific use cases (18%). Cover everything else with CLI integration / defaults system, such that the final 2% can get around with a bit of googling'
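The classic example of that final-2% escape hatch is the defaults system; a sketch using the well-known hidden-files toggle:

  $ defaults write com.apple.finder AppleShowAllFiles -bool true
  $ killall Finder   # restart Finder so the setting takes effect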

This has been highly successful, only I'm afraid that the Unix people inside Apple are losing influence. The sandboxing in ML looks like a mess to me. Not that it's a bad idea per se, but look what it did to our beautiful '~/Library/[ApplicationSupport|Preferences]'! It's tacked on, and it's obvious that this has been a political decision from management rather than engineering. Makes me angry.

On a different note: I've never really used it extensively because I switched to macs in '05, but wasn't Ubuntu pretty close before they switched to Unity? I remember last time I installed it that for the first time I could have a Linux that worked out of the box, including network and graphics drivers.


I've actually switched to doing my work on Ubuntu (using xmonad - I can't stand unity) and iOS (iPad as SSH client) as OSX only cares about the 80% use case now.

Which is fine, it's Apple's decision to take, but it's no longer for me.

(As for Ubuntu - I'm using pretty standard hardware and never had a problem)


Part of it is you won't spend as much time thinking about it. Say you have a wireless keyboard that is a little dodgy on Linux, but works most of the time. But every now and then it fucks up and really pisses you off. If it never worked on OS X you wouldn't even remember it, because you would have sent it back or resold it.


Printer drivers seem fine. Scanner drivers seem fine. USB mice and keyboards work fine. Most Apple products can't really be expanded in many other ways.

I'd say the hardware support is fine for what little customisability Apple allows on the hardware that they sell. Perhaps the Mac Pro might have some issues with internal cards (I honestly wouldn't know ...), but those devices are not affordable for 99% of the people.


Wait what? Every scanner, printer, tablet, mouse, and even color correction device I've bought has worked on my Mac.


Then you've been selective about what you've bought. It's trivial to find devices that are Windows specific and/or have only gotten Linux and/or OS X support after extensive rounds of reverse engineering.

Low end printers and scanners in particular are notorious for skimping on the logic in the device itself and depending on a large blob of Windows driver code, with manufacturers that couldn't care less about supporting anything other than Windows.


I have no idea where this sentiment is coming from. I'm typing this from an MBP with a das keyboard, navigating with a razer mouse and viewing with a samsung monitor, everything works fine.


All keyboards are the same, all mice are the same, spectral analyzers are not the same, software-defined radios are not the same, USB-Serial adapters are not the same, FPGA interfaces are not the same, sensor boards are not the same, laser cutters are not the same, need I go on?


Hardware? What hardware? Frankly, I haven't bought anything over the last 10 years. I have a laptop, you see...


I bought a Samsung Galaxy S3 recently, plugged it in via USB and was not able to browse it because, as far as I understood, Mac OS X has no support for "media devices", whatever that is. I had to download some obscure Samsung "Kies" software through which I was able to get to the filesystem and upload some files. Wasted hours.


Starting with Honeycomb the storage partition is exposed through MTP (or PTP); this way it can be accessed concurrently by both Android and your computer. Before Honeycomb, said partition could be accessed as a USB mass-storage drive, but that required it to be unmounted, thus making all data unavailable to Android while it was connected to a computer.

In my experience only Windows 7 has passable MTP support. On Linux I had mixed results with gvfs and mtpfs (slow and crashy). jmtpfs [1] is a nice replacement, though. Google also released an application for OS X [2] but I can't try it since I don't own an Apple computer.

My solution to the problem was to install a WebDAV server (such as [3]) on my device (Galaxy Nexus) and access it wirelessly.

  [1]: http://research.jacquette.com/jmtpfs-exchanging-files-between-android-devices-and-linux/
  [2]: http://www.android.com/filetransfer/
  [3]: https://play.google.com/store/apps/details?id=com.theolivetree.webdavserver
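For what it's worth, jmtpfs is about as simple as FUSE filesystems get; a sketch (the mount point is arbitrary, and the paths under it vary by device):

  $ mkdir -p ~/mtp
  $ jmtpfs ~/mtp          # mounts the first MTP device it finds
  $ cp song.mp3 ~/mtp/Internal\ storage/Music/
  $ fusermount -u ~/mtp   # unmount when done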


This is not an issue with Apple or Samsung, it's an Android decision.

Support for the USB Mass Storage protocol has been deprecated in Android in favor of MTP (Media Transfer Protocol) starting with Ice Cream Sandwich.

The experience in Windows and on Linux is just as poor with this phone. (I love the phone, though.)


Why can't Apple support MTP? Linux gets cited for lack of hardware or protocol support (MTP to an Android device works fine in Linux, btw), but it's not Apple's fault when they can't or don't provide drivers for common, standardized consumer electronics?


It seems fairly obvious that the reason is that facts which do not fit the narrative are not talked about. Confirmation bias in action.


MTP does not work fine in Linux (specifically, Ubuntu 12.04). You can read the files but not write arbitrary files into arbitrary folders.


I haven't had any trouble using a Nexus 7 with Windows, it's worked like a charm. What's supposed to be the huge problem with MTP, exactly?


I take that back, my Galaxy S III does work as expected using MTP with Windows 7. Previously my workplace was using an older Windows.


Oh my

This is 100% to blame on Samsung, sorry.

My Samsung phone is seen as a USB device (and it works on Mac OS X), but maybe in the newer models they removed this functionality (and called it a feature).

People may complain that iPhones need iTunes, but then again, it's iTunes, not the gigantic pile of crap that is Kies.


No, it has nothing to do with Samsung. In ICS Google changed how the partitions are handled: http://forum.xda-developers.com/showpost.php?p=11714077&...

It was a pure engineering tradeoff to improve things in the long term. If anything, OEMs would have preferred the old 'just works', sub-optimal, gives-an-excuse-to-obsolete-a-phone, lower-support-issues solution. The ball is now in the operating systems' court to support this standard in a way it wasn't envisioned to work a few years ago.

Even Ubuntu has been slow on this front.


Nice explanation, if it's to facilitate upgrades I'm all for it!


That's the thing about Samsung. They copied the bad stuff as well: weird connectors, iTunes / Kies.


This is Samsung not following USB standards.

OS X has perfectly fine support for media devices, and has worked fine with every digital camera, SD card or other card reader, etc., that I've attached via USB in the past decade!


Actually it's lack of MTP support in OS X.

http://en.wikipedia.org/wiki/Media_Transfer_Protocol


Wrong. It's Google's decision to abandon direct mounting of Android drives in ICS in favor of the MTP standard.

Which is the right move in the long term, although it's certainly causing some pain in the short term.


Everything under the sun supports UMS; that OSX does is nothing special. That's not what his device was using, though.


Laptops are not hardware now?


I've been using OS X going back to the betas. I've never seen OS X have problems with third party hardware that conformed to standards.

I have seen problems with third party hardware that required its own driver (poorly written generally) or that was half assed crap that was designed to work with windows but marketed as supporting standards like "USB".

Meanwhile, Linux has trouble dealing with the computer itself, let alone getting hardware attached to it. Constant pain in the ass.

Currently I'm in a country in a different hemisphere from the USA. I walked into a store and bought an off-the-shelf no-name-brand DisplayPort-to-HDMI adapter. Worked, no problem.

I've gotten used to being able to do that... and haven't had much trouble getting things to work with OS X in years.

Course, I also stopped buying things that require custom drivers-- that's a clear sign you're not going to have a good time.

To claim that Linux is better in this regard is, quite frankly, asinine.


In my experience, Linux works great with standards-compliant hardware. Linux is typically the first platform with support for a new standard. The problem is there is so much hardware that is not standards-compliant.

Your comment looks about as asinine to me as the parent comment does to you.


Being the first platform to support it in the kernel doesn't help much when you then have to wait another few months (or, in the case of Debian, years) until the driver finally lands in your distribution. The alternative is compiling kernel modules yourself, which then get broken again whenever the distro updates the kernel without warning. And even to compile one, you have to follow hints on several websites until you figure out how to build a module for the exact kernel version you're running. (This is all from experience trying to get a Wacom tablet running; I gave up on it when it broke again after a distro update and the compile mechanism also no longer worked for some reason. Now waiting for Debian Wheezy to save me.)
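For the record, the routine being described goes roughly like this on a Debian-style system (a sketch; the source directory and module name are placeholders), and it has to be repeated after every kernel update:

  # the headers must match the running kernel exactly
  $ sudo apt-get install linux-headers-$(uname -r)
  $ cd wacom-driver-source/
  $ make && sudo make install
  $ sudo modprobe wacom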


"Currently I'm in a country in a different hemisphere from the USA. I walked into a store and bought and off the shelf no-name brand displayport to HDMI adapter. Worked no problem."

That really has nothing to do with the OS you're running. The OS isn't driving the DisplayPort<->HDMI protocol conversion in any way (your 'no-name' brand adapter is probably relying on DP++ and just passing through the HDMI signal anyway).

This is like saying your VGA cable is compatible with Macs with a VGA port. Of course it is.


> I've never seen OS X have problems with third party hardware that conformed to standards.

The problem is that hardware often doesn't really conform to standards, even those it claims to.

One of the reasons device support in Linux (and other OSes which target a wide range of platforms) is difficult, is that it's necessary in practice to cope with and have workarounds for buggy and out-of-spec hardware and firmware. Just "coding to the standard" isn't good enough.

Such buggy hardware/firmware is rarely documented as such, and finding these problems and the appropriate way to handle them is painful and difficult work. In some cases the only practical way to figure out what actually works is to reverse-engineer what Windows does (the hardware manufacturers generally make sure that Windows works with their hardware, but rarely make such information public).

Apple's main goal is their own hardware, over which they obviously have a lot of control and information, so they really don't need to worry so much about this.


If the hardware conforms to standards, Linux is the most likely to be able to support it of all three - and that's even without having to install a driver.


>that was half assed crap that was designed to work with windows but marketed as supporting standards like "USB".

A device can follow the USB standards (and be labeled as such) while still requiring a special driver. The two are not mutually exclusive.


See, the way I see it, you're missing the point. Things DO break in OSX as well, just a bit less. Apple has the liberty to control both the (primary) hardware and the OS which runs on it. Therefore it has an edge over Linux in terms of compatibility.

Having said that, barring a few "recent" desktop distros, as they're called, like Ubuntu, Linux has always been a bottom-up approach. You need to install the base, then the userland and so on. For someone who is looking for ultimate customization, Linux makes sense.

But when I am running a notebook, I seriously want everything to work, albeit at a lower performance/customizability.

I always run either FreeBSD or Gentoo on my servers. But my notebook/desktop continues to be OSX.


I had a similar 100% CPU issue with an ExpressCard on an MBP... it was even labelled Mac Compatible.


Among non-ultra-geeky Linux users I know (mostly part-time users), "Linux" does seem to be reasonably standardized as far as desktop components go, because by "Linux" they mean "Ubuntu". It's possible the Ubuntu desktop dominance came a bit too late, though, and still isn't entirely complete.

My own experiences aren't great with Windows, though they're smoother on OSX than Linux. The problem I have with Windows is that stuff does break, but it's inscrutable how and what to do about it, whereas at least on Linux there is usually some way to do something about it if you're tech-savvy. If anything gets borked in Windows, it's wipe-and-reinstall time; back when I used it full-time I probably did that once a year on average.

I find OSX to be poor for package management, though. I've gotten so tired of headaches with installing and managing updates of stuff, and badly interacting upgrades of the OS and separately installed packages and system versions of python/ruby/etc., that I do most of my dev work in Debian in VirtualBox now. In that aspect, it's Debian that's centrally managed with real release-engineering and integration tests, whereas OSX is an anarchic mess of 3rd-party software that nobody's responsible for integrating and testing.


Windows is horrible to use, everything is poorly designed. Linux (Ubuntu in my case) and OSX just work sanely mostly, but OSX is a horrible dev environment, so I moved back to Linux.


I thought Windows was a joy to develop on until I moved to Linux... then I realised MS had to develop all these wonderful dev tools to disguise the fact that the APIs are disastrous and the command line is useless...


Amen. That's the thing. You don't realize how many hoops you've been jumping through to do the same things that devs on Linux or OS X do effortlessly until you make the switch. But people don't make the switch because they have this blind loyalty to whatever they're using. So whenever they see someone else loving a different OS, they dole out ad hominem attacks, call the products overpriced, or too complicated for normal users, or elitist. It's all to do with justifying one's decision to use X despite evidence that Y may be better or just as good, and possibly some jealousy in some cases.

In any case it's all dumb blind loyalty the vast majority of the time. And that applies to Linux, Mac, and Windows. You can't have a discussion about operating systems with 90%+ of people because everyone just defends their favorite and facts are distorted to suit those preferences. It really is just about useless to have a discussion about operating systems as "substantive" or "meaningful" discussions about the subject are about as rare as unicorns.


Actually, the OSX development environment is only horrible for cross-platform / open-source application developers.

But if you're an Apple pet developer, then you'd just use Xcode/Xcode projects/etc. for all your needs. And you'd be very happy with that.


As a cross-platform open source developer that uses Mac OS X and Linux about the same (Mac OS X on desktop, Linux on laptop), I don't see what's so horrible about Mac OS X in this regard. At worst, it's the same. I don't prefer either for programming (I do prefer Mac OS X for other things, though).


Well, developing using the native toolchain is horrible, because of the legacy gcc 4.2 and the unstable, often broken clang. At least that was the case about a year back. No, scratch that: it is still the case. I just remembered helping a coworker hack some open source package to make it compile with clang. We lost an hour or so, completely unnecessarily. It would have taken him 30 seconds to apt-get it on Linux.

And well, not using the native toolchain and using something like MacPorts feels like I'm under Cygwin all over again. No, thank you. I love the 13" Air, but as a development environment Macs are quite broken.

Plus that Home key on the full desktop keyboard... If you're a Linux+Mac user, you know what I'm talking about :)

Curious, what package management system are you using to install libraries/packages? Ports? Fink? Homebrew?


Okay, our use cases are very, very different. I mostly do Go now and used to do lots of operating system development and embedded work. For Go, your compiler complaint does not apply, and for embedded work you need to build or install a cross compiler anyway. There are some cross compilers which you can get with apt-get, but in practice, for the work that I do, there are no distribution-provided tools, and you need to maintain your own toolkit.

Same applies to libraries, you can get everything you want with apt-get, as long as you only want things for your architecture. If you need them for another architecture, you are out of luck.

For operating systems you use virtual machines. I use the host just for running my text editor and for my Unix environment needs. All operating systems I've worked on (Solaris, Windows, BSD, QNX, some other real time embedded stuff) have their own tools and compilers used to build the system.

MacPorts, Fink and even Homebrew are FUBAR, I'm glad I don't have to use them.


>Windows is horrible to use, everything is poorly designed

Without qualifiers, that statement is meaningless. Perhaps you mean it's horrible for some devs? (Millions of others are doing just fine with Visual Studio).

For regular users, dropping them to Linux will be like replacing your mom's car's controls with the controls found in an airplane cockpit (the command line), or playing "who moved my cheese?" with the Linux desktop. Granted, everything is moving to the web these days, so why not just give them an iPad or a Chromebook?


Ubuntu works just fine for two kinds of users:

1.) Grandmothers and the like who just want to read their mail and check the news.

2.) People like me who can and will use 4 + 2 + 1 + 0.5 hours to customize it over the course of 6-12 months to have an almost perfect working environment, instead of living with a slower OS without virtual desktops.

The only thing I can think of that needs tinkering is some online banks that require applets to work, but then again even Windows doesn't come with Java preinstalled.


I think you mean Debian.

I know lots of people who successfully installed Ubuntu that don't know what a boot sector is but are still happy with it. (After all most people use web apps anyway.)


I wasn't talking about Visual Studio; I mean the UX as a whole. It is hugely inconsistent, slow, and it's hard to get sane information about what's going on. Once you are in an application it's generally OK, it's just up to that application, other than things like where new windows are created, which are also broken. Oh, and focus and switching, as things are always getting hidden.


Application UX inconsistency is a Mac thing AFAIK. Yes they all look 'pretty' and 'likable', but not consistent. Most decent Windows apps are modeled one after another and are very consistent. They have such things as status bars, sidebars, menus, list views. Usually standard controls, though times are changing with the advent of XAML and web-like interfaces.

And man, tell me: where should I have gone to get any information about why my _wired_ Ethernet connection wasn't working on a Mac when it worked everywhere else?

[UPD: I'd give iOS apps 2/5 on consistency scale, Android, Linux and OSX 3/5, Windows 4/5].


Lots of programmers were doing just fine with punched cards and teletypes. That doesn't mean that punched cards and teletypes were not horrible. People just learn to be comfortable with their tools.


I really haven't found that to be the case at all with package management on OS X. Homebrew is almost always a joy to use, and on the few occasions I've had build errors using it, I've found the Homebrew community connected to the official GitHub repo issues page to be one of the most responsive, friendly, and genuinely helpful open-source communities I've ever encountered. I agree things were a lot rockier back when the only package management choices were MacPorts or Fink, but Homebrew's a monumental step forward.

I can't say I've really touched Python at all, but as a relative newcomer to the Ruby world I found getting RVM set up and functioning properly on my Mac to be a complete non-issue.
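For anyone who hasn't tried it, day-to-day Homebrew really is minimal (a sketch; imagemagick is just an example formula):

  $ brew doctor              # sanity-check the setup
  $ brew install imagemagick # builds the formula and its dependencies
  $ brew list                # show what's installed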


Discovering Homebrew was wonderful after suffering the broken-by-design ports systems of Fink and MacPorts (a retrograde step from DarwinPorts). But it is not really strong competition for Debian's APT. For example, a day after first using Homebrew, I discovered that the Qt port simply didn't work. I fairly frequently have to install software by hand on OSX that has stable packages in Debian.

It's also true that Homebrew (and MacPorts, but not Fink) is doing something fundamentally less ambitious than the main Linux package managers: it offers a simple dependency system and scripting for compiling stuff, rather than directly installing the end product.


One significant difference between Linux and Windows/Mac is the planning around features and the outreach. I work for a large organisation. The number of times a Linux group has come round to our office to discuss what features are up-and-coming, or should be developed? None. The number of times Microsoft staff have done the same? Some.

Now I don't really expect Linux volunteers to do this, but equally I don't really expect any Linux distro to provide a coherent business desktop, because they are not talking to their "customers" in the same way that their competition is. And the business desktop is a large market, if a bit unsexy.


There was always a dominant player, it seems. Most people used Red Hat (before Fedora) the way most use Ubuntu now.


>badly interacting upgrades of the OS and separately installed packages and system versions of python/ruby/etc., that I do most of my dev work in Debian in VirtualBox now

Well, sure, but (unless I am mistaken) that is no longer on the topic of the Linux desktop.


I'm experiencing similar frustrations after having returned to using Linux (Ubuntu) at work recently. I've spent a fair bit of time having to customise a window manager (awesomewm) just to get a somewhat usable desktop environment. I work at a university with centralised mail handled by Exchange and struggle every day with flaky, buggy mail clients (Evolution, DavMail+Thunderbird). Under Linux my MacBook Air only sleeps reliably half the time, and occasionally I find that, unbeknownst to me, some process has woken the laptop (with lid closed) in my bag, with the battery near critical. The latest apt-get update broke my sound. Empathy hangs for a good 2-3 minutes on load, but eventually recovers. Something mysteriously reverts my xmodmap keybinds mid-session. Other little things, such as LightDM not starting until I hit enter at a black screen post-boot, contribute to an overall experience that feels frustratingly incomplete.

I know that many of these are trivial complaints and that all of these things can be fixed. I no longer want to have to do this however. In my teens I loved tinkering with the linux desktop, designing 'awesome' Enlightenment themes and enjoyed tracking down and fixing these little problems. Now in my 30s I really just want the OS to get out of the way and let me code. OSX does this well, linux really does not. I say this with a fair bit of sadness as there are many things that I like about linux a great deal - the cli, the excellent package management, the ethos behind the OS and the open source community.


I had nothing but trouble with Ubuntu when trying to use anything not tested by Canonical. I suggest using something like Arch Linux, where you're not forced to stick with buggy software for a 6-month cycle.

And I've been really, really impressed with the latest Thunderbird/Lightning releases (14-15). Haven't had any problems at all, like I had with the older releases from several years ago. I used to be a Mail guy, but Thunderbird blows it out of the water these days.


> Linux can't do this. Or maybe there's been no one with the drive, conviction and gravitas to pull it off? Who knows?

I say Mark Shuttleworth is doing this, and no one is paying attention. Unity was a unified vision; it was just different, and no one wanted different. And that is fine for the new target audience he wanted for the OS, which was everyone else in the world. Look at what they are doing now: Ubuntu on tablets, Ubuntu TV. They are pushing a unified vision, and nobody in the Linux community is getting behind it en masse because it isn't everything they wanted. They want Cinnamon, they don't like Gnome 3, where is my Gnome 2, etc. The inherent strife is the problem, not a lack of visionaries.


Yes but most of the people who are "leaving Ubuntu" are just switching the desktop environment or using a system downstream of Ubuntu. It makes for a lot of drama but it doesn't much hurt Ubuntu community support (which is their greatest strength, and why I use Ubuntu) because so many of the packages remain the same.


Two years ago, when it was experiencing a considerable upswing in use, Canonical was in the position to at least partly dictate some standards. And an Ubuntu install also would "just work" (and still does in my experience).

Unfortunately, I think they kind of snatched defeat from the jaws of victory by attempting to go to a "modern user interface". The big problem is that neither designers nor programmers really enjoy incremental progress (designers want a big canvas and programmers want clean code). Gnome 2 really was/is good enough for most people to use easily AND it was/is simple and powerful enough not to stand in the way of the existing users (programmers, Linux geeks, Linus himself, etc.). Whether Unity and Gnome 3 work for average people or not, they certainly alienated the existing Linux desktop users, and that is an important segment.


Yes, yes, that's it right there: Unity and Gnome 3 are both terrible compared to Gnome 2. I wish the Gnome 2 team would re-materialize and come out with Gnome 4, just Gnome 2 with a higher version number.


Coming from the perspective of a developer:

I used Linux as my only OS (sans VMs) from roughly 2008 till 2011. I finally gave up because it handled monitor switching and multiple monitors so poorly (using an Nvidia GPU). I begrudgingly switched to Windows because I owned the hardware already and it was the only option. Windows sucks; the lack of a usable shell kills it, full stop, and nothing else Windows does matters. And no, Cygwin doesn't cut it.

Fast forward to now, the only computer I manually interact with is a macbook air. The old desktop hardware became my file server(running FreeNAS) despite being many times faster than the air.

Through this experience I realized what I actually cared about... I run four 'apps' 99% of the time: Chrome, Sublime Text (everything non-Java), iTerm and IntelliJ (Java). That is really all I care about; anything that prevents interacting with these applications as quickly as possible fails. The support stuff (git, a database, a message queue, etc.) runs anywhere. I _rarely_ interact with the 'OS' in UI terms. I rarely use Finder, and I only use the Dock to empty the trash (I launch everything through Quicksilver). At the end of the day I love OS X not because it has 'awesome' UI paradigms, but because of all the options it's the one I see the least; it may as well not be there.

OS X is the only OS I've ever used that I just don't think or care about. It sleeps, it wakes up, it deals with new screens, it does all the OS shit so I can just use the apps I want to. That's why I'm hooked on using it as a dev machine.


As for the "lack of a usable shell" part, have you looked at Powershell? It comes bundled with Windows, and has a lot of the features that are missing from the command prompt.

It does have some rough edges that may cause you to dismiss it as unusable, but it offers tab completion, colored text output, and a real scripting environment (not DOS batch scripting). Every serious Windows developer I've talked to nowadays uses it exclusively.


> As for the "lack of a usable shell" part, have you looked at Powershell?

No, and I have zero reason to do so. I don't deploy on Windows, and I have no reason to ever deploy on Windows, so why would I bother learning another syntax just to develop on Windows? It really doesn't make any sense. Powershell only makes sense for IT admins who have to manage Windows networks; it makes no sense at all for developers deploying on *nix.


Your needs are strikingly similar to mine, down to the choice of apps and launcher. What I noticed after a week of full time Gnome3 Shell use was that it was similar enough to OSX (but with the niceties of a proper window manager) that I'm finding myself hooked. Makes my ThinkPad a reasonable facsimile of my MacBook from a keyboard-nav centric, "keep out of my way" perspective. I love OSX on the MacBook, but hate it when docked. Gnome Shell gives me a reasonable OSX-like Linux experience on the laptop, but with a great docked experience to match.

Can't speak to the seamless suspend/resume support, etc on your choice of hardware, and I acknowledge that if that kind of stuff doesn't work, it'd be a dealbreaker.

Just saying, Gnome3/Shell is at least worth checking out if you like the OSX workflow and use just a few apps day to day for development.

Thanks for the writeup of your experience!


"using Nvidia GPU"

Yup, still no xrandr support which makes any on-the-fly monitor config changes very difficult or impossible.

Sometimes, if you use twinview and nvidia-settings from the start and if the moon is aligned just right, you can dynamically configure monitors / video ports. Sometimes.
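For contrast, on drivers that do support RandR, on-the-fly reconfiguration is a couple of commands (a sketch; output names like LVDS1/HDMI1 vary by driver):

  $ xrandr --query                                 # list outputs and modes
  $ xrandr --output HDMI1 --auto --right-of LVDS1  # extend to the right
  $ xrandr --output HDMI1 --off                    # and switch it back off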


I usually have Firefox, a terminal, Sublime Text, an FTP client, an IRC client and a file browser open. All I use is Openbox for a window manager, tint2 for a task manager and dmenu for a launcher. I very rarely have problems with my Linux computer.


Linux on the desktop suffered from weak developer and industry support. Until some years ago, Linux sound-system latency was on par with Windows 3.1's. Using proprietary drivers for X, or even worse for the kernel, is a mess.

So obvious... Linus promised Linux on the desktop around 2000. The reality was different: influential groups just ignored the desktop. Now it's too late; the web is becoming the predominant platform, and operating systems are just a commodity. For me it's no big difference whether I use Linux or OS X (with coreutils etc. installed). In fact I wouldn't even mind working on Windows; unfortunately I don't have the patience to set up a reasonable Unix-like dev environment.

The irony is: it has never been less painful to switch to Linux.


> The irony is: it has never been less painful to switch to Linux.

This is so true. Linux has a bad rap. I gave up on it a couple of years ago due to the infamous "hidden cost of linux". I switched back a couple months ago and have been pleasantly surprised how smooth things are now. A lot has improved in the last couple years in Linux land.


>have been pleasantly surprised how smooth things are now

Which version of which distro?


> I have two monitors on my Linux desktop. A month ago full-screen video stopped working. Or I guess I should say it moved to the center of the two screens, so it's unusable. I have no idea why.

Is this in Chrome on sites like YouTube? If so, I believe it started with Chrome 20 when Chrome switched to its native Flash driver (see http://productforums.google.com/forum/#!topic/chrome/Mi-YgjN...). This happened to me too, and I still haven't resolved it.


If that were the only problem, though, we would have expected FreeBSD to overtake the desktop. The larger issue, unfortunately, is just that people are afraid to switch from Windows to anything other than Mac. Android is helping there, but it will take time.

Never underestimate the fear of the consumer that they will break stuff on a computer they are unfamiliar with. The fact that Windows is actually getting better may actually be a good thing regarding other possibilities of desktop software.


As a FreeBSD user since 2007 I found the desktop options to be just as clunky as Linux, if not more so. It was the only part of the OS that didn't just work when I used it in my travels, so it's stayed on server duty. If there were an open-source window-management system on par with Mac's version... whole new world. (There might be; I haven't looked since 2009 or so.)


"Every engineer I know... They also have a distorted view..."

cletus, you are one of the few people with enough karma to always grab and hold a top-post spot who _actually says things worth saying_.

Right on.

As for "someone with the gravitas", never say never. But they might not use the Linux kernel and GNU as their "clay" or "base material".

I've thought about such a person, and one thing I believe is that they would be foolish to try to "sell" such a simple-to-use system to Linux users (and many of those users would, alas, include engineers like the ones you describe: set in their ways), or to those engineers who worship Apple, with their belief in some mythical "user experience" [translation: _their_ experience].

In my opinion, a person with the gravitas would also need the vision to see that the target user base that is open to change lies elsewhere. The users need to come with an open mind.

I don't know what will happen to Linux. But the binary blob problem keeps getting worse. Too many Linux users are happy to accept the blobs. They don't want to read source code. They just want working devices. That's understandable. But in my opinion it's not a small problem in the long run.

Apple, on the other hand, is flat out abusing its power. Not only over end users but over developers (who are of course just a particular class of end users). They are standing in the way of people's general education about computers. Keeping everyone dumb may give some developers a warm, fuzzy feeling as they watch their bank accounts grow to new levels, or dream of it, but I do think Apple's conduct, seen for what it is, will trickle down past just us fanatics on HN and elsewhere on the web. People are going to figure it out.

If I was a monopoly-lover, if I loved to see "winner-take-all" in IT, as if that improved the lot for any of us (I am not and it doesn't), then my money would go on Amazon. They seem to be making the fewest mistakes. Great things will come from AWS. It is the democratisation of hosting - a sharing of power, risks be damned (unfortunately, it may also mean the monopolisation of hosting as we've never before seen). In "Dropbox", we are only seeing the very beginning of what's possible.

In my view, Apple is a "disabler" (everything they introduce is restricted) while Amazon is an "enabler" (generally: they open many more doors than they close).


> I just want it to work...

We must be doing different things, because I've been on Linux desktops with zero-to-minimal effort since 1999.


Agreed that things should just work. In case you're still looking for a fix for the dual-monitor full-screen issue: I had the same problem in Chrome. It started when they introduced built-in Flash. Disable Chrome's built-in Flash (via about:plugins, if I remember right) and everything is back to normal.


Maybe by designing a set of base interfaces, with extensions (à la OpenGL, W3C)


relevant (and funny cause it's true): http://xkcd.com/963/


Nvidia card?


I think some of this is perceptive. It's true that the attempt by both Canonical (Unity) and Red Hat (Gnome 3) to sort-of-incompatibly break away from the so-close-to-standard-that-it-hurts-to-type-this Gnome 2 environment did a lot more harm than good, at least as far as platform adoption goes.

And clearly OS X is an extremely polished Unix and is going to appeal to the more UI-focused of the hacker set. And Miguel is definitely among the most UI-focused of the hacker set. He's also an inveterate "platform fan". Much of his early work was chasing Microsoft products and technologies, of course; now he's an iPhone nut apparently, and that doesn't really surprise me.

But at the same time the Linux desktop was never really in the game. I use it (Gnome 3 currently) and prefer it. Lots of others do. For many, it really does just work better. But in a world where super-polished products are the norm, a hacker-focused suite of software isn't ever going to amount to more than a curiosity. (And again, I say this as someone who will likely never work in a Windows or OS X desktop.)

So in that light, I think the idea that the Linux desktop got "killed" is sort of missing the point. It's no more moribund now than it was before. It's more fractured in a sense, as the "Gnome" side of the previous desktop war has split into 3+ camps (Unity, Gnome 3 and Gnome2/Xfce, though there are other splinter camps like Mint/Cinnamon too). But it's here and it works, and it's not going anywhere. Try it!


> So in that light, I think the idea that the Linux desktop got "killed" is sort of missing the point. It's no more moribund now than it was before.

I strongly disagree. It is losing exactly the sort of person that the author is: developers who, all else equal, would rather use Linux, but who eventually get tired of the BS and just want something that works and that you can actually get software for. I have a lot of sympathy for that.

I write this from a Linux laptop, but that's more out of mulish stubbornness and 15 years of accumulated irritation with Steve Jobs and his dickish business practices.

The last time I got a new laptop I knew I didn't have time to screw around for days with X configuration files. And so I paid a vendor several hundred dollars over list to give me a laptop that JFW. And despite that the sound is still way too quiet. After a few time-boxed 2-hour excursions into whatever sound system they're using this week, I still can't fix it. I've given up.

The only legitimate reason I have for staying on desktop Linux is that I code for Linux servers, and I think it's impossible to really understand system performance if you're not running the same OS. But even that seems shaky to me; hardware keeps getting cheaper and developers keep getting more expensive, so it just doesn't matter as much.

One day some bright Linux spark is going to "innovate" again in a way that I'm expected to put up with their rough edges for 6 months (hello, Unity!) and I'm going to say fuck it and buy a Mac because I just don't have time to screw around right then. Or maybe I'll just want to watch a Netflix movie without hassle, or play the video game my pals are all talking about. And maybe by then it will be a fancy dock for my 8-core Android phone.

Overall, I agree with his point, except that I don't think Linux on the desktop is so much dying as cutting its own throat.


When you write "whatever sound system they're using this week", I know what you mean. There's terrible fragmentation, and buggy libraries get rewritten and replaced by some other buggy thing instead of fixed. That's definitely a pattern. From 2007-9 I even fled to OpenBSD on the desktop to get a more stable OS.

But for the past two and a half years I've used Ubuntu with Xfce, and those problems have become a distant memory. Nothing breaks, nothing gets changed on me (Xfce has moved a few things around, but nothing too terrible). No one forced me to switch to Unity, so I didn't. For me, Linux is more usable and stable than it's ever been. I also seem to see more people running it on the desktop than ever before, and in fact stats from Net Applications show a 50% rise in market share in 2011: http://www.zdnet.com/blog/open-source/is-the-linux-desktop-a...

Inside the Apple-centric echo chamber of HN, it's easy to believe that all the developers have moved to OS X and desktop Linux is dead, but I disagree. Despite its problems, desktop Linux is secure in its (small) niche.


Honestly, I like Unity. Or more accurately, I like where they're going with Unity. And I would like the thing itself if it actually worked reliably. But here I am, running the stock version of the biggest Linux distro on popular hardware, and I have to log out and in or reboot roughly daily.

I don't really believe developers are moving to Apple gear because of HN. I believe it because last time I went on a hiring spree, a lot of people said, "Oh, you're running Linux? Do we have to?"


> I like where they're going with Unity

Fair enough, but why not wait to use it until they get there? Daily reliability problems seem like a lot to put up with just for some cool ideas.

Also, I'm not sure how your interviewing experience supports the conclusion "Linux is dead/dying". What it does seem to suggest is "Linux is unpopular", but that has never not been true (on the desktop).


I'm using it because that's part of the default install. At one point I had the time and inclination to spend futzing with stuff to get it to work, but that's not where I'm at. If something isn't ready to use on a desktop OS, they shouldn't ship it.

My previous interviewing experience wasn't like that say, five years ago.


Then you haven't heard of Xubuntu? It's an alternate flavor of Ubuntu that has Xfce as the default install. Here's an iso that will get you basically the exact setup I have, no futzing required: http://xubuntu.org/getxubuntu/

Already installed stock Ubuntu? Install the xubuntu-desktop metapackage (in Synaptic or 'sudo apt-get install xubuntu-desktop'), log out and select "Xubuntu Session" from the login screen. That's it. I did this on my laptop after realizing I didn't want to switch to Unity. (For bonus points you can remove the Unity desktop apps, but that's hardly necessary unless space is at a premium.)

I don't think I'd put up with Linux myself if it was as much of a pain as you seem to think. For me, Xubuntu has been no trouble at all.


What are you using for sound? I use Pulseaudio on my desktop (custom built with a M-Audio Delta 2496 Pro soundcard) and on my laptop (crappy Lenovo G575) and I don't have any problems at all.

All I do is have this line in my .xinitrc:

   # launch PulseAudio's X11 integration in the background, if it's installed
   [[ -x /usr/bin/start-pulseaudio-x11 ]] && start-pulseaudio-x11 &
And it works perfectly! If I want to change my volume or manage which device it outputs on, I just run pavucontrol


What distro are you using that you are having to start pulseaudio manually? I swear I thought it was default everywhere.


He might be using Arch or Gentoo, or another of the more "DIY" oriented distros. Last time I used Gentoo you still had to really go out of your way to install pulseaudio, though that was four years ago so I don't know if that has changed.


It's Arch, and it's really rather easy to install PA

  # pacman -S pulseaudio pulseaudio-alsa
And if you're ok with having GConf installed (ugh)

  # pacman -S pavucontrol paprefs


I'm using Arch, but this applies to any distro if you don't use a DM and instead use xinit.

I much, much prefer the simplicity of only using what I at least partially understand - the more magic (i.e. a DM), the more shit to go wrong that you don't understand.


It is the default in all major DEs, but he might have a custom X session, using a standalone window manager and extra tools. Or he could work in the console.


Many of the problems people have with Linux have nothing to do with Linux itself.

When I first tried to use Linux about 7 years ago, wireless drivers were a huge problem. Manufacturers didn't provide assistance--not much Linux distros could do about it at the time.

These problems are simply inherent to Linux being a minority platform.


> Many of the problems people have with Linux have nothing to do with Linux itself.

I disagree.

What the OP is complaining about is the same thing Miguel was complaining about, and it's the exact same thing JWZ called out 9 years ago in his CADT rant[1]. It has nothing whatsoever to do with manufacturers failing to provide drivers and everything to do with attention-deficit devs never wanting to knuckle down and do the hard, unglamorous work of long-term maintenance and bug fixing.

Working systems (with known bugs) are thrown out and re-written as new, incompatible systems with even more bugs. Everything breaks every time some idiot decides that they'll rewrite the audio/desktop inter-op/init/logging/whatever subsystem because This Time It'll Be Done Right™. This perpetual treadmill of half-working betas never ends.

It gets old.

[1]: http://www.jwz.org/doc/cadt.html


OpenBSD actually made a huge contribution here by reverse-engineering binary blobs and writing open-source drivers that could be maintained and debugged, which eventually made their way into Linux. At one time, the wireless support on OpenBSD was vastly superior to Linux. (This is a bit of a tangent, but I think props are due.)


> These problems are simply inherent to Linux being a minority platform.

I suspect part of the reason people spend so much time with advocacy is that popularity does pay off, long term: popular platforms get more support, more applications, and more other people who can help you with your own problems.


>So in that light, I think the idea that the Linux desktop got "killed" is sort of missing the point. It's no more moribund now than it was before.

Oh, but it is. Because it has lost a lot of momentum that it had, momentum that was coming from the "we're gonna overtake MS and win over the Desktop" feeling prevalent at the time.

Heck, the guy behind GTK complained recently that he is just one man taking care of the project: the full GUI foundation for Gnome, one that is far from feature-complete at that, with only one guy working on it.

It has also lost a lot of the people and companies that were associated with it at the time, betting on this possibility [of it winning the desktop]. Most companies nowadays support Linux development only for the server stuff, but it wasn't always so. Miguel moved on, Ximian moved on, Havoc moved on, Rasterman moved on, etc etc. Even Adobe quit developing Flash for it.

And it also lost a lot of "alpha geeks" to OS X, which wasn't even commercially available at the time (1997-2001). This "desktop UNIX" came out of nowhere, and it was the one that did what Linux was supposed (and expected) to do, i.e. eat into MS's market share. Well, even OS X didn't eat that much, but 15% is still a lot.

>He's also an inveterate "platform fan". Much of his early work was chasing Microsoft products and technologies, of course; now he's an iPhone nut apparently, and that doesn't really surprise me.

You're saying it like it's a bad thing. Gnome (or Linux for that matter) are also platforms.

And to be frank, his early work was not "chasing Microsoft products and technologies". His early work was Midnight Commander, Gnome, Gnumeric, and Evolution.

He did like the component model (not the Windows platform itself) when it was presented to him during a Microsoft visit, though (the story of this visit should be up there somewhere). But what's not to like about it? A good, clean component model of sorts was also needed for FOSS, and maybe still is.

The phase you describe, IIRC, was several years _later_, when he saw the .NET platform and got hooked.


I really didn't intend it as a swipe against Miguel, but I think it does inform his perspective. And I meant "platform fan" in the advocacy sense: he has a tendency to "fall in love" with favorite products. That's not uncommon in the general population (it's pretty much the norm at HN!), but it certainly is among core Linux people who tend to prefer doing new things in different ways.

> His early work was Midnight Commander, Gnome, Gnumeric, and Evolution.

Clones of Norton Commander, Windows, Excel, and Outlook. To be fair, Gnome 1 wasn't really a "clone" (though it did mimic more than innovate) and mc was chasing a Symantec product, not a Microsoft one.

But to claim that these were innovative new projects is silly. Miguel's career has been one of seeing something he loves in an existing product and duplicating it in his preferred free software environment. There's no shame there. But it's absolutely the same thinking that drove the Mono project.


Pedantically speaking, I wrote Gnumeric not because I loved Excel, or enjoyed using it or because I was trying to be innovative. In fact, I did not even know how to use a spreadsheet at the time.

Gnumeric was the product of the mood in the early days of Gnome: we need to provide this to have a complete desktop offering and we were talented hackers that could get it done. So I did.


I forgot to mention.

Once I started, I enjoyed every second of writing a spreadsheet. It was a very educational process, and also the one that made me grow fonder of strict compiler warnings and errors. From this experience, most Gnome software after it was compiled with warn-as-error.


Well, I'm glad you did write Gnumeric. The graphs module with that tree-like properties box is really nice to use.


Jody Goldberg and the Gnumeric community deserve the credit for the graphics module, not me.

But I am glad you enjoy Gnumeric.


Whoever did the design of the graphics module was cooking on gas. The whole app is light and nice.


>And I meant "platform fan" in the advocacy sense: he has a tendency to "fall in love" with favorite products.

Well, for one, C# and the CLR were not favorite products when he fell in love with them, were they? At the time lots of people thought MS was foolish to try to overtake Java, and it took a couple of years for people to accept and like .NET.

>But to claim that these were innovative new projects is silly.

Sure, they weren't, but it's not like Linux in general has given us many innovative things in the desktop space.


> Sure, they weren't, but it's not like Linux in general has given us many innovative things in the desktop space.

O.o

Two off the top of my head: workspaces, app stores.


Workspaces? You mean virtual desktops? Those date back to 1986, implemented at Xerox PARC (and patented!).

As for appstores, I assume you're talking about the packaging systems, which are similar only if viewed from 50,000 feet.

The packaging systems were not stores. You couldn't buy packages. They didn't serve as an agent to sell other people's personally submitted packages. They had "apps" but no "store". The applications weren't sandboxed to make it relatively safe for them to sell 3rd-party submitted packages. The applications were expected to be open-source.

The only similarity to the app stores is that they had a repository of software and an automated installation system used to install it.


You got me on workspaces, I wasn't aware of the Xerox implementation.

I didn't mean packaging, I meant app stores, see Lindows and CNR.


Wasn't CNR just a subscription-based package repository service?


No, it was an app store like we know them today. Here's an article from the founder (also the founder of MP3.com):

http://www.michaelrobertson.com/archive.php?minute_id=340


Plus the dependencies. With app stores you (usually) get one program and that's it. It works. With package systems you have to install several dependencies for each program, sometimes an absurd number (say, 100 packages to get Gnome). And then there are dependency problems (I had my share in Debian in the old days).


At the core, dependencies are a good thing, although there are downsides to them as well.

I have no experience with desktop app stores, but you wouldn't get Gnome and similar large and complex software from one, would you? Which makes your comparison pretty unfair.


Well, I got Mountain Lion from the Mac App Store, which is even bigger than Gnome (kernel + userland + stuff).


You can't name innovative things amongst a crowd that rejects that innovation can happen. They'll tell you someone else did it earlier, even when they didn't.

For instance, cars were not an innovation according to hacker news, because Horses!


>Two off the top of my head: workspaces

Much older than Linux.

>app stores

You mean package managers. Similar in functionality (getting programs) but not the same thing. Both have nice things going for themselves though.


I've used Linux for years and have never had these problems. I think the issue is that getting everything working requires a deep understanding of each component and the system as a whole. If you just follow advice on forums, you will make things worse because you're doing things you don't understand to a system that you don't understand. That's not going to lead to success. You need to be able to think critically about what's wrong and what needs to change, and then execute those changes. No, that's probably not worth doing if you already like Windows or OS X. If you don't, though...

(And, there are of course Linux-based systems that were built by someone controlling the whole experience, and those work really well. Android and ChromeOS come to mind, though those aren't really desktops per se.)

The other day, someone here was complaining about udev. It has ruined Linux forever, or something. I have a different experience: udev has made my life very easy. I have a rule for each device I care about, and that device is automatically made available at a fixed location when it is plugged in. For example, I have a rule that detects a microcontroller that is waiting to be programmed with avrdude in avr109 mode and symlinks the raw device (/dev/ttyUSB<whatever>) to /dev/avr109. I then have a script that waits for inotify to detect the symlink and then calls avrdude to program the microcontroller. A few lines of shell scripting (actually, it's in my Makefile), and I can just plug in a microcontroller, press the programming button on it, and everything just works. No screwing around with figuring out which device address it's assigned to. How do you do that in Windows?
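
For the curious, a minimal sketch of that setup - the USB IDs, rule file name, and firmware path here are all hypothetical placeholders:

  # /etc/udev/rules.d/90-avr109.rules (IDs are placeholders)
  SUBSYSTEM=="tty", ATTRS{idVendor}=="03eb", ATTRS{idProduct}=="2ff4", SYMLINK+="avr109"
  # then, from the Makefile: wait for the symlink to appear, then program
  while [ ! -e /dev/avr109 ]; do inotifywait -qq -e create /dev; done
  avrdude -c avr109 -P /dev/avr109 -p m32u4 -U flash:w:firmware.hex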


I have used Linux as my main desktop and laptop OS for 17 years, and see these problems frequently. It used to be a system I could keep completely in my head, and understand what was happening and why. But from maybe 2005 onwards there has been a persistent and accelerating trend of replacing the old working and transparent (though possibly a bit baroque) infrastructure with fancy new components that are completely inscrutable. And the lack of transparency gets worse and worse the thicker these layers of garbage get. It's mostly fine as long as things work. But things never seem to work if you have a configuration that differs even a bit from the default.

Want to log in from a terminal and start X? Well, too bad there's some new infrastructure this month that makes sure you'll only get access to the sound device when you log in from a properly configured gdm. Want to modify the keyboard layout? Whoops, the xmodmap format that had been stable for a couple of decades now changed for the third time in a year. Want to add a tmpfs on /tmp to fstab? Well, too bad. Some implicit and undebuggable circular dependencies in systemd will make the system unbootable.

And the sad part is that the problems this new infrastructure is supposed to solve never actually existed. "Great, now audio doesn't work at all. But if it worked, it would have full support for network transparency." It's easy to understand why the problem exists - creating something new tends to be more rewarding than working on the old stuff. It's much harder to see how to fix this madness.


Move to a simpler distribution that won't bother you. I've been on Arch for the last few years because once I set it up, new updates won't add new "functionality". It isn't perfect, but I can keep the entire system in my head. Nothing opaque, nothing inscrutable -- the design philosophy is KISS. It won't hold your hand, but it also won't get in your way.


All the problems with Linux over the last five years can be summed up in one word: Pottering.


PulseAudio doesn't depend on GDM; you just need to create a config for it. What I do (on Fedora) is move the default.pa from /var/lib/gdm/.pulse/default.pa to ~/.pulse...
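
In shell terms, roughly (a sketch; the GDM path can vary by distro, and you may need root to read its copy):

  $ mkdir -p ~/.pulse
  $ sudo cp /var/lib/gdm/.pulse/default.pa ~/.pulse/default.pa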

You can't expect everything to work when you tear out components then fail to configure things properly...


I wasn't tearing out components. I was doing everything exactly the way I'd been doing forever, and it no longer worked. Which is bad in itself, but maybe it's understandable that niche usecases break every now and then. It's just that it's happened so often and for so many parts of the system that it's hard for at least me to think it's isolated incidents rather than a cultural issue.

The truly toxic part is that every single transition adds complexity and reduces transparency, making it harder and harder to understand the system. It's just not the breakage alone, or the complexity alone, or even the lack of transparency. It's the combination of all of those.

At the start of this thread jrockway proudly says that all you need is a deep understanding of all the components and the system as a whole. Back in the day this was not actually an unreasonable thing. But it's been getting less and less reasonable for a long time.

(Incidentally the audio example in my original message wasn't even directly related to pulseaudio. It was a few years back, but IIRC it was some daemon tweaking the device permissions, and something else adding users to a special group in the GDM login path but not the console one.)


Well, GDM tends to handle a lot of initialization that it shouldn't be the one doing; using a distribution that doesn't assume you're using GNOME may help with this.

About the transitions supposedly adding complexity: I don't know about you, but systemd, for example, has greatly simplified configuration and management of services, mountpoints, timers, and all sorts of things.
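
To illustrate - this is a hypothetical unit, not one from any real package, but a complete service definition really can be this small:

  # /etc/systemd/system/mydaemon.service (hypothetical example)
  [Unit]
  Description=Example daemon
  After=network.target

  [Service]
  ExecStart=/usr/local/bin/mydaemon
  Restart=on-failure

  [Install]
  WantedBy=multi-user.target

Enable it with "systemctl enable mydaemon.service" and you're done; compare that with the average few-hundred-line SysV init script.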

I use awesome in Fedora 17 on some early-2010 entry-level consumer Intel hardware; everything is zippy and easy to configure, and it doesn't break on me all the time. I feel bad for you, man.


It sounds like you're agreeing with the article - things work on Linux when you have a deep understanding of them. While it's good to encourage a deep understanding, the group of people using any given OS will always follow a curve of some sort. There will always be beginners; there will always be people who know just enough to be dangerous; there will always be experts.

Requiring everyone to be an expert will prevent Linux from being a dominant OS, because most people do not care enough to gain a deep understanding of any OS, so they'll pick whichever one makes the easy bits easiest to learn.


You'll hit the same problem with any OS. OS X will just get slower and slower, and Windows will become infected with malware. General purpose computers are not easy to use, yet: with the infinite flexibility they provide, there's infinite opportunity to fubar them. The average Linux distribution exposes the flexibility by default, making it seem hard to use. But if you switch to a less-flexible Linux, like ChromeOS, many of the problems go away. So I don't really understand what the article is trying to achieve other than trolling by stating the obvious in an inflammatory manner.


"I've used Linux for years and have never had these problems."

Agreed. That is two desktops and four laptops worth in a decade.

"I think the issue is that getting everything working requires a deep understanding of each component and the system as a whole. "

Not so sure. I'm an end user and tend to just shove the CD-ROM in and cross my fingers. Most 'stuff' works. My stuff is simpler than yours, however (audio interface, cameras, musical keyboard controller).


IMHO, Android and to a lesser degree ChromeOS (as well as other similar platforms) don't quite count as "Linux" for purposes of "the Linux desktop", as they've replaced nearly everything above the kernel with their own stack[0].

[0] Last time I used ChromeOS it was still mostly in line with ordinary Linux (though it might have been partially Gentoo based?), but a rather impractical one, as it's less than simple to run [things that aren't Chrome]. I heard about Xorg getting replaced at some point; I'm unclear on whether that actually happened.


So... one of his key problems is that when (not if) things break, you need to learn more than you ever wanted to know about the low-level internals of an OS to fix it. And you say you've never had that problem because you have a deep understanding of each component and the system as a whole, which is what it takes to be able to use Linux.

But that's exactly one of the major problems he's complaining about!


It's a different philosophy. Some of us come at it from the perspective that we want to know how our machine works. And when something goes wrong, we want to dig into it and fix it ourselves. The monolithic and opaque OSs that "just work" aren't for us. Oddly, the only place I've seen this attitude described/explained is in Zen and the Art of Motorcycle Maintenance.


CLOSED: Works on my computer


JWZ identified the issue Miguel discusses in this post ten years ago; he even gave it a name: CADT

http://www.jwz.org/doc/cadt.html

Also, part of what killed the Linux desktop was Miguel and his total lack of understanding of the unix philosophy, which drove him to create abominations like BONOBO. D-Bus is not much better either.

That he fell in love with an iPhone goes to show he didn't fully appreciate the value of open source either.

We were just yesterday commenting with some friends in #cat-v how Evolution is one of the worst pieces of software ever created, and Evolution is supposedly considered by Miguel and co to be the epitome of the Linux desktop.


How does falling in love with an iPhone show that you don't fully appreciate the value of open source? Tell me about any good open source handset that came out before the iPhone. OpenMoko? I mean, you really had to love open source to put up with it.

Yes, there is still Android but, given how modified the usual handset is, you can't buy a free and open Android on the market either.

So, where is the problem in using a closed device when "open" just doesn't deliver?


There is a big difference between using and loving.


Which leaves your point unexplained. So, let me reply in an equally enigmatic fashion:

There are many reasons for which a thing can be loved.


That you can't understand a love affair with the iPhone just means you don't appreciate good design.

I'm not a fanboy but I did envy my wife's iPhone for years. I like my Android better today, but what choices did you really have when the iPhone came out? It was the only game in town. Being chained to a Blackberry is the only reason I didn't have one.


Thanks for the link!


Speaking as someone running a Linux desktop (and writing this on one), there's not much to say other than I agree. I run Linux because work gave me a PC and there's no way I can write software on Windows. Of course we all have servers managed off in the corporate cloud somewhere that run ssh/vnc etc., but there was no way I wanted to install PuTTY again or miss out on the unix commands that make (work) life more enjoyable, so I installed Linux. I write server software, and client sometimes, but browsers make the operating system pretty much moot. There's more variation between browsers than between operating systems - mobile aside. And when I need to try something on Wintel I spin up a cloud instance and use VNC.

When I'm not on Linux I run OS X everywhere else (and iOS) because it's unix-like and because it works so well. I am sure Windows 7 and 8 are great, but I doubt they have gotten rid of c: or \ as path delimiter or any of the other nonsense that DOS introduced (copied from PIP) back in the dark ages. Why should they? MSFT still runs DOS apps, so they aren't going to change. And choosing between OS X and Linux on a non-work desktop is a no-brainer: Netflix, Photoshop etc etc etc...


/ works as a path delimiter on Windows. And Linux runs DOS apps just fine too, in essentially the same manner that modern Windows does. (Emulation.)


Why is / inherently better as a path delimiter than \ ? DOS inherited its conventions from CP/M and the DEC OSes before it, which used / as a switch character.


If you could arbitrarily pick any character, both would be as good as each other. However, '\' is an escape character in nearly every program that needs such a function, so as things currently stand, '/' is better. That raises the question of when '\' became the mainstream escape character, though.


Because nearly every programming language uses \ as an escape, which means that in string literals you have to double-escape the \, which is error-prone and ugly.
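
You can see the annoyance even in a shell printf, where the format string treats \ as an escape:

  $ printf 'C:\\Users\\me\n'    # every backslash has to be doubled
  C:\Users\me
  $ printf 'C:/Users/me\n'      # forward slashes pass through untouched
  C:/Users/me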


you took the words right out of my mouth - ahh putty what fun :)


I think the article does a great job of explaining the problem, but doesn't explore the ramifications far enough.

Let me give an example: a few months ago, a new version of Skype was announced for Linux. I was excited, since I used Skype 2 for Linux but then it stopped working for me and I couldn't be bothered to fix it. But if you go to the Skype for Linux download page, you will find a few downloads for specific distros, then some tar files which are, statistically speaking, guaranteed not to work.

Long story short(er), I still don't have Skype working on my desktop, because my distro isn't in the list, I can't get one of the other distro packages to work on my system, and of course none of the statically-linked binaries work.

(I could almost certainly get it to work if I was willing to install 32-bit binary support. But it's 2012. If your app requires me to install 32-bit binary support, I don't need your app that badly.)

Steam for Linux, recently announced by Valve, will run into the same problem. I suspect it will actually be Steam for Ubuntu and Debian, possibly with a version for Fedora, assuming you have the proper libraries installed and are using the right sound daemon and graphical environment.

But if big-name software comes out for Linux, hopefully distros will get in line. Do you want to be that distro which can't run Steam? Doesn't really matter if you think that OSSv4 is superior to ALSA and PulseAudio...if Steam requires the latter, you will toe the freaking line, or disappear into obsolescence.


What distro are you running? It must be sort of out there that you're having such difficulties. I'm running Arch which is not widely supported, but user submitted build recipes (think brew on OSX but with way more options) work 95% of the time (e.g., no trouble with Skype, Spotify, etc.)


I'm running Arch as well, though I don't use the AUR. Stuff tends to crustify too much and too easily in there.


Once you install yaourt or packer though it's really easy to at least see if the AUR package works (Skype works fine, e.g.). If not then maybe you're on your own, but if you're having trouble it can still be useful to see how someone else built it in the past.


I'd agree with a whole lot of what's said here, but also add:

One of the big thrusts of the Linux desktop wasn't simply dominance itself, but for it to simply not matter what you were using on the desktop. The Linux desktop fought to produce the first cracks in Windows hegemony a decade ago, but the final push came from the rebirth of Apple and the rise of the smartphone.

Today people happily do their normal productive or recreational tasks from a variety of computing environments: Windows, GNOME, Unity, KDE, OS X, iOS, Android, et al. Probably the majority of (Western) web users use at least one non-Windows internet device.

During the golden age of the Linux desktop everything seemed predicated on reaching exactly this point -- that you wouldn't need Windows, and then, by virtue of competing on a leveler playing field, the Linux desktop would ascend.

But the Linux desktops didn't "skate where the puck is going" -- or their attempts at such missed the mark. By the time we reached the post-Windows-dominance era, the Linux desktops weren't positioned to take advantage of the new playing field dynamics. The rest of the industry isn't even all that concerned with the desktop wars anymore. It stopped mattering very much -- and ironically, that came around to bite the projects in the ass that first got the ball rolling.


I never understood this. Why would market share of Linux on the desktop matter? I've always viewed Linux on the desktop as something for power users and developers, and thousands of said power users and developers are continually developing and maintaining multiple distros and thousands of applications. It's not like it's a stale and abandoned paradigm that's left to die.


Was gonna say exactly the same. People think that if it's not popular among the masses it's "dead". Totally wrong, since it covers a completely different niche.

The author seems to forget that some people actually enjoy configuring and hacking their systems in detail. Also there are the people who hate using the mouse and want to do everything with the console and keyboard.


I think it's because even power users enjoy doing "regular user" stuff sometimes and it's nice to be able to do it all on one machine without resorting to dual boots or VMs.

Having a bigger market share means that hardware/software vendors are more likely to consider supporting Linux to at least some degree.

For example I like to listen to music while programming, so it's nice that Spotify is available on my dev machine.


Linux by no means should be limited to power users. It should be (and is) a universal OS.


> (b) incompatibility across Linux distributions.

This is completely missing the point - a statically compiled end-user binary should be compatible across all distributions of Linux running the same version of the kernel or any newer version.

The only caveats to that are (a) hardware and (b) poorly-packaged software.

(A) is the fault of hardware manufacturers and is less and less of an issue these days anyway; driver issues are becoming increasingly rare.

(B) is easy to solve for any open-source software, as it is the responsibility of the community for that distribution to provide the appropriate packaging. They prefer to do it themselves. And they're good at it - it gets done!

If you want to ship a closed-source binary on Linux, just make sure you don't dynamically link it against any libraries that you don't also ship with the binary. Problem solved.
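
A minimal sketch of that principle (file names here are examples; glibc's NSS modules are a known exception that still load dynamically):

  $ gcc -static -o myapp main.c
  $ ldd myapp
          not a dynamic executable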

Honestly, I can't remember one single instance ever where I have run into end-user software that will run on one distribution of Linux and not another, as long as that principle was followed.


Many sophisticated libraries on Linux use dynamic modules or require components that are configured as part of the system.

Consider D-Bus: if you statically link, but the system changes the format or location of D-Bus configuration files, all of a sudden your app no longer works.

So in theory, yes, this could solve some of the problems. But it requires a massive effort to make the Linux desktop libraries static-linking friendly, which they are not.


Like libc, or libX11 and apparently ld-linux.

Why not just ship a chroot jail to run it in, in case some of those statically linked system libraries read config files which might be under a different path or in a different format?
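
Something like this, as a sketch (the tarball layout is hypothetical, and needing root for chroot is an obvious downside):

  # unpack the app along with the exact libs/configs it was built against
  $ sudo mkdir -p /opt/myapp
  $ sudo tar -xf myapp-rootfs.tar -C /opt/myapp
  $ sudo chroot /opt/myapp /usr/bin/myapp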


Software compatibility in OS X?

A lot of applications break on newer versions of Mac OS X. That's why there are websites like http://roaringapps.com/apps:table

Also, there are a lot of "transitions" that Apple loves doing: PowerPC -> Intel. Java -> Objective-C. Carbon -> Cocoa. 32-bit -> 64-bit. Access everything -> Sandbox.

See also Cocoa docs: "method X introduced in 10.5. Deprecated in 10.6".

I have a few devices that don't work in 10.8.

Basically, what I'm saying is that OS X is a bad example for backward compatibility. Windows is much better at this. Open source software is much better at this.


What killed it is that it didn't have a huge, multi-billion dollar company betting on it (on the desktop) like Microsoft and Apple had. Even Apple with its billions is still at around 5% market share worldwide, so having 1% is still a great accomplishment when you consider that Linux had no support from huge corporations.

Now take the mobile world for example. Linux on mobile had been around for a decade, but it never really took off until a huge company like Google decided to throw its billions of dollars and its great ingenuity at the task. Getting an OS to be popular is just incredibly difficult and it needs way more than just good driver support and/or good software. It needs marketing, talking to manufacturers, dedicated and well-paid devs, designers, UI and UX professionals, sales, R&D and so on and so forth.

Focusing on the technicalities of drivers and APIs is typical of us devs, but it has nothing to do with why Linux didn't take off on the desktop. Sure, Linux failed because it couldn't do some of that well, but why couldn't it? Because it didn't have a huge and focused company pushing for it. How many popular desktop OSes are there? Only 2; I think that's enough to show that it's incredibly hard to get into that market and that only a huge company can make it. Also, let's not forget that Windows was good enough and there was not much Linux could do to attract users. In fact this is still true, and probably why even OS X is still at 5%: Windows is good enough and it's the de facto standard used by 90%+. Having the best UI and UX in the world like OS X doesn't help that much either.


Redhat and Canonical are big companies that did bet the house on Linux. A few years back I was confident Canonical was what desktop Linux needed to get its shit together. And today I'm typing this on a Zareason laptop that's supposed to provide 1st class support for Ubuntu (it even has an Ubuntu key), and the mic doesn't work, Fn keys don't work, youtube videos appear in monochrome blue, and it likes to boot itself up randomly from time to time until the battery drains...


> Redhat and Canonical are big companies that did bet the house on Linux.

No, RedHat bet the house on server Linux, never on desktop. Canonical did bet the house on desktop Linux, but it's a tiny company that has yet to turn a profit (or a significant one).


To fix the blue youtube videos: create a file /etc/adobe/mms.cfg and put the following 2 lines in it:

  EnableLinuxHWVideoDecode=1
  OverrideGPUValidation=true

Some people will probably tell you all kind of reasons why this is bad, but it works.


You're one user. I've been using the same System76 laptop at work and at home for 3+ years. Still works flawlessly (fn keys and all).


I'm typing this on my work laptop running a linux desktop (Ubuntu FWIIW). Our engineering servers at work run linux and, as a convenience, have the desktop installed. As many of my co-workers run linux desktops as OS-X desktops (and the engineers running OS-X or Windows have VMs running linux... desktops).

When I go home, I'll be using my personal laptop running linux. My wife and kids run a netbook with a linux desktop.

The linux desktop may be dead to Miguel, but it works just fine for me, a lot of other people in my life, and a lot of people in the world.

<shrug>


Optimistically speaking, I moved from Mac/PC to Linux on the desktop 9 years ago, dragging dozens of people with me. My experience so far is that everything gets better year by year. Yes, Apple has taken some of the shine off Linux over the last couple of years, but I see this as a gain, because those people will be unixified and better adapted to Linux setups in the coming decades.

I never considered OS X despite some fanciness: it is limited in choice of hardware, it is totally dead on the server side, and its unix-ness was/is crappy, despite being developed by the (most valuable) cathedral in the world. Its support for open source dev tools is miserable despite some recent improvements, its game support is miserable (compared to Windows), and it gets worse in openness day by day. A crowd of ordinary users waiting to be sold trivial app store apps may appeal to some developers, but to the DIY/Linux desktop user OS X looks like a toy Casio personal organizer OS from the 80s.


Yeah, right, because OSX cares so much about backwards compatibility. They care so much that they actively go out and intentionally break APIs, like say when CGDisplayBaseAddress() stopped working in Lion, breaking fullscreen in every single SDL-based game (and by "breaking", I mean the game will actually crash when attempting to enter fullscreen.)


Arguing about the niceties of the UI is all well and good, but the actual problem is far more fundamental.

What killed the linux desktop? Drivers. Mostly graphics drivers but some others as well. Who cares if the UI isn't ideal if the damn thing can't sleep and wake up properly, or if it spazzes out every time I plug in an external monitor.


Hmm.. Drivers seem to work just fine here. Actually up until now I didn't even notice the Linux desktop's supposed to be dead. Looking at it right now it seems very much alive.

Stupid jokes aside, I always had a feeling that the Linux desktop was actually gaining users in the last couple of years, especially since Ubuntu came along. Truth be told, it's still the best operating system I've used, but that question is largely open to taste anyway.

For the last couple o' weeks I had to use Windows 7 at work and oh my god, what a pain in the bottom it is! I've had a couple of looks at OSX and never liked it either. Then again my linux desktop isn't exactly standard either - but that's what I like about it: customizability. I'm the kind of guy who likes to tinker with the system until it's just about right for me. The last time I set up my laptop it took me 3 or 4 days to finish but it absolutely paid off. The result is something I find aesthetically pleasing and extremely usable.

Linux has a couple of show-off features and facts 'n' figures, like running most of the Top 500 and second-to-none package management, but its real power is that it can very well be all things to all people. It's been said a million times, but Linux gives the user a choice, and I honestly value that a million times more than any nice OSX-esque UX or Windows-esque games support.

The catch is that I'm definitely not the average user who wants things to "just work" (although they rarely do).


That's exactly what TFA says! Indeed, it points to something even more fundamental: the APIs (both userland and kernel driver space) move too quickly and are fragmented. That's both why drivers are difficult and why desktop apps are difficult to support.


>What killed the linux desktop? Drivers.

That seems a weird way to phrase it. This is not a new thing, and generally drivers have gotten better over the years. Saying that drivers inhibited the growth of Linux makes sense, saying that they killed it -- not so much.


Absolutely agree. I love Linux, it's a great OS, but drivers are a nightmare. Honestly, I've never had any Linux distro where I could comfortably buy an IO device and not have to worry about compatibility issues. This to me is the biggest problem by far with the OS(es) in their current state.


What do you mean by 'an IO device'? I can count on one hand the number of driver issues I've had, and none of them qualify as IO devices.


What other devices you got?


monitors/graphics cards, printers, modems, network cards, joysticks, etc.


Are you using really old hardware? I've never had any problems with any of the above. Monitors should be plug-and-play if you're using a modern DE (and if you're not, you probably know how to set them up). If you have separate graphics card, you may have issues with battery life, but that depends on the model, and they still work.


I can sympathise with you about external monitor issues. In my office, we all have laptops with external monitors. When we have meetings, all the Windows/OSX users simply unplug the external monitor and take their laptops into the meeting room. The linux users normally just take a pad and pen, as it's too risky to attempt unplugging the external monitor while linux is running.


When have you last used Linux? Drivers haven't been an issue anymore for years now.

I don't actually see any problems with Linux. It's just not meant to be.


As a desktop OS, I tried it again 3 weeks ago. Fairly standard setup: Asus motherboard, Core i7 3770, 32GB of RAM, one Dell display at 2560x1440 and one at 1920x1200 in portrait, base AMD 7xxx series GPU, newest Ubuntu with AMD binary drivers. First the graphics driver locked up on rotating a screen; that required a reboot - even switching to a non-graphical tty didn't work. Then the rotation took, but v-sync was completely disabled on the portrait display, and by completely I mean it was unusable, with 2-3 inch tears on moving content. This of course is a known issue with Xorg being archaic.

I'm sure you'll blame this on AMD's drivers or shoot out some other random technical reason. While i understand the technical issues, I just don't care, I have far better things to do than care about this shit. It flat out doesn't work and I don't care why.


If you don't care and aren't willing to listen, you cannot have a well-informed opinion and cannot pretend you do.


I think you've completely missed my point.

I spent 5 years using Linux as my only OS, and I am very well aware of the issues it has, specifically in the area of display management. The reason I stopped using it is that I don't care to mess with it any longer. I've spent hours mucking with the internals of Xorg and writing sleep/hibernate scripts in an attempt to duct-tape something together that works. I've spent hours trying different combinations of kernel + Xorg + driver to figure out which one won't randomly peg a CPU core at 100%. I've spent hours researching parts to look for compatibility; I've wasted $ on parts just to work around issues with Linux.

What I realized and why I have no interest in linux as a usable desktop anymore is that while this time spent may scratch a nerd itch, it is actually a complete waste of my time and detracts from actually doing something productive, like using the computer to produce, the very reason I own it.

So I don't give a shit at all why it doesn't work anymore. If it doesn't work, it doesn't work, I don't need to do massive research to say "it doesn't work". I don't need to be "well-informed" to say it doesn't work, not working is pretty obvious. In fact doing research goes completely against my goal, which is to spend the least possible amount of time thinking about my computer and the most possible time producing.


I used to be really into the whole free software thing, but have mellowed with age.

However, no way in hell anyone will get me to switch to Mac OS. I am simply too enamored with having an environment that I can hack on if it strikes my fancy, as well as an environment that I can customize how I want it. Despite all its flaws, it still does focus follows mouse pretty well, and not having that would drive me batty.

Also, Apple is an 800 pound gorilla that has always been about Being In Control. The Samsung lawsuit wasn't anything new:

http://en.wikipedia.org/wiki/Apple_Computer,_Inc._v._Microso...

I just don't want to be part of that kind of walled garden.


I'm writing this from my laptop which is running Ubuntu as its desktop.

I don't really see how the Linux desktop is dead. I've been running the same OS on this same laptop since 2007. The only upgrade I've added is an SSD and an extra gig of memory. It's still pretty speedy and I've never had any problems.

I use a Macbook Pro with OS X at work because that's just what I was issued by default. I hate it. I hate the over-reliance on the mouse, on gestures, the abundant and tedious animations; I hate the crappy ecosystem of package repositories and how most of the packages are broken or completely mess with the system; I hate never being able to find where any of the configuration files are or where something is installed; I hate the plethora of ways you can start and stop services; the confusing GUI; the masochistic meta-key layout; the awful full-screen support; and the complete lack of customization options.

I've had much better experiences with the Linux desktop for 95% of the things I do.

Now before some OS X fan-person decides to point out how woefully misguided and ignorant I am, my point is that there are different folks out there who want different things from their desktop experience. Apple gets to decide top-down what that experience is all the way down to the hardware. I prefer a little more flexibility. I like being able to swap out my own battery or adding a new memory module when I need one. I like being able to switch from a GUI desktop to a tiled window manager. Some folks don't -- there are Linux distros that hide as much of that as possible. Either way there are plenty of options and I think that's a good thing. Competition breeds innovation and even though I don't particularly like Unity I am glad to see people trying new things.

The Linux desktop isn't dead. It may just smell funny. You may switch to OS X and wonder why anyone could possibly want anything else. I just gave you a bunch of answers.


There's room for many approaches, of course. While the perfectionism (or is it lack of pragmatism?) of Linux and its developers may well have held back its wider adoption on the desktop, there's a lot to be said for its development community's single-minded pursuit of quality and correctness.

As well as Linux's presence in the data centre, witness the success of 'embedded' Linux: many TVs, routers, set top boxes and other bits of sealed-box electronics all run on it. It's broad in its scope because of the large team of divergent interests working on it, and it's able to support those systems because it's been well made as a direct result of that team's philosophy. Is it really so bad that the average Facebooker doesn't want to use it?

It really is very, very hard indeed to be all things to all men and no single system around today can make that claim. Linux has its place in the world of computing, just like Android, Windows, OSX and everything else.


There never was a "Linux desktop". Linux is a kernel. GNU is a set of utilities. And X11 is a mess.

Did you know that X11 is why we have shared libs (the UNIX version of "dll hell")? If not for having to run X11, shared libs really would not have been needed.

There are many window managers. Maybe too many. Too much choice for a noob. That selection, or the pre-selections Linux distribution people make, does not equate to "the" Linux Desktop. It equates to someone else's configurations and choice of applications. It equates to having to fiddle with X11, whether you are just configuring it or developing programs to run in it. And that has always been extremely frustrating for too many people - constant tweaking; it never ends. This is like a brick wall to people coming from Windows who might want to try Linux. You are inheriting a system that's been configured to someone else's preferences. (The same is true with Apple, but they have a knack for making things easy.)

I skipped Linux altogether and went from using Windows to using BSD. I've also been a Mac user. And BSD is way better than OSX, or any of the previous MacOS's, for doing most everyday things: email, internet and secure web (ramdisk). Moreover it's flexible - you can shape it into what you want - without this being an overwhelming task of undoing someone else's settings.

If you want a citation for the shared libs thing I will track it down, but honestly anyone can do it on their own. The historical research will do you good. Educate yourself.


An interesting observation is that tablets are becoming the new desktop and in that space linux, through android, is becoming a dominant player. In a way, the linux desktop is finally here and it's winning against both Microsoft and Apple put together.

All of the article's criticism of mainstream workstation distributions is accurate, of course. But it's important to note that those represent nowhere near the sum total of the linux user experience these days.


In no way is that "the Linux desktop" "winning" at anything, because (at long last) the navel-gazing of the Linux community has been pushed aside in favor of one group saying "this is how it will be" (and then some OEMs scribbling a little on the walls, but not much).

Android is only "Linux" when it's convenient for Linux advocates, but it's never "the Linux desktop".


> In my opinion, the problem with Linux on the Desktop is rooted in the developer culture that was created around it.

This developer culture DEFINES Linux. A fruit is either an apple or an orange. I couldn't have an OS with wonderful package management, developer tools, endless configurability AND a desktop Miguel de Icaza dreams of.


But I can have an OS with wonderful package management, developer tools, endless configurability AND a desktop Aaron Seigo and friends created.


The desktop wouldn't be configurable. Or not much more than changing the background image.


This flame ignites periodically, and I'm always left wondering when exactly the Linux desktop died? Some have noted similar aspects already, but here's my 2 cents:

I'm on Linux now (GNU/Linux, maybe lump BSD in there too, I'm using "Linux"). I know plenty of users on Linux. I know plenty of users of Windows and OS X who run virtual Linux Desktop distributions for testing/development/security. I'm sure some of HN are running Linux.

Does Linux have the potential to enter the market as a third core option for desktop usage - not really. But why does it matter?

The problem with Linux is that there are too many choices. People who like technical choices and options trend toward Linux (needs citation).

John Q. ComputerUser isn't going to use Linux unless his geeky son or nephew installs it for him AND provides support. He can't get support anywhere else - because there are too many possibilities for it to be fiscally effective.

If/When something gets confusing or broken on Windows/OS X, you call JoeBob's SuperDuperPuter and say it's broken. JoeBob asks, "What Windows version?" While he might need to poke and pry a bit to get the user to tell him he's running Millennium Edition, once he gets that data, it's a pretty straightforward troubleshooting effort and fix.

If you call some mythical Computer Service group that actually supports Linux, and say your machine is broken, they would need to know a LOT more about your system just to figure what they need to do to start.

Distribution? Parent Distribution? Shell? Window Manager? Hardware? ...

I find generic computer service companies to be extremely expensive. To be able to provide even basic service for Linux in general, your techs need to be very familiar with more operating systems (emerge, apt, yum, zypper, pacman), and more core applications. Each service effort inherently takes longer. These factors pile up and everything becomes necessarily more expensive. It's downright impractical to support Linux generically. The support costs for one or two issues on Linux would far outweigh the cost of an upfront OS license and cheaper support for the end user.

Linux has (and will likely continue to have) a comfortable hold on the technically-capable DIY market. It may not be on track to step beyond that market in the desktop arena - but that certainly doesn't indicate it's time for a toe tag.


As someone who previously worked doing Windows support, actual "troubleshooting" on desktops is pretty rare apart from a handful of common cases.

Most of the time all that happens is virus scan -> backup -> reinstall.


Exactly - that's much more succinct than I put it. Comparatively, it's not often that a Linux issue is solved by those same steps (or any given set of steps - there's just more variation).


>And you can still run your old OSX apps on Mountain Lion.

Having been a small-scale Mac developer for many years, that really made me chuckle. Not since OS X 10.2 did Apple release a major upgrade that didn't break my apps and make me struggle to push an update out as quickly as possible to fix all the things that Apple broke. Apple has heard of deprecation, but they don't seem to really grok the concept.

If I had been developing for Linux, I could have simply tested on pre-release versions of the distros I wanted to support and would have been ready when the new versions were released. On OS X I would have had to pay a prohibitive fee for that privilege.

In any case, this article made me happy. You see, for so many years, I used a Mac, and everybody said "Apple is on its last legs; the Mac will be dead in a few years". Apple had to scramble to compete, and that drove them to provide such a good product. But I knew that situation might not last forever, and I was right. After seeing the turn that Apple had taken over the last few years, I switched to an Ubuntu laptop six months ago.

It's refreshing, once again, to be using an OS that people are calling "dead".


Linux is too hard to configure; if the distro gets it right out of the box it's fine, but not otherwise. I started with Windows 3.1 in 1995, used mostly Slackware and some Windows 95 from 1996 to 2000, then Slackware and Windows 98 from 2000 to 2004. But from the time I got on the Internet in 2004 to the present I have mostly used Windows (98, XP, and Vista), because I have never managed to get any version of Linux that I've tried to connect through a dial-up modem. I have to admit I have only tried sporadically, since Windows just works, and my efforts to get some Linux distro to work have been so frustrating. (Note that though a frequent user, I am not a programmer or professional sysadmin.)

ADDED: jrockway's comment, added while I was writing this, hits it just right: "I think the issue is that getting everything working requires a deep understanding of each component and the system as a whole." Which is what makes it so frustrating, even to very intelligent people who have interests other than computers in and of themselves.


Please ignore Icaza. As for the famous death of Linux on the desktop let me tell you something: IT NEVER HAPPENED. What are those people smoking?

I've been using Linux for the last decade and every year it gets better, more polished, more integrated, featuring a better design; I hear more & more people talking about it and using it. Linux is more alive than ever on the desktop!

Depending on your needs, Linux can make an exceptional desktop. Yes, true, it is not for _everyone_, but then again neither are Windows or Mac OS X.


The Linux desktop never did any better -- RELATIVE to the rest of the market -- than Ximian Desktop on Red Hat 6.2. If de Icaza wants to blame someone for the failure of the Linux desktop, I would point the finger back at him. He's the one that dropped out of the effort, became a Microsoft shill, and continues to push (the speciously patented) Mono to this day. Also, I don't know why he's lamenting a strong leader casting a clear vision on the desktop these days. Canonical is doing EXACTLY that.


Nothing "killed the Linux desktop"; it still thrives for those that want it, and it's steadily improving. It never came to dominate the market, and one can argue about the reasons it never displaced Macintosh (still less Windows). It probably has a lot to do with lack of a single, unified vision, and the market fragmentation caused by the different distros, and the lack of market pressure to ship, as it relies on volunteer labor, but I'm not going to presume.

Personally, I've been primarily a Mac user since the Mississippian superperiod, but I used an X-11 Windows(™) environment (on top of FreeBSD) for years at work. I don't miss it, even one iota, but I know plenty of smart people who prefer that sort of thing. De gustibus non disputandum est and all that.


When the iPhone 3GS came out, battery life tanked on my 3G. They fixed that, after some period, only to break the battery reporting in the firmware. Now the device thinks it's dead after a few hours; replace the battery, same lifetime. At this point, I'll never expect an Apple device to last longer than a year and therefore will not buy one.

Additionally, OSX is no linux replacement. Bash is completely different except for cd, rm, and ls.


funny; I am using a linux desktop right now. not dead yet.


I am using it too, because I like it. I didn't like Unity or Gnome Shell 3, though; and as Gnome Classic lost support it became uglier. But now I've found Cinnamon to be a better alternative to anything else. I am very happy with Linux on the desktop right now; I don't think I'd be happier with a Mac or with Windows, not at all.


by choice though, or out of work necessity?


I use Linux as a desktop OS out of choice. Originally, no - because I couldn't afford a MacBook, but I've learned to love it and I don't have any need to switch.

Linux desktop is an OS for people who are willing to invest time to get an utterly fantastic, fascinating experience. It doesn't suit modern instant-gratification culture, no, and yes, it takes experience and expertise to get the most out of it, but once you get there, it's bloody amazing.

I'd recommend having it as a small hobby as well as tool, so you can spend time learning and figuring things out and broadening your knowledge.

Disclaimer: Using Arch Linux with the i3 window manager. No, it's not for everyone, but I absolutely love it.


By choice here and have been using it for about 6 years now on the desktop and I have no plans of migrating anywhere else.


I have a Linux desktop at work and one at home. I actually see the modern Linux desktop as absolutely amazing compared to where it was a decade ago.

I also have a Mountain Lion MBP, but AFAICT, I will be leaving the Apple cart after it dies and I'll be 100% Linux.


Do you really think there are no people that use Linux by choice?


Choice here too - have been running Linux at home for almost 10 years and I never regretted it. There are almost no downsides, except careful choosing what hardware to buy and the lack of Photoshop (though Wine helps with that now).


I use Windows out of necessity at home to play my video games. I use Linux as choice as my IDE at work.


Both.

I need it for work, and I wouldn't have it any other way.


actually both. home and work. my children's friends think I am a hacker since my console looks like a hacker's computer.


Not this post again. Those who thought Linux could compete with heavily subsidized Windows on laptops, or with OS X and Apple's flashy interfaces, are dreamers.

Linux has been for those who like to get their hands dirty, and it is doing that job quite well. Canonical came a bit late to the party and wasn't large enough to matter. RHEL just went after the servers. To make a fair comparison, Linux would have needed a big player backing it strongly on desktops/laptops 10-15 years ago (like Google is doing now with Android). HP and IBM made their half-assed attempts, but they were never really behind it completely.


I love Linux, and as a developer, use it as my main OS (Ubuntu). It is so easy to develop on, and its package management is superb. I don't use the desktop per se that much, and am usually command-line driven.

I have a Mac, and use it for some things, at times. It's nice, for sure, but I love the openness of Linux, even though, of course, there can be many very painful hardware issues (video, sound, etc), all of which I have experienced at one time or another.

I am wondering - I hear Google is working on an "Android desktop". Would that perhaps change things regarding the "Linux desktop" a bit?


An Android desktop might be excellent. However, a lot of the Android stack is not open source or free software.

An Android desktop will be a proprietary desktop built on top of an open source kernel.

It might be terrific, but it won't fulfil the dream of a free software OS and desktop.


Back in the day, before setting up Linux was a breeze, I got tired of mucking around with configuration and such just to get a usable Unixy desktop and environment. So the day OS X Jaguar was released I purchased a Mac.

Now if I need to fire up Linux for a project (usually for a microcontroller or similar hardware that needs C), a virtual machine or appliance that I can launch from Windows 7 does the job. This is also how I keep Windows 8 contained, safely in a virtualized box, so that I don't have to deal with it unless I need to... ;)


The Linux desktop never got killed! It was never really living. As long as Linux is not sold on computers, it will never spread. Maybe things will change in the future; Canonical is doing an insanely great job bringing Linux to the masses. But personally I think Linux will take off in new markets (China, Brazil, etc), not in already established ones.

By the way, in my opinion only a small fraction of people buy Macs because of OS X; it's the hardware. The design and usability of Ubuntu are a lot better than OS X's at the moment.


First of all, the Linux desktop is not dead.

As I wrote on my blog recently:

"In the [past three years], Linux has grown — albeit slowly — in desktop usage. After nearly 2 years of no growth (2008-2010, lingering around 1% of market), in 2011 Linux saw a significant uptick in desktop adoption (+64% from May 2011 to January 2012). However, Linux’s desktop share still about 1/5 of the share of Apple OS X and 1/50 the share of Microsoft Windows. This despite the fact that Linux continues to dominate Microsoft in the server market."

It may be in third place in a desktop market with primarily three OSes, but usage has never been higher.

As I discussed in this article, most of the original reasons that stopped Windows / Mac users from using Linux years ago are no longer valid. However, the irony is that it's easier than ever to get by with a Free Software desktop, but harder than ever to avoid proprietary software and lock-in, thanks to the rise of SaaS and the personal data cloud.

I agree with de Icaza that the "Open Web" is more important these days than a Free Desktop. But the linked Wired article's conception of Open Web refers to things like HTML5, JavaScript and CSS. These aren't the problem. They are an open delivery mechanism, yes, but usually for proprietary software.

Modern SaaS applications accessible through the web browsers using open web standards are the modern equivalent of an open source Perl script wrapping calls to a closed-source, statically-compiled binary.

You can read more about my thoughts on this in "Cloud GNU: where are you?" http://www.pixelmonkey.org/2012/08/18/cloud-gnu


in a q&a round at aalto university in finland linus adressed the question why linux never took off on the desktop: the lack of being a pre-installed os. he mentions that without preinstalled operating systems there's now way to gain a significant market share in the desktop segment.

The whole talk is well worth watching: http://youtu.be/MShbP3OpASA?t=23m45s


That's just bullshit. The Linux desktop maybe isn't broadly accepted or mainstream, but I don't see the problem in that - after all, Linux remains a system for power users, even if some distros want to change that. And there really is no better desktop environment for those people than the Linux desktop. Windows is shit incarnate, so let's not even begin to talk about it. What remains? Mac OS. Sure, it has a more accessible GUI, but not a more efficient one. I can't think of anything more elegant than a tiling WM, be it awesome, wmii or xmonad. Everything based on moving a cursor just feels awkward compared to the simplicity of ~5-10 keyboard shortcuts. And tiling also means that I always have everything in front of me. Fumbling around to find some window is HORROR.

I think the Linux desktop simply has more options for experienced users. I see no way I could be more productive with a GUI designed to cater to lusers.


It's still very much alive if you don't give a shit about normal UX conventions or popularity, and there are hundreds of thousands of excellent 3rd party applications that run perfectly.

It's getting really irritating when someone who's jumped ship to OSX declares it "dead" because they have a shiny iDevice and an expensive laptop.


The same issues that killed UNIX as a viable desktop alternative are killing Linux as one too: fragmentation and lack of consistency across different distributions.

This is compounded by most distributions lacking a centralized vision of how everything fits together. They are merely collections of individual parts, rather than collections of parts designed to work well together, and they lack polish as a result. While the absence of centralized vision was fine for SunOS circa 1992, it simply doesn't cut the mustard in 2012.

Ubuntu seems to be trying to push such a centralized vision with Unity, but I fear they lack the clinical editorial willpower to make the hard decisions required to see it through to its ultimate conclusion.


Comparing Linux to OSX makes no sense to me. Linux has always been missing features compared to the alternatives; the GNU system was actually written to emulate the alternative.

But the GNU/Linux project had a very different objective: fighting for freedom. If freedom is still the driving force, then we should encourage the enthusiasts and get back to work on improving Linux, as has been done for years. By doing so, Linux has already reached excellence in some fields.

If you're just competing on features, you'll be missing some great benefits and enjoyment. And to be honest, in terms of features OSX isn't that good either: Windows is still used by the majority for one reason or another.


Miguel's affection for his iPhone is a bit unconsidered and superficial.

But anyway, a more interesting question could be: what does it take to bring an ex-Linux user and now happy OSX user back to Linux?

I used Windows for 3 years, then Linux for 2 years, during which I did a lot of installations (mostly Ubuntu and Debian) on a lot of different devices. While fighting with drivers, minor display problems, and spoiled Windows users, I lost my faith in Linux as a desktop OS and switched to OSX.

I can only speak for myself, but these few points would bring me back to Linux in no time.

Presenting Distribution "Utopia"

1. No X11-based display stack; it is replaced with something conceptually simpler (like Cocoa).

2. (Multiple) monitor recognition 100% accurate. (Probably connected to Pt. 1)

3. An audio setup that is not much worse than OSX's.

4. Throwing Gnome and everything based on GLib out. It's 2012; there are alternatives to faking OO with C. Qt isn't allowed either.

5. Throwing APT out. No more dependency management for a desktop OS, please. Then kill Perl as a requirement for running an OS.

Ahhhhh, I feel better now :-). This is the opposite of what Miguel demanded; he cares about backward compatibility.

When I think about it, "Utopia" would be similar to Android: no fear of throwing old stuff out.

Android as a foundation for a new desktop Linux?


1. If this is true, and it seems right to me, maybe some of the massive effort put into designing new GUIs for Gnome/KDE/etc should be put into hacking the look and feel of the OS X desktop?

Unsanity ShapeShifter hasn't worked since OS 10.4

and I know about

http://magnifique.en.softonic.com/mac - 10.5 only

http://www.marsthemes.com/crystalclear/ 10.7 support claimed, but it's not very radical. I'd love xfce's window look controls or a Stardock windowblinds.

I know Apple don't want anybody to do this. I know they will deliberately introduce changes that break hacks. But as I said, how can it be more effort than Linux?

-----

2. To try to prevent OSx86 hacks, DSMOS.kext uses crypto to prevent the system running essential UI elements like Finder, SystemUIServer, etc. Can't we build our own versions of those parts? http://en.wikipedia.org/wiki/Apple%E2%80%93Intel_architectur...

-----

3. Is this true?:

Linux desktop - dying, dead

Windows 8 - trying so hard to copy OSX/iPad/Springboard/Launchpad that everybody is gonna hate its TWO UI's! (dying?)

Mac - winning, won (by default?)


I use both KDE and OS X. Some details are different, but the ideas behind them are mostly the same.


This article hits so many sore spots, right on the pustulent scar tissue.

I had run a Linux desktop (a Debian build, mostly with KDE) for a while and kept getting hammered by random stuff breaking for random, and often poorly considered, reasons. I gave up and went back to running a Windows desktop with an X server to pull up windows on my Linux box.

Then I went to work for Google, where they did a really good job of running Ubuntu as an engineering desktop (calling their distro Goobuntu, of course), and I thought, "Wow, this has come quite a ways; perhaps Linux has matured to the point where there is at least one way to run it reliably." And so I installed Ubuntu on my desktop and tried that for a while.

For "using" it, it was for the most part ok if once every few days I did an apt-get update/upgrade cycle. For developing it was a real challenge. Pull in the latest gstreamer? Blam blam blam things fall over dead, update their packages (sometimes pulling git repos and rebuilding from source) to get back working, and now apt-get update/upgrade falls over the next time because you've got a package conflict. It is enough to drive you insane.


Proud Ubuntu user here. Ubuntu 12.04 is not bad at all. Supports the fancy font he used on his blog. Flash is working. WebGL is working. LibreOffice opens Word docs when I need to. Audio is working.

I have Windows 7 on the other partition mainly to play games.

There was a minor issue with Ubuntu trying to melt the CPU in my laptop the other day, but it's not so bad since I upgraded, and I found this powertop thing that also helps.
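For anyone who hasn't seen it: powertop is in the standard repos (this assumes Ubuntu's stock packaging), and the interactive report is the useful part:

  sudo apt-get install powertop
  sudo powertop   # lists the worst power consumers and suggested tunables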


I guess if you don't mind that OSX uses 5% active CPU just for flashing bubbly buttons, that's alright.

I like OSX; I think it does have a good ecosystem for GUI apps. But at pretty much everything else, it fails. It's a performance nightmare, and the filesystem makes me want to punch a kid in the face every time it kills the CPU (yes, sorry, I also don't think you should be doing OpenGL in JavaScript, but hey).

Now, with all of the above said, I do wish there was a better ecosystem for app development. I mean something like Xcode 3, not 4. Yes, we have Qt; yes, we have Glade; but building an app with an interface designer, bindings, and MVC concepts just helps a lot.

You can do most of it with Vala, granted; it's just more poorly documented and not as "round", and there are no standard concepts to follow. And yes, I do like my Linux customizability, but we have things like the CERT best practices for secure C coding. Why can we not get something like that for Linux GUI programming?

ps. gnome3 can go right where it came from


In like 2006 I switched from Windows XP to Linux. This was before Ubuntu was what it is today. I learned using Slackware and eventually switched to Gentoo. It was cool and gave me nerd cred when I went to college.

I switched to OSX for exactly the reasons the author mentioned. The fact that I have an awesome UI + ability to use the shell all day is a huge win for me.


The machine that ran Windows XP can run Linux (http://wiki.debian.org/Hardware). So you can switch to Linux any time.

However, people cannot switch to OSX. The machine that ran Windows XP and Linux WILL NOT run OSX. So, no, you didn't switch to OSX, you purchased new hardware that is strictly controlled (motherboard, video card, etc).

Then, in an incredible blunder, de Icaza said, "Many hackers moved to OSX... working audio, working video drivers, codecs for watching movies..."

Uhhhh, yeah, if Linux gave up on supporting scores of hardware platforms and hundreds or thousands of hardware components, OF COURSE it would have working audio, video drivers, and codecs.


I installed this thing called "EEE Ubuntu" on my little EEE PC a few years ago. That "EEE" in the name meant, or so the website very strongly implied, "specially designed for the EEE PC". I think there were just 2 fundamental models of it at the time, both very similar in terms of underlying hardware.

Needless to say, the sound didn't work. And the wireless didn't work. When I clicked "suspend", it said it was out of swap space. When I closed the lid, it crashed.


I don't get all that OSX "awesome UI" everyone talks about. With my previous experience on Windows/Linux, everything seems wrong and backwards, and there is no way to configure or change it. Also, the shell looks like a crippled version of Unix.


As someone who uses the shell all day, I found that there is no alternative to Portage and Konsole.

And since most things are web-based now, a vanilla KDE setup + Icon-Only Task Manager is awesome enough for me.


For what it's worth, GNU/Linux never really was about some desktop conquest, so this whole discussion "What killed the Linux desktop" is quite absurd.

That aside, what we have here is a thread apparently devoted to shitting on the work of people who built something for fun and gave it away for free.

Good job folks!


I would have to say Apple killed Linux. As many others have noted here, OSX has improved to the point where many Unix admins run OSX, and it runs the tools they need for their work. Also, Mac hardware is better than PC hardware, so you buy a MacBook with OSX and you are happy.


OSX didn't kill the Linux desktop, Office and Photoshop did. Just as it killed the *BSD desktops. Lack of high-end applications that were compatible with what the business world was using doomed anything that didn't have at least a tacit blessing from Adobe and Microsoft.


"As for myself, I had fallen in love with the iPhone, so using a Mac on a day-to-day basis was a must."

What? How? I've got an iPhone and have never felt like having a Mac was a must. Am I missing some major parts of the system that don't work if you don't have a Mac?


Does anyone else feel that Linux and the desktop are at odds with each other? Don't we like small components bound together using pipes? Desktop apps are the reverse: big black boxes that barely communicate with anything (I'll admit I don't know D-Bus).


>"is not a sexy problem."

This pretty much describes the root cause of nearly all the impediments to the adoption of FOSS in general and GNU/Linux in particular by the general public. It touches everything from backwards compatibility to documentation.


It's strange for me to read something like this, since I recently switched back to Linux and have never been happier with it. I've been a sometime user since around 1997, but it could never survive long as my primary OS. I bought my Frankenputer parts from Newegg without checking hardware compatibility for any of it, and it all worked great. I had only one problem - wake from USB keyboard - and I googled it down pretty quickly (a new issue with Ubuntu Precise, it seems).

I like OS X too and had a PowerBook for years, but all other things being equal I'd prefer to develop and deploy on the same OS, and Linux is just fine for development so far.


Way too true. I ended up going back to Windows because the audio would frequently (3-4 times an hour) stop working on my laptop until I restarted pulseaudio. And that's on Ubuntu-certified hardware...
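For the record, the restart itself is just this - assuming the per-user pulseaudio daemon that Ubuntu runs by default:

  pulseaudio -k        # kill the running daemon for this user
  pulseaudio --start   # spawn a fresh one

That it has to be in anyone's muscle memory at all is the real problem, of course.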

Not to mention the problems we had with our streaming servers and ffmpeg. It turns out there was a big flame war over libav vs. ffmpeg, and someone from the libav camp managed to get the ffmpeg package marked as deprecated (it's not) and redirected to the libav package in Ubuntu's apt repo. So we're stuck either compiling from source or running our own repo. Seriously? (FWIW, the rationale is that libav pushes new versions more frequently.)


Ingo Molnar has some interesting thoughts on this subject: https://plus.google.com/109922199462633401279/posts/HgdeFDfR...


I more or less disagree. My main frustration with Linux For Personal Use is that I can't buy a piece of hardware that I know won't regress with new versions of a distribution for three plus years or get any service if it does. My reference for the importance of this is a perfectly usable 2008 refurbished Macbook. I upgraded the RAM once recently for a bit more snap, but otherwise have no complaints over the three or so Macintosh releases since then.

Could the UIs and the third-party application situation be better? Of course. But consider all the garden-variety crash bugs, power management bugs, lockup bugs, video driver misbehavior, and hit-and-miss peripheral support, plus the general analysis paralysis about what hardware to buy - and even then the future is less certain with regard to regressions.

Even given Windows's monopoly power in the commodity desktop and laptop markets, its reputation for dealing with sleep and drivers is only so-so compared to Apple hardware and software. If Windows's monopoly power - which buys you the full attention of hardware manufacturers and their driver divisions - only gives you mediocre results, what chance does a bunch of kernel hackers who receive almost no ongoing consideration from hardware vendors have? To me, it looks like absolutely no chance of becoming stable over time. I have completely given up on Linux laptops for this reason: by using Linux only on desktops I have cut out a lot of the problems, but not all of them. It's a kind of mediocre that I can bear.

I want someone to sell me a Linux distribution on a laptop that simply will not break in its kernel-oriented features over five years of upgrades. I want that distribution to stop-ship if a new version introduces a power management bug on an old laptop, and to do whatever it takes to work around some lousy hardware bug or whatever. I want them to do whatever it takes to work with Skype (such as statically linking whatever libraries, etc.) and to test Google Hangouts to make sure the webcam and microphone work. And if they don't work, they absolutely cannot ship. Until that day, I use Linux - and I do mean the kernel in most of these cases - as my personal operating system most of the time in spite of these problems, because of my professional and philosophical needs, and not out of preference in any other dimension.


Is there anything like a hardware regression suite for testing new kernel features? Or what about distribution features?

I don't even know what that would look like. Does anyone else?

Maybe a registered farm of devices that test distribution release candidates.


The first comment explodes this piece:

"I mean, look at OS X itself. Sure it's doing fine, but powered by iPhone and iPad, not by people wanting a new desktop. And it still has minority marketshare despite being from one of the most profitable companies on earth and despite Microsoft's repeated weird Windows-rethinks."

Basically, path-dependent lock-in means we're lucky not to be using x86-based wPhones that don't even have web browsers. The Linux and open web communities have achieved amazing things, enabling Apple's comeback along the way.


Could it be that the main issue is the lack of leadership? We don't have many Linux kernels, yet we have dozens of incompatible desktop configurations, and the list keeps growing. I think if there were a clear winner in the desktop wars, desktop apps would be of much higher quality.

And also the horrible aping of other environments and the stupid UI eye candy. Given that the majority of Linux users and developers are technical, that's surprising.


You really have to wonder if the advent of Windows 8 and disgruntlement with it from Valve, Blizzard, etc. might have repercussions on this whole situation.


Linux isn't dead on the desktop, because it never was a product in the first place.

The first attempts were Mandrake and Conectiva. Canonical has been doing a good job lately; the problem is that the platform is now beyond hope on the desktop - it simply doesn't gain traction with third-party developers, the most important thing for a desktop OS. You're pretty much limited to the FOSS utilities that exist in the repositories.


Seriously... Is this discussion still relevant?

Anyway, my bet on what "killed" the Linux desktop would be the Windows OEM licensing terms. Nothing really killed it because it was always a very specialized product.

Do we always have to see a problem when someone doesn't make the same choices we do?


Ask Mozilla how they manage to distribute their tarballs which work on all major distros.


"Miguel de Icaza — once a central figure in the development of the Linux desktop environment GNOME — says the open web is now a greater concern than free software."

I was kind of hoping those two things would each help drive the other forward.


OSX has a nice touchpad, Windows has awesome game libs, and Linux comes with shitloads of developer goodness. But yeah, now OSX has Homebrew, so it's almost like a better Linux - it just still forces you to buy overpriced hardware.


In a comment Miguel says,

Because the developers have moved on to greener pastures.

Of course, it all boils down to green at the end of the day.


The Linux desktop will rise again as the Android PC.


No Android, please! The Linux desktop is shifting from X.org to Wayland, and Wayland is making inroads into mobile as well. Android is really a complete side track from conventional Linux.


I'm in the same boat as the author, I really have few complaints after moving full time to OS X from Linux.


The Linux desktop has not been killed, and it will be more prosperous thanks to Windows 8 and the Secure Boot shit.


The concept of Linux on the desktop is as aberrant as the concept of Windows on the server.


I hope we won't see a "What killed the Android phone" post sometime in the future :(


I dislike titles such as these that beg the question.


Since when was it dead?


Lack of killer apps?


I think he's right, but I think he's missing a key point.

Design. Design is what killed the Linux desktop. It never had it. OS X has it. Even Windows, crappy as it may be, has it.

Before I go on, let me say that design is NOT "making it look pretty". In fact, thinking that this is what design is leads many Linux advocates to reject the need for design.

Apple's work looks pretty-- because it is designed to function well.

Design is about usability and understanding the user, and making an interface for the user that works well according to the user's understanding, perspective, and needs.

Design is an engineering discipline.

Seriously.

The Linux community hasn't had that, and I've seen many of them reject it. In fact, you can see it in the rejection of Apple's patents: the reason they think Apple's patents are not original is that they reject the idea that any engineering went into them. But that's just one example; you see it all the time in lots of contexts. Look at the UIs of Linux... they didn't design one, they just copied Windows.

Literal copying is about as far from design as you can get.

Sure, over the years, designers have taken cracks at bringing design to linux, including the work of Ubuntu, but it is rejected by the community.

Rejection of design is a cultural trait of the Linux community. They reject it as a discipline and don't even see that it exists. (Broadly speaking, of course.)

But as users, they have been influenced by it and many of them have switched to OS X because it is the best designed operating system.

And then they write long blog posts about how it's wrong that OS X does things a certain way... based on their lack of design perspective that would let them see why things should work that way.

It's ironic.

But it's fine - if you want to run a Linux desktop and don't value or care about design, more power to you. I won't ever fault someone for making that decision. We should all use the systems we prefer.

But the culture that doesn't value design, and can't even see it as an engineering discipline, is going to have a great deal of trouble making something usable by the mainstream.


I'm sorry... I strongly disagree. You have me on video support - OS X does a much better job there. You even have a point on audio drivers; they can be a pain at times. But design? Gnome 3 and Unity are, to me, FAR superior to OS X in terms of UI experience. YMMV, of course. I bought a Macbook, gave it a fair shot, and it ended up a dust collector simply because the desktop experience was painful compared to Ubuntu or Fedora (well, that and the fact that I got maybe three hours of the supposed 7-hour battery, and the fact that I can't upgrade it past 4 GB of RAM o.O). Perhaps I'm set in my ways, but the absolutely insane window-switching behavior, the piss-poor excuse for a terminal, the fact that apps continue to run in the background after you close the last window, and the generally SLOW performance made my $2500 investment one I'll regret for a very long time. Switching to a Lenovo running Ubuntu was a far better (and MUCH cheaper) experience.


It sounds like you expected a clone of Windows and were predictably disappointed.

> I got maybe three hours of the supposed 7-hour battery

You should have taken it back. Seriously.

> Perhaps I'm set in my ways

You hit the nail on the head here. There's nothing wrong with that but most of your complaints amount to "this is not what I am used to".

> the absolutely insane window-switching behavior

On Windows and Linux I cannot switch to an app bringing all windows to the front, I have to do it one at a time. Anyone used to one behaviour could claim the other is insane just because it is different.

> the piss-poor excuse for a terminal

What on earth are you talking about? The Terminal app? Get iTerm2 or something else then. If you don't like gnome-terminal you don't whine about it, you just install urxvt or whatever.

> the fact that apps continue to run in the background after you close the last window

The fact that you have to have visible windows for an app to run on Windows and Linux could be perceived the same way. What if I wanted to switch back and use that app again?


On (...) Linux I cannot switch to an app bringing all windows to the front, I have to do it one at a time.

Sure you can. Just launch (or move) each application to its own virtual desktop. You can even group them logically, like an email reader with an IM client!

What if I wanted to switch back and use that app again?

Windows and Linux applications are not disposable; they can be launched more than once.


> Windows and Linux applications are not disposable; they can be launched more than once.

But when the last window is closed the application quits. What if I want to keep it running because I will switch back and open something in a minute? On Windows & Linux I have to launch the app again.

I'm not arguing for one or the other here, just illustrating that this argument is based on familiarity not one actually being better than the other. Each has pluses and minuses.


Much like the Mac OS menu bar; drove me nuts stuck at the top of the screen like that, but a lot of people like it and I can understand that...


That's my point. It's not that a lot of people like it. In fact, whether people like it or not isn't really relevant.

It's that this is the correct way to do it.

When the menu bar is at the top of the screen, the user is much faster at getting the mouse over a menu item and clicking to open the menu.

It works better. This isn't opinion, it's measurable scientifically.

When the menu is on a window, you overshoot with your mouse and come down and have to hunt to get on the menu to open it. This slows you down.

But the thing is - people get used to being slower and less efficient, and then, when on a better solution, they think the better solution is worse because it goes against all the compensations they've built up over the years to get around the other, poorer design.

Now, I'm not saying you can't prefer a Windows-style UI and menu bar, but I am saying it is most likely because this is what you've been taught to use, and thus it feels more natural to you, even though it is less efficient for you.
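The measurement usually cited here is Fitts's law; in its common Shannon formulation (a and b are empirically fitted constants, D the distance to the target, W the target's width along the direction of motion):

  T = a + b \log_2(D/W + 1)

A menu bar pinned to the screen edge lets you overshoot freely, so its effective W is huge and the predicted time T collapses toward the constant a even for large D; a menu attached to a window has a small W, so you pay the full logarithmic cost.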


> It's that this is the correct way to do it.

Except when you have multiple monitors. How is it the correct way to do it to force me to mouse 4000 pixels to the left or right to get to the menu for an application that isn't on the same screen as the menu?

There's never just one correct way to do things. Your arrogance towards other implementations is overwhelming.


I'm sorry that you consider citing the science behind the decision "arrogance", but I suspect it's just desperation to engage in ad hominem.


The arrogance comes in the way you deliver your argument and insist that your preference is the only valid choice, regardless of any difference in conditions; it has nothing to do with the science.

I took the same Human Factors class that cites the same decades-old studies that you're pushing as ultimate truth; all of that science was done when multi-monitor setups weren't even supported, let alone common. I have no problem agreeing that a universal menu is optimal for certain sets of conditions, but that doesn't mean it's the one true way for everything.


It is absolutely opinion.

For a start, it precludes a sane implementation of focus-follows-mouse.

Secondly, for frequently used commands, keyboard shortcuts will be faster anyway (I'm aware of the study you're probably going to mention, but plenty of other studies have had different conclusions).

Lastly, and this is pure conjecture on my part, wouldn't having the menu bar at the top of the screen result in a much narrower arc from the current mouse position to the desired entry? The theory also seems to ignore the fact that in most cases you'd want to move the mouse back to the original window, or hit some specific (and often quite small) target in that window. Which is now made more difficult by the mouse being farther away. Just some idle wonderings, they were quite possibly addressed in that study.

I know that you started your post with "Whether people like it or not isn't really relevant. It's that this is the correct way to do it.", and thus are unlikely to be swayed from your opinions; it's worth considering though that every choice has pros and cons, and sometimes the total value of each side very much depends on the person.

(although I share your "Screw the people, this is /correct/!" view when it comes to tau: no-one's perfect)


much narrower arc from the current mouse position to the desired entry

Particularly since the top left option (in the easy-to-hit corner) isn't usually one that you want to access much in the use of a program. Sure, you can get your mouse in the vicinity, but you still have to hunt for the menu you actually want. If your screen(s) is large enough that this requires moving your head (or even your eyes significantly), it rapidly eats up any trivial advantage you gain from 'knowing where the menu bar is'.


Science says that, given sufficient mouse speed, it's faster to hit a distant, effectively infinite target than a small but local one.

given sufficient speed.

However it says nothing about whether an application's menu bar is important enough to warrant that extra acquisition speed.

Also the mac menu bar does not present an infinite target, particularly if you have any significant horizontal velocity, which is extremely common with wide screen monitors or multiple monitors.

Also, with the advent of large/multiple monitors, there is the need to change where you're looking, causing extra delay for refocus.


It is not the correct way to do it. It has some very simple problems.

First, if I have a bunch of little windows on a big screen - something very common on Apple computers - I have to drag my mouse all the way to the top of the screen and then drag it back. For me, this was a serious problem, partly because I was also using one of the horrible Apple mice (the "Mighty Mouse" or something of that nature). It was particularly annoying because I would often lose focus by accidentally clicking on something else and have to go all the way back to my original window and then all the way back to the top... Incredibly annoying.

Second, not all programs have--or need--options on top. I essentially live out of Emacs and Chrome these days; neither uses a menu. That would make the Apple bar just a waste of screen space--completely absurd. Moreover, I've found that bar to be superfluous on other software I use as well. Having more minimalistic interfaces is a breath of fresh air--and completely impossible on OS X.

It also fails to generalize well to multiple screens.

Oh, and there are going to be problems if you want to make the available options context-dependent. This is a reasonable design decision if you have a complex program with a bunch of small, discrete windows for different tasks--something else that isn't uncommon on OS X. This sort of behavior is far clearer if there is a menu per window than if there is a single menu for whatever is currently active.


Measured scientifically just means it's better for the majority of users; it doesn't necessarily mean it's better for me as a single individual.

Not that I'm arguing against the bar on top; in fact, I'll use the keyboard shortcuts anyway, so I don't particularly care.


I submit that we are talking about a physiological issue, not a preference.

Attempting to hit a target in the middle of the screen is more difficult for humans than hitting a target at the very top of the screen.

When you overshoot the target in the middle of the screen, you have to come back around to click on it.

When you overshoot the target at the top of the screen, the pointer is stopped by the border of the display. Stopped right on the target, and you can just click.

It may not feel faster to you-- that's perception.

My point is that it is faster. And I haven't seen an argument for why the two things would be reversed for any specific person.

It's the nature of the latency between our eyes, mind and muscles.


There's more to accessing commands in a UI than physically manipulating an input device. I think you've got tunnel vision here.


So you're saying that humans have laser-like accuracy in the horizontal dimension, and sloppy accuracy in the vertical? Your interpretation of the science is poor.


I don't know how you get that interpretation from the parent post. Only having to worry about one degree of freedom in your movement is much easier / faster than having to manage two.


In the example, the user doesn't have to make a horizontal adjustment at all - the cursor conveniently stops right on target.


The OS X menu bar is modal. It depends on what application is current. There's also research indicating that the usability of modal UIs is poor; it's also measurable scientifically.

I personally don't mind the OS X menu bar's location, but its modality is a big pain point. Combined with the fact that apps often don't exit when all their windows are closed, it's a bit of a magic box of functionality where I'm never certain of what's going to be there, and how to get it back if it disappears. At least Windows' menu bars have a sense of location.

Don't mode me out, bro!


Some people don't use menus often. From my experience, I use menus 2-5 times a day at most. However, if you rely heavily on clicking menus, learning keyboard shortcuts can be a really good time-saver.

And the second argument against this "smart" (or maybe legacy?) menu placement: In some applications menus have sub-menus, and sub-sub-menus, etc. And it is impossible to put all sub-sub-...-menu items on the top menu bar, so you have to click many times in different parts of the screen anyway. (Oh, yes, this is not an issue for a program with simple menu, but then the keyboard shortcuts can be the way to go).

So, there are many shades of "correct". Alternatives must at least be considered; I think it's better to let the user decide what they prefer.


> On Windows and Linux I cannot switch to an app bringing all windows to the front, I have to do it one at a time.

I'm pretty sure Linux w/ Gnome 3 does it the Mac OS X way by default.


"Perhaps I'm set in my ways" -- I think this pretty much says it all.


What's wrong with the terminal?

Hit Cmd-Q to close an app.

No idea what you mean by insane window-switching behavior.

You gave no examples of how Ubuntu/Fedora is better, to rebut on UX. Care to share some?


What's wrong with Terminal.app? Except for the fact that it does not really work?

Yeah, for simple stdout/stderr output it is fine, but for medium-sized ncurses apps it is a major PITA. Even de Icaza's own Midnight Commander does not work there.


No, seriously, what’s wrong with mc in Terminal.app? I mean, OS X and Terminal.app take over function keys for other purposes like, I don't know, changing the volume, but these things can be disabled if you want to use function keys for something else, like mc commands.


No idea what you are getting at.


Neither do I. Ncurses and mc work just fine in OS X Terminal.app. Even under 10.8.1


> Design is about usability and understanding the user

Here's the point you're missing: the overwhelming majority of Linux users don't give a flying hoot about the average user or mass adoption. They just want a system that gets out of their way and lets them do their technical tasks. Linux succeeds and excels at this, because it is designed precisely for that niche and curated by its end users.

There are, however, people who do care about mass adoption, e.g. Canonical and "Linux advocates". Generally they're either emotionally invested in Linux for some reason or they get a paycheck for their work. These people are a minority of Linux users.

Saying the culture doesn't care about design is absurd; it's the non-technical user we don't care about. Let's not conflate the two. Looking at a piece of software like xmonad or tmux, it's hard not to appreciate the elegance in its design. It's just a different form of design than, say, iTunes or Photoshop.


> They just want a system that gets out of their way and lets them do their technical tasks.

That's precisely what Linux fails at. Most minor UI/desktop-related tasks that are hassle-free with Win7 or OSX take far too much time. Time that most people don't have and don't want to invest.

Example: double-clicking a .ttf file in the Xfce file manager (plain Debian Xfce) does nothing. Right-clicking offers various unrelated applications. Copying the file to an appropriate font directory does not "install" the font so that it is visible to applications. No, I have to shut down all terminal windows and restart them to be able to use the font. Do you see an advantage in this? I don't want to spend time doing (or even researching) this when I could be developing instead (with the best console font known to mankind, please!). Desktop Linux just lacks basic functionality and wastes people's time.

Sure, I was fine developing with fvwm2 or twm 15 years ago and did not mind using xfontsel and "xset fp rehash" back then, but today it feels like an unproductive waste of time when Win7 and OSX (esp. OSX) do not "get in my way" in any conceivable meaning of the phrase when I just want to develop stuff in an environment that I can adjust to my liking.

If you think Linux on the desktop hasn't failed for, and been abandoned by, most technical users, you're delusional.


And for every anecdotal UI problem you bring up I could also come up with a corresponding anecdotal pain point with Cygwin on Windows or BSD/GPL header file craziness on OS X.

Personally, manually rebuilding a font hash doesn't irk me as much as not being able to rip the guts out of my system and fix things when doing low-level system development. Messing with virtualization and vendor-enforced settings seems much worse than googling for a shell command or manually fixing a package.

> If you think Linux on the desktop hasn't failed for and abandoned by most technical users, you're delusional.

Yeah, I'm not delusional. I have a niche, Linux fills that niche. I have a lot of other colleagues who also fit in that niche. It's fine if you don't want to use Linux, I don't care in the slightest. But making broad sweeping statements about what "most technical users" do or don't need just makes you come off as uninformed.


Oh, for... Just drop the .ttf in your .fonts directory and be done with it. That you insist on mucking around in a GUI to do something as simple as placing a file in the spot where it belongs says everything you need to know about whether you belong in the "technical user who wants a system that gets out of the way" class or not.


There is no .fonts directory and this is a desktop machine - I want to install the font system-wide. Plus, the problem with half the running applications not seeing the font still exists.

So I'm not a technical user to you because I insist that such things should require no more attention than a double click on the font file? Technical users aren't by definition masochists who take the longest route just because they can. I need to get stuff done - stuff other than installing fonts and learning the peculiarities of a particular desktop Linux flavour!


Are you for real? In what universe is "mkdir ~/.fonts; mv *.ttf ~/.fonts" (or "sudo cp *.ttf /usr/share/fonts" if you want them system-wide) the "longest route"?

So fine, you didn't know this trick. No shame there. But you're willing to click around trying to discover the feature in your GUI (and expect the desktop to hold your hand trying to do it!) yet won't take 30 seconds to google for "install linux fonts". (I just did, by the way: the first two links tell you exactly what I just did).

So sure, the Linux desktop isn't for you. You want more polish and attention than it's willing to provide. Just don't pretend that your inability to learn a few facts about the implementation of your desktop, and/or to develop an intuition about how things might be done, represents "hassle" that takes time away from your important work. Those skills are good to have, and those of us who have them are, quite frankly, better at our jobs than those who don't.
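For completeness, the whole per-user sequence on a fontconfig-based desktop (the normal case on any current distro; SomeFont.ttf is just a placeholder name):

  mkdir -p ~/.fonts
  cp SomeFont.ttf ~/.fonts/
  fc-cache -f ~/.fonts    # rebuild the fontconfig cache

fc-cache means fontconfig-aware applications pick the font up on their next font scan; already-running terminals may still need a restart, as noted upthread.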


I get tired of repeating myself.

I found out what to do to install the fonts and it cost me precious time unnecessarily (~20 mins total until all issues were resolved - and by the way, Consolas with antialiasing still looks crap because Debian apparently ignores subpixel hinting [guessing]). That is the point.

Your proposed solution does not fix the (totally unnecessary) issue of running applications not seeing the new fonts. This is just broken.

My skills, memory, and learning effort are better spent developing stuff, not finding out what needs to be done in the current Linux FOTM's half-assed GUI to install fonts (KDE installs them on double-click, e.g.). There is no "inability to learn" on my part involved, just an unwillingness to spend time on things that can and should be simple and straightforward, and where the cumbersome Linux solutions are certainly no precious skill to have.

The rudeness and arrogance of you evangelists just drives away more users, by the way.


I'm a technical user who spends a lot of time using Linux, and it gets out of my way for doing development quite well.

Of course, that's because I exclusively connect to Linux machines through an ssh terminal on my Win7 workstation. I've personally abandoned all hope for using Linux as a desktop OS. I live in the terminal on Linux because that's where the design actually lives. Anything GUI related is an afterthought.


Double-clicking a .ttf and then clicking the Install button has been in Ubuntu's/Debian's default desktop for years now.

I mean, you installed a stripped-down desktop for some reason; of course it's going to be missing some features when you could have just used the default installation, which has all of that already.


I actually ditched Linux as a primary OS because it didn't get out of my way; I had to spend way too much time digging into the system to get the results I needed. In a few cases that can be attributed to poor design - the thought process often seems to be "I get this piece of design fine, so everyone else will".


Agreed. I do a lot of scientific computing, and many of my tools simply don't exist in OSX compatible builds. While the majority of the work that I do is remote, on servers running Linux, tools like Xmonad and aptitude are basically indispensable for me.


Even Windows, crappy as it may be, has it. [...] Apple's work looks pretty-- because it is designed to function well.

Actually, I'd say the same of Windows, if not more so. I far prefer the Windows 7 interface to OSX; it's just stuff like the terminal/command prompt that really lets Windows down.


I absolutely agree. I actually prefer the graphical shell of Windows 7 to the one of OSX Lion. But Windows does not have a Unix terminal.

And I think that is the main point here: Linux might lack design graphically, but the Linux command line is about the most beautiful thing out there. There is an abundance of elegance there that is probably only rivaled by some Lisp machine.

Who knows, maybe, I will switch from OSX to Linux because of that at some point. It is certainly tempting.


> I actually prefer the graphical shell of Windows 7 to the one of OSX Lion.

Explorer? Versus Finder? That's pretty low-hanging fruit; I can't remember ever preferring Finder to Explorer, and I've been using OSX as my main dev machine (and Windows mostly for the game box) for 6 years now. I prefer OSX overall, but Finder still isn't a good shell.


I was going to leave a snarky comment with just a list of awful OSX applications here. But then I looked at a list of the applications I have installed and... Well, while many of them might be bad, their usual alternatives on Windows aren't exactly better.

But Finder really is bad. Then again, I wish ANY other OS would implement the column view Finder has. As bad as Finder is, column view is awesome.

But many of the great things on the Mac have been subverted in the last few years.

Have you tried dragging a proxy icon from the title bar of an OSX application to an external disk lately? It creates a new alias there. An ALIAS. On an external disk.

Or HFS+, which still takes a global lock whenever ANY application wants to write to the file system. A GLOBAL lock. No other program may access the hard drive while one is writing. Insanity!

Or the complete lack of uninstallers. I am continuously baffled by that. How can you have a "modern" operating system without uninstallers?

Desktop apps are really fine though. It's just the foundation that is getting rusty. Not a good thing, that.


Or the complete lack of uninstallers. I am continuously baffled by that. How can you have a "modern" operating system without uninstallers?

Well, the dream is that you don't need one. Drag the app from Applications to the Trash and... ta-da! Of course, that never happens with anything more complex than the most superficial of apps.


> Of course, that never happens with anything more complex than the most superficial of apps.

Well all of my applications are superficial apparently, because apart from Little Snitch I can't think of one I'll need an uninstaller for. And LS is a special case.


I'd say at least 25% of the OS X apps I try have an installer, and therefore require an uninstaller that often doesn't exist. Usually someone on a forum somewhere lists the sequence of terminal commands required to remove an app and its background services. If I'm lucky.

Happened to me several times in the last two weeks alone. I vaguely remember it took me ages to get VMWare or similar off my machine.


Then again, I wish ANY other OS would implement the column view Finder has. As bad as Finder is, column view is awesome.

Hmm, Wikipedia lists a bunch of file managers with column view support[1]. Here's it in Konqueror: http://www.konqueror.org/pics/konqueror-file-manager-col-vie...

[1]: https://en.wikipedia.org/wiki/Comparison_of_file_managers#Ma...


>Or HFS+, which still has a global lock whenever ANY application wants to write to the file system. A GLOBAL lock. No other program may access the hard drive while one is writing. Insanity!.

Could that be the reason I had to disable Spotlight's indexer to prevent bouts of extreme unresponsiveness on my 2011 Mac mini?


>I wish ANY other OS would implement the column view Finder has. As bad as Finder is, column view is awesome.

Sometimes referred to as "Miller columns" BTW.


There are nice terminals for Windows, and you can install GnuWin32 & win-bash to get an almost complete UNIX command-line experience without the Cygwin burden.


I heard Windows PowerShell was decent. No first-hand experience, though.


The thing about the Unix shell is that not only is it extremely powerful and flexible, it is designed to work with pure ASCII, so it can be used on many remote, bare-bones machines - and it really is running on many remote, bare-bones machines. Given all this, there are many people with strong skills in that shell and a strong attachment to it (for rational and irrational reasons, IMHO). With all this, it would be hard to replace, even with something better. That actually makes me sad, since I dislike using it: for all its power, much of it is annoying and bass-ackwards, IMHO again.


It's unbelievably verbose and the documentation sucks hard. Also, working on remote machines is unreasonably annoying. It is decent at manipulating structured data though.


But it is not Unix.


If Unix is what you want, you shouldn't be using a system that isn't Unix (or Unix-like in the case of Linux).

However, there's still Cygwin which is reasonably good.


Except that it is dog slow, horribly out of date, really compatible with neither Windows nor Unix paths, and lacking support for graphical stuff.

Well, I use msys and eshell instead, which I find less jarring.


>I far prefer the Windows 7 interface to OSX, it's just stuff like the terminal/command prompt that really lets Windows down.

I'm not sure. Windows has some overall design, but also has tons of design neglect.

One example: due to static compiling and linking, you often get programs (and lots of them) that still use Windows 98-style Open/Save File dialogs, instead of every program automatically getting the latest OS-standard widget through late binding.

Or things like the Metro/Windows dichotomy in the latest version.

Windows always feels designed by committee to me - a committee where some manager can have idiotic ideas and no developer/designer can shoot them down.


That example isn't due to static linking, it's because they chose a particular style of dialog which is deprecated and cannot be updated without breaking backwards compatibility.

Software compiled in 1998 has no problem using the newest common dialogs if it used the standard API without installing its own hooks and customisation.


>That example isn't due to static linking, it's because they chose a particular style of dialog which is deprecated and cannot be updated without breaking backwards compatibility.

I'm not sure what this means though:

"it's because they chose a particular style of dialog which is deprecated and cannot be updated without breaking backwards compatibility."

So you mean something like the Windows API featured 2-3 different dialog styles and they set some option / flag in their code to use one that was subsequently not updated by MS?

Could be. But at least the dialog style could have changed; it seems in Windows API all drawing is hardcoded, so the dialog also comes with the old style buttons etc.

MS should have continued to offer the dialog styles they once had as options, instead of deprecating them. But even failing that, they should at least have made it so that the graphical elements (sub-widgets like buttons, text fields, etc.) are kept updated. That doesn't involve changing the API or breaking binary compatibility with the calling program.

Also, seeing those dialogs in old shareware/commercial programs run on modern Windows is one thing. But one also sees them in some very-rarely-updated MS applications that ship with Windows. At least you did, until Vista (I haven't used Windows since).

Another infuriating thing about Windows dialogs of old - which still appear in apps: the text entries, icon fields, etc. are too small, but the dialog is not resizable.


"it seems in Windows API all drawing is hardcoded, so the dialog also comes with the old style buttons etc. "

IIRC, the particular style allows applications to paint on top of the system dialogs, add controls, etc. The API call cannot know when or where drawing will take place (or whether it will happen at all), so it cannot use any other screen real estate than what it always used. That means controls must stay the same size and exact location. That rules out shadows, font changes, etc, so the control must forever stay as it was in the dark ages.

On the positive side: I understand that the Windows 3.1 "Add Font" dialog is gone in Windows 7. Unfortunately, not everybody seems to be happy about it: http://www.sevenforums.com/backup-restore/42035-restore-inst...


Design is important, really. However I find it disturbing that such a self-righteous rant has crawled up the rankings in this thread...

Design is an engineering discipline.

Design isn't an engineering discipline. It is certainly a discipline, and it deserves respect, but the reason software engineers fail at design so often is that it is a different discipline with a different mindset.

Apple's work looks pretty-- because it is designed to function well.

Here I simply disagree. Apple's products look pretty because they are designed well. I don't find that they function well in terms of usability - that I have to use the mouse for nearly everything is a usability nightmare. The way they've hidden the file system is also a usability nightmare. But with enough prettiness and responsiveness, Apple can still feel enjoyable to use without being more useful than - or even as useful as - Linux or Windows.

In fact, you can see it in the rejection of Apple's patents: the reason they think Apple's patents are not original is that they reject the idea that any engineering went into them. But that's just one example.

Wow! Holy leaps of logic - here we've got the disingenuous-conflation part of the rant. For the sake of argument, the particular way iOS or OSX is designed might indeed be a work of genius, but the patents Apple is hurling at Samsung are not for that unique approach; rather, they are a disgusting grab at the rights to anything vaguely like iOS or OSX, which would include really badly designed things.

Jeesh...


So I enjoyed using Ubuntu w. Gnome2 more than any other OS/window manager that I've used, which includes Win 98-7, and Snow Leopard->ML.

I thought it was designed well. So what does that make me then?

"if you want to run a linux desktop and don't value or care about design"

But I do care about design, it just sounds like you and I like different kinds of design.


You're conflating design with the marketing and support that convinces people to open themselves to learning a new system design.

Linux copied the Windows interaction paradigm (and everything else too - fvwm95/afterstep/olvwm/cde/etc) because the goal wasn't to create a new method of interaction and suffer from two adoption battles, but to appease people complaining about how Linux wasn't intuitive (after they'd already gone through one UI learning curve). Ubuntu will never ship with a default tiling window manager (certainly not the "best" design, but it's a novel design).

> based on their lack of design perspective that would let them see why things should work that way.

If a user desires for their operating environment to function in some way, they should be able to develop that customization, full stop. I use 24 virtual desktops with windows generally maximized, switching between them with [Alt+]F1-F12. Is this good design, especially for the average user? Hell no. But it fits my desires and is quick enough that I can't understand why people desire multiple monitors. The idea of someone saying my setup is 'wrong' from a design perspective is laughable, and only serves to illustrate the bounds of the field.

> The reason they think Apple's patents are not original is that they reject the idea that any engineering went into them.

Sigh, back to this again? Nope, what's being rejected is the idea that engineering an implementation of a concept should get you the rights to the entire concept through a "concept plus computer" patent. Where's the novel signal processing algorithms to clean up capacitive touchscreen input? Instead, we get descriptions of straightforward processes in terms of vague programming idioms.


>You're conflating design with the marketing and support

No, I'm talking about knowing that users are faster and more efficient when the menu bar is at the top of the screen rather than attached to windows. Design makes people more efficient. Undesigned or poorly designed things don't account for these factors, or do so half-assed.

> The idea of someone saying my setup is 'wrong' from a design perspective is laughable,

This is because you reject design. You claim it is marketing. You reject it as an engineering discipline. That's my point.

>Where's the novel signal processing algorithms to clean up capacitive touchscreen input?

Patented - and you guys claim that the patent is disproven by the existence of "touch screens" in the past that worked by pushing one layer in far enough that two wires touched, closing a circuit.

> should get you the rights to the entire concept

The only people claiming that the rights to an entire concept are involved are those who are trying to claim that these are bogus patents... which is either profoundly ignorant or completely dishonest.


Firstly, just no, stop arguing against a straw man. I'll agree that putting window toolbars at the top of the screen seems more efficient. I honestly don't have much experience with OSX, but given what people say, I'm willing to assume that it has examined these details and come up with a nicer all-around experience.

However, the nicest experience in the world isn't sufficient for success. Something has to make people get over the activation energy and earnestly investigate something other than what they know, without complaining about what's different and lacking - and that's generally down to social factors.

> This is because you reject design

No, it's because I acknowledge that design happens within a context - most design is not universal. What is an appropriate UI for a CAD program is not appropriate for an airport kiosk, and vice-versa. When a user wishes for customization, they are designing within their own context. Their judgment is small in scope and may end up hurting other aspects of their experience, but it's ridiculous to say that the opinions of a blessed Designer aiming for the 'common user' are more valid than their own opinions of what they desire.

I skimmed the tap-to-zoom patent, and all of the claims seemed quite generic. I'll have to examine it in more detail, but I have to ask: if the patent does contain a novel technique for using a capacitive touchscreen as input to a UI, why does it include the specificity of zooming, a quite necessary and obvious activity for most UIs?


Linux never got usable enough for design to matter.

The process of getting Linux to boot, and then run usably -- with wireless networking and sound -- was too hard. If you got as far as actually being able to get work done, you claimed victory and used your machine as a continuing monument to your success.

In 2001, when I switched from Linux to OS X, the mere fact that I could buy a machine, turn it on, open a terminal, and type 'ls' was enough to make me switch. It still took hours to get a portable machine working reliably with Linux -- and at the time it took two machines if you wanted wifi to work: one to get wifi working, and another to look up the documentation and download whatever you needed.

Even now that I'm back on Linux, I don't consider buying anything other than a ThinkPad, for fear of returning to that hellish place.


This is pretty much spot on.

I believe that the lack of "strong-D" Design you refer to is because Linux's desktop is pretty much the ultimate committee-driven project. It's very democratic: great for preventing abuse, but rather bad when it comes to vision.

Most of us here know how difficult it is for one person in a company to drive an idea through without it being diluted; countless product companies have gone bust over the years because of committee-induced paralysis. It's much worse when you have a distributed group of volunteers who rarely, if ever, get to meet.

Maybe collaborative development is incapable of producing something as complex and difficult as a clean, well-designed desktop environment and API? Forking, competing spinoffs, factionalism are all useful things in some ways, but they're death to this kind of project.


> Maybe collaborative development is incapable of producing something as complex and difficult as a clean, well-designed desktop environment and API? Forking, competing spinoffs, factionalism are all useful things in some ways, but they're death to this kind of project.

The Rails community seems to manage.


Well, Rails is a web framework. It's not an end user desktop environment like Gnome, KDE, NeXTStep, Windows or what have you.

Collaborative development does rather well with developer tools.


Sure... so why the culture difference?


Because developers understand what they want and need from their own tools.

Now, they might have different wants, and disagree on how to meet their needs, and march off in 800 different directions, but ultimately with enough effort any one group is capable of meeting their own needs.

UI/UX that works well for "normal people" is something that most developers don't understand; they know the UI/UX that meets their own needs, but developer needs are quite rarefied and abnormal compared to the typical user's.

Thus, when they respond to disagreement with a UI/UX engineer by marching off in 800 different directions to scratch their own itches, they end up with no solution that meets the "normal user"'s needs. There's no culture of shutting up and following orders for the greater good of the project.

tl;dr you can't solve the problems of someone different from yourself if the only thing you do is scratch your own itches.


Nicely put. Seems reasonable.


Rails has a pretty strong leader in DHH.


But there are strong personalities in the Linux community as well.


I would bet that if you asked people what they don't like about their Linux setup, design would come last. The main issues I hear about are compatibility and missing obvious functionality (oh, clipboard). In fact, most desktops are so configurable that you can actually make them look great.

If you want to make a system popular, start with the people who are actually using it. Linux is used by technical people, so why are they putting us off with silly animations and glossy layers that will break in the next upgrade? OSX is not easier to use either, it has many nonintuitive quirks but they stick to their design and improve it instead of overhauling it.


Right, because the people who use Linux don't understand what design is.

Look at the responses to my comments -- almost all of those defending Linux show that their authors don't understand what design is. They think design is how something looks.

>OSX is not easier to use either, it has many nonintuitive quirks but they stick to their design and improve it instead of overhauling it.

I suspect you simply haven't used OS X much. It was designed correctly and is intuitive. The reason you think it's "unintuitive" is likely that you've been trained on a broken system that was not designed correctly, and thus what feels "natural" to you -- which is actually learned behavior -- feels wrong on OS X.

Meanwhile, Apple has continued to improve their design, and in the rare case where they got something wrong, they fixed it -- for instance, they reversed the direction of scrolling to make it more intuitive: move your fingers up on the trackpad and the page moves up on the screen.

They don't need to overhaul it because they have been improving it for three decades.

They've done such a good job of it that you think they haven't done anything!


I said there are some quirks; OS X is very usable in general, although first-time users do have to learn the ropes. For example, maybe everyone likes the fact that the maximize window does different things depending on the app, but I don't, and I hate using keyboard shortcuts. I didn't mean to be anyone's advocate, as you seem to be.


"For example, maybe everyone likes the fact that the maximize window does different things depending on the app"

It's not a maximize button, it's a zoom-to-fit button. If you want maximize, there's full-screen mode.

"hate using keyboard shortcuts"

Macs have lots of keyboard shortcuts.


At the risk of re-igniting that flame war, I posit that this is also a cathedral vs. bazaar issue. OS X is more or less centrally planned, by one company, Apple.


I would think so, but OS X has had several great or important applications within its ecosystem that break UI conventions.

Twitter clients used to be a playground for new UI concepts, none of which came from Apple. The Sparrow mail client was a good example of something probably more influenced by Tweetie than anything else.

Even Apple doesn't have a ton of consistency. iTunes continues to baffle me with random changes to things like what resize does. The App Store app shifted the window-management buttons.

So I don't buy that the OS X ecosystem is a "cathedral" at all, just a bazaar with better taste.

What's interesting is that I noticed the standards of design for Mac apps were significantly better than Windows apps when I "made the switch" several years ago; and these are for apps that were not coming out of Apple. I could never entirely figure out why; the ecosystem was just more attuned to details.

I posit that Apple was helped by having the right people excited long enough to get a great ecosystem going.

Linux didn't die because it's a bazaar; every modern app ecosystem (OS X, iOS, Android, the web) functions like a bazaar. Linux just made it absolutely painful to set up shop.


There is nothing stopping people sourcing the parts from the bazaar and using them to build a cathedral.

After all it's exactly what NeXT (and Apple) did.


Yeah, but you need an architect that people will listen to in order to build a cathedral. It has to be many people, united under a common vision. That's really hard to pull off in the Linux community.


This is a reason for poor design, not an excuse.


Maturity is the driving factor. Linux has a lot of immature attempts at a desktop. OS X has one, but not a lot of time or manpower has been spent perfecting it (Xcode sucks, the core APIs are weak/incomplete, OS support for fundamentals is lacking).

Windows is nothing but mature. Billions of hours running, automated bug reporting, millions of developers, nth generation tools.


> Windows is nothing but mature. Billions of hours running, automated bug reporting, millions of developers, nth generation tools.

And yet it still BSODs on me too often to be actually usable.


Bad hardware/drivers are more likely to be the culprit...


When something doesn't work or works badly on Linux, the critics always say it's Linux's fault (just read the comments here). Hold Windows and OS X to the same standard and you'll see they aren't so wonderful in comparison.


Sorry to hear about that.

Windows isn't a closed hardware platform like OS X, so driver support can be difficult (Windows update helps a lot on that).

Also, are you overclocking? Using a four-year-old motherboard? There has to be some reason for the BSODs -- they're not a normal condition. I'm not making excuses; hundreds of millions of machines run for years without one.


> Windows isn't a closed hardware platform like OS X

I use Linux.


In the early days, say, Windows 95, it did BSOD a lot for seemingly random reasons.

Nowadays, with the experience of handling dozens of Windows machines both server and client, I can tell you with a high degree of confidence that BSODs happen only when hardware is dying or a driver is buggy. The flavors I can confirm this for go from XP SP3 to Server 2008, passing through Server 2003, Vista and 7.


I completely endorse this, and wish I could hit the "up" arrow hundreds of times over. It's also refreshing to see a comment like this that isn't immediately shot down. It's hard to overstate the importance of actually having things written by someone with at least a sense of what it takes to make an intuitive and pleasant-to-use interface.

The problem Linux seems to have is that everyone wants everything, so everything gets thrown in, with configuration options to change absolutely everything.

I don't think, however, that the lack of good design -- or even of seeing the need for it -- is isolated to Linux. In a very real sense, most of the population is usually oblivious to design; they just know whether something is easy to use or not.


This mythical user must also be the bottom of the barrel. I don't think the Apple designers ever consulted me when they designed OS X. Who were they designing for? Everyone? How can they possibly conceive of what every user wants -- every user, apparently, but me?

This "design," argument is a floppy, magical, hand-waving fish. I know what design is. I design software all the damn time. I am sensitive to interface and expectation. I want as little friction between my tool and the thing I am creating.

In that regard, as a software developer, OS X is a piss-poor experience. Using emacs on the thing is ripping years out of the cartilage in my fingers due to the masochistic meta-key layout that I've had to accustom myself to. It's terrible.

The ecosystem of package managers on OS X is infuriatingly shoddy. Between MacPorts, Homebrew, and Fink, none of them seems to get it quite right. They have to compete with the userland Apple ships. Half the packages are broken. And I can never bloody well find any configuration files once something has been installed. It's ludicrous how much time I've wasted trying to figure out how in the seven hells you're supposed to get MySQL or Postgres to run on the damn thing (there are at least half a dozen ways by my reckoning and Google's).

Their /design/ isn't the best design there is. They are designing for someone, but it surely isn't me. If you happen to fall into their mold then I'm sure you're as happy as a clam, but don't assume the entire world is just like you.


Thank you. All too often that entire line of thinking was minimized into "caring about eye-candy" and summarily dismissed. That irked me. It was as if, with the Linux desktop, you had to choose a side, instead of simply marrying what is great about GNU/Linux with what is great about a cohesive, modern design and user experience.


The last time Linux on the desktop looked and felt good was when Enlightenment DR16 came out. When I installed Ubuntu last year I felt like I was using something for children.


> Apple's work looks pretty

To you. Not to everyone.

> Look at the UIs of Linux... they didn't design one, they just copied windows.

Yes, because ratpoison looks just like Windows 95. Just like.


>I think he's right, but I think he's missing a key point. Design. Design is what killed the linux desktop. It never had it. OS X has it. Even windows, crappy as it may be, has it.

I'm not sure he is missing it, even if he doesn't name it explicitly in the post.

For one, he was always in favor of good design (and of bringing the well-designed parts of other systems to Linux), but also, when he laments the anarchy and lack of long term planning on the Linux side, he is essentially lamenting the lack of a central design authority.


> when he laments the anarchy and lack of long term planning on the Linux side

Maybe that is what really makes Linux struggle as a desktop: fragmentation. It really sucks to have to install hundreds of libraries to run an app because you run Gnome but the app was made for KDE. It's always uncomfortable to change environments, because it's not just different looks; it's different ways of doing things.

Some users and hackers do like choice, but most common users don't, because consistency reduces the number of things to learn. You don't have to copy MS Windows or OS X to be good, but consistency is good, and unless you stick to one subset of the Linux desktop (and its related apps) you will be missing a more consistent user experience.


Apple builds fancy gadgets, gathers a fan-boy population, and eventually starts selling more. This really doesn't say anything about the Linux desktop.

This whole thing about backward compatibility, and the discussion that surrounds it, is just vague. Here's a practical "true story" for you: I've been using GNU/Linux for more than 10 years now, and it is still alive.

I've never had any of those vague binary-compatibility problems either, because I'm not strangely expecting to run an ancient binary of Gimp on my current system. That's because FOSS is source-oriented, not binary-oriented. Nor am I suddenly trying to use a 15-year-old graphics card whose driver is no longer in the kernel, because I don't use a 15-year-old graphics card.


Well said; the Linux user is mostly a DIY developer like me. I enjoy the fact that what I run on my laptop runs on my servers and everywhere else. Windows can't do this (so I was running Windows Server on the desktop), and OS X is dead on the server. Linux is pretty much the only sane option for independent developers.


I'd say it was the lack of a standardized install convention for "guest" (non-nix) software.

What to do about it? Couple golang's preference for large statically linked binaries with a one-folder, one-executable install convention, and Linux may become more inviting for non-nix apps.

For example, imagine "/outside/myapp/myapp" is a large, unix-unfriendly, statically compiled binary placed in its folder by an OS-provided install utility. "Myapp" was probably developed for Mac or Windows and by design does not give a damn about /etc, /lib, /var, etc. Such apps should just be allowed to crap their configuration files into the folder in which they have been placed ("/outside/myapp"). If you no longer need the app, delete the folder, along with everything else the app created while it was being used. Tidy. Behind the scenes such an app would be compiled to call the standard Linux APIs, yet it would probably avoid any dynamic dependencies. Disk space is cheap. Just bundle it all together and throw it somewhere it can run in peace.
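
To make that concrete, here's a minimal sketch in Go of such a self-contained guest app -- the paths and the myapp.conf name are purely illustrative. Everything it reads or writes stays next to its own binary, so deleting the folder uninstalls it completely:

    // myapp: a self-contained "guest" app. Binary, config and state
    // all live in one folder (say /outside/myapp), so removing that
    // folder removes every trace of the app.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        // Find the folder this binary runs from,
        // e.g. /outside/myapp for /outside/myapp/myapp.
        exe, err := os.Executable()
        if err != nil {
            panic(err)
        }
        dir := filepath.Dir(exe)

        // Config sits beside the binary -- never /etc, never ~/.
        conf := filepath.Join(dir, "myapp.conf")
        data, err := os.ReadFile(conf)
        if os.IsNotExist(err) {
            // First run: write defaults into our own folder.
            data = []byte("greeting=hello\n")
            err = os.WriteFile(conf, data, 0644)
        }
        if err != nil {
            panic(err)
        }
        fmt.Printf("loaded %s:\n%s", conf, data)
    }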

Amiga's icon files are another approach. Rather than a large, monolithic registry tracking everything in the system, each executable exists in tandem with an "icon" (.info) file. This file is generated by the OS and tracks the executable's location and other settings in the Workbench (desktop). A modern reincarnation could potentially track anything. Instead of accumulating registry filth with every uninstall, one can simply remove an executable and its associated .info file. Instead of adhering to the hier(7) convention, the app plays nicely in its own folder with its own registry. By using an ".info" file, portable non-nix installs could reside anywhere, not just in a prefabbed "/outside" folder.
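
A sketch of what a modern .info generator might look like, again in Go -- the JSON schema here is entirely made up for illustration:

    // mkinfo: hypothetical modern take on Amiga's .info sidecars.
    // Writes a small JSON file next to an executable recording its
    // location and desktop settings. There is no central registry:
    // deleting binary + sidecar leaves nothing behind.
    package main

    import (
        "encoding/json"
        "fmt"
        "os"
        "path/filepath"
        "time"
    )

    type Info struct {
        Path      string    `json:"path"`      // where the executable lives
        Icon      string    `json:"icon"`      // icon shown on the desktop
        Installed time.Time `json:"installed"` // when the sidecar was created
    }

    func main() {
        if len(os.Args) != 2 {
            fmt.Fprintln(os.Stderr, "usage: mkinfo <executable>")
            os.Exit(1)
        }
        abs, err := filepath.Abs(os.Args[1])
        if err != nil {
            panic(err)
        }
        data, err := json.MarshalIndent(Info{
            Path:      abs,
            Icon:      "default.png",
            Installed: time.Now(),
        }, "", "  ")
        if err != nil {
            panic(err)
        }
        // Sidecar sits beside the binary, e.g. /outside/myapp/myapp.info
        if err := os.WriteFile(abs+".info", data, 0644); err != nil {
            panic(err)
        }
    }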

The smartphone penchant for portable installation should come to nix, particularly for non-unix software. It should be encouraged -- and that's coming from an OpenBSD user. Unix needs a playground for non-unix apps.



