> The second dimension to the problem is that no two Linux distributions agreed on which core components the system should use.
Linux on the desktop suffered from a lack of coherent, strategic vision, consistency and philosophy. Every engineer I know likes to do things a particular way. They also have a distorted view of the level of customization that people want and need.
I like OSX. Out of the box it's fine. That's what I want. I don't want to dick around with window managers or the like. Some do and that's fine, but almost no one really does.
Whereas Windows and OSX can (and do) dictate a top-down vision for the desktop experience, Linux can't do this. Or maybe there's been no one with the drive, conviction and gravitas to pull it off? Who knows? Whatever the case, this really matters for a desktop experience.
I have two monitors on my Linux desktop. A month ago, full-screen video stopped working. Or I guess I should say it moved to the center of the two screens, so it's unusable. I have no idea why. It could be an update gone awry. It could be corp-specific modifications. It could be anything. But the point is: I don't care what the problem is, I just want it to work. In this regard, both Windows and OSX just work. In many others too.
I can't describe to you how much torture it always seems to be to get anything desktop-related to work on Linux. I loathe it with a passion. I've long since given up any idea that Linux will ever get anywhere on the desktop. It won't. That takes a top-down approach, the kind of thing anarchies can't deliver.
Just because it's never happened to you on OS X or Windows doesn't mean it doesn't happen. OS X 10.6.7 broke output from my 13" MacBook to either of the two external displays I own. Both worked fine previously, when booted from the install CDs, or from Linux on the same machine.
Plugging in my Firewire audio interface on the same machine spun the CPU up to 100% and kept it pegged there. A lot of good having a nice mic pre-amp does when you get a high pitched fan whir added gratis to all of your recordings.
It's silly to pretend that Mac is somehow perfect in these matters. In my experience it's only been marginally better than Linux, if at all. And with Linux you have some hope of finding a solution, whereas for OS X you're pretty much hosed.
Straw man. Nobody said OS X was perfect.
> In my experience it's only been marginally better than Linux, if at all.
I used Linux on my desktop for several years and have now used Macs as well for several years. I won't say this is bullshit because I don't think you are lying about your experience, but I think you are extrapolating way too far.
> And with Linux you have some hope of finding a solution, whereas for OS X you're pretty much hosed.
Forums and mailing lists are "hope for a solution" while the genius bar is "pretty much hosed"? How on earth did you arrive at this conclusion? That just doesn't seem reasonable.
As for fixing the issue, both of my issues were kernel issues, as are most hardware compatibility issues. On OS X reverting to a previously working kernel would have meant backing up all of my data, reinstalling the OS from the DVDs, installing the combo update to the last working version of the complete OS (10.6.6), restoring my data (but not using Time Machine) and then never allowing another OS upgrade, including not updating the apps that come with the OS.
On Linux it would have been a matter of downloading the previous kernel package and installing it. That's it. The package managers are usually smart enough to figure out that you've forced a previous version of the package and won't replace it in future updates.
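On a Debian-flavored system, for instance, that's roughly two commands. A sketch, with the package name purely illustrative (substitute whatever kernel last worked for you):

    sudo apt-get install linux-image-3.2.0-4-amd64
    echo "linux-image-3.2.0-4-amd64 hold" | sudo dpkg --set-selections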
Neither solution would have been doable for a novice, but the Linux solution is far easier for an expert and a generally more palatable solution. And note, this was Apple's OS breaking Apple's hardware.
That said, maybe Linux in its current stage isn't destined for the pick-up-and-use approach of Windows or OS X. I believe that Linux is not following the same path as Windows or OS X, and because of that it may not end up at the same destination.
I agree with the original article; Linux is produced differently, often with different goals than Windows or OS X. Trying to shoehorn it onto the same path as Windows or OS X may not prove fruitful until someone comes along with the drive to move everything onto the design path that Windows and OS X are taking.
Step 1: have you tried rebooting the machine?
Step 2: have you tried reinstalling Windows?
I thought those days were over.
You also said "even compared to Linux". That's interesting. That statement gives the impression that you're more interested in defending your own beliefs and/or choices than in explaining to us which OS has objectively terrible hardware support. "Even compared to Linux" sounds like something an apologist would say. Then when you added the "think different" line you made it seem even more like your comment was based on some kind of blind loyalty to Linux rather than loyalty to facts.
Me? I've used Macs, a handful of Linux distros, and Windows for a long time. I don't know which has the best or worst hardware support, but I do know when someone says something based on what camp they're in rather than what the facts are.
Don't read too far into things, and "...loyalty to facts" eye roll.
Anyway, it's pretty common knowledge to Linux users that hardware/driver support on Linux can be a challenge. Sometimes things work right out of the box, other times you have to do a lot of work and a lot of Googling.
That being said, I don't agree with the parent. For basic hardware needs, Windows and OSX have a fairly high success rate of plug-and-play functionality. For me on Linux, it's about 50/50.
In Linux you very rarely have to install any driver. True, some hardware remains unsupported, but the list of hardware that's compatible without any installation required is pretty long. On Windows, you'll need to install stuff for most new hardware you try to plug in, no matter what.
Anyway, this has been almost 100% true from 10.3 through 10.6.x. Recent developments on both Apple's and MS's side have closed the gap between the two systems quite a bit. I still prefer OSX, but it's gotten close. The real reason I continue to buy Macs nowadays is the hardware quality.
This. It's almost as if Linux is a collection of parts and tools from which a sufficiently good designer can build a usable desktop whereas OS X is a usable desktop. That's an exaggeration, but for me there is some truth to it.
Note that Linux was my only desktop from 1997 through 2009.
This has been highly successful, only I'm afraid that the Unix people inside Apple are losing influence. The sandboxing in ML looks like a mess to me. Not that it's a bad idea per se, but look what it did to our beautiful '~/Library/[ApplicationSupport|Preferences]'! It's tacked on, and it's obvious that this has been a political decision from management rather than engineering. Makes me angry.
On a different note: I've never really used it extensively because I switched to Macs in '05, but wasn't Ubuntu pretty close before they switched to Unity? I remember that the last time I installed it, for the first time I had a Linux that worked out of the box, including network and graphics drivers.
Which is fine, it's Apple's decision to take, but it's no longer for me.
(As for Ubuntu - I'm using pretty standard hardware and never had a problem)
I'd say the hardware support is fine for what little customisability Apple allows on the hardware that they sell. Perhaps the Mac Pro might have some issues with internal cards (I honestly wouldn't know ...), but those devices are not affordable for 99% of the people.
Low end printers and scanners in particular are notorious for skimping on the logic in the device itself and depending on a large blob of Windows driver code, with manufacturers that couldn't care less about supporting anything other than Windows.
In my experience only Windows 7 has passable MTP support. On Linux I had mixed results with gvfs and mtpfs (slow and crashy). jmtpfs is a nice replacement, though. Google also released an application for OS X, but I can't try it since I don't own an Apple computer.
My solution to the problem was to install a WebDAV server app on my device (Galaxy Nexus) and access it wirelessly.
Support for the USB Mass Storage protocol has been deprecated in Android in favor of MTP (Media Transfer Protocol) starting with Ice Cream Sandwich.
The experience in Windows and on Linux is just as poor with this phone. (I love the phone, though.)
This is 100% to blame on Samsung, sorry.
My Samsung phone is seen as a USB device (and it works on Mac OS X), but maybe in the newer models they removed this functionality (and called it a feature).
People may complain that iPhones need iTunes, but then again, it's iTunes, not the gigantic pile of crap that is Kies.
It was a pure engineering tradeoff to improve things in the long term. If anything, OEMs would have preferred the old just-works, sub-optimal, gives-an-excuse-to-obsolete-a-phone, lower-support-cost solution. The ball is now in the operating systems' court to support this standard in a way it wasn't envisioned to work a few years ago.
Even Ubuntu has been slow on this front.
OS X has perfectly fine support for media devices, and has worked fine with every digital camera, SD card or other card reader, etc., that I've attached via USB in the past decade!
Which is the right move in the long term, although it's certainly causing some pain in the short term.
I have seen problems with third-party hardware that required its own (generally poorly written) driver, or that was half-assed crap designed to work with Windows but marketed as supporting standards like "USB".
Meanwhile, Linux has trouble dealing with the computer itself, let alone getting hardware attached to it. Constant pain in the ass.
Currently I'm in a country in a different hemisphere from the USA. I walked into a store and bought an off-the-shelf no-name-brand DisplayPort-to-HDMI adapter. Worked no problem.
I've gotten used to being able to do that... and haven't had much trouble getting things to work with OS X in years.
'Course, I also stopped buying things that require custom drivers -- that's a clear sign you're not going to have a good time.
To claim that Linux is better in this regard is, quite frankly, asinine.
Your comment looks about as asinine to me as the parent comment does to you.
That really has nothing to do with the OS you're running. The OS isn't driving the DisplayPort<->HDMI protocol conversion in any way (your 'no-name' brand adapter is probably relying on DP++ and just passing through the HDMI signal anyway).
This is like saying your VGA cable is compatible with Macs with a VGA port. Of course it is.
The problem is that hardware often doesn't really conform to standards, even those it claims to.
One of the reasons device support in Linux (and other OSes which target a wide range of platforms) is difficult, is that it's necessary in practice to cope with and have workarounds for buggy and out-of-spec hardware and firmware. Just "coding to the standard" isn't good enough.
Such buggy hardware/firmware is rarely documented as such, and finding these problems and the appropriate way to handle them is painful and difficult work. In some cases the only practical way to figure out what actually works is to reverse-engineer what Windows does (the hardware manufacturers generally make sure that Windows works with their hardware, but rarely make such information public).
Apple's main goal is their own hardware, over which they obviously have a lot of control and information, so they really don't need to worry so much about this.
A device can follow the USB standards (and be labeled as such) while still requiring a special driver. The two are not mutually exclusive.
Having said that, barring a few "recent" desktop distros, as they're called, like Ubuntu, Linux has always been a bottom-up approach. You need to install the base, then the userland, and so on. For someone who is looking for ultimate customization, Linux makes sense.
But when I am running a notebook, I seriously want everything to work, albeit at lower performance/customizability.
I always run either FreeBSD or Gentoo on my servers. But my notebook/desktop continues to be OSX.
My own experiences aren't great with Windows, though they're smoother on OSX than Linux. The problem I have with Windows is that stuff does break, but it's inscrutable how and what to do about it, whereas at least on Linux there is usually some way to do something about it if you're tech-savvy. If anything gets borked in Windows, it's wipe-and-reinstall time; back when I used it full-time I probably did that once a year on average.
I find OSX to be poor for package management, though. I've gotten so tired of headaches with installing and managing updates of stuff, and badly interacting upgrades of the OS and separately installed packages and system versions of python/ruby/etc., that I do most of my dev work in Debian in VirtualBox now. In that aspect, it's Debian that's centrally managed with real release-engineering and integration tests, whereas OSX is an anarchic mess of 3rd-party software that nobody's responsible for integrating and testing.
In any case it's all dumb blind loyalty the vast majority of the time. And that applies to Linux, Mac, and Windows. You can't have a discussion about operating systems with 90%+ of people because everyone just defends their favorite and facts are distorted to suit those preferences. It really is just about useless to have a discussion about operating systems as "substantive" or "meaningful" discussions about the subject are about as rare as unicorns.
But if you are an Apple pet developer, then you'd just use Xcode, Xcode projects, etc. for all your needs. And you'd be very happy with that.
And well, not using the native toolchain and using something like MacPorts feels like I'm under Cygwin all over again. No, thank you. I love the 13" Air, but as a development environment Macs are quite broken.
Plus that Home key on the full desktop keyboard... If you are a Linux+Mac user, you know what I am talking about :)
Curious, what package management system are you using to install libraries/packages? Ports? Fink? Homebrew?
Same applies to libraries, you can get everything you want with apt-get, as long as you only want things for your architecture. If you need them for another architecture, you are out of luck.
For operating systems you use virtual machines. I use the host just for running my text editor and for my Unix environment needs. All operating systems I've worked on (Solaris, Windows, BSD, QNX, some other real time embedded stuff) have their own tools and compilers used to build the system.
MacPorts, Fink and even Homebrew are FUBAR, I'm glad I don't have to use them.
Without qualifiers, that statement is meaningless. Perhaps you mean it's horrible for some devs? (Millions of others are doing just fine with Visual Studio).
For regular users, dropping them into Linux will be like replacing your mom's car's controls with the controls found in an airplane cockpit (the command line), or playing "who moved my cheese?" with the Linux desktop. Granted, everything is moving to the web these days, so why not just give them an iPad or a Chromebook?
1.) Grandmothers and the like who just want to read their mail and check the news.
2.) People like me who can and will use 4 + 2 + 1 + 0.5 hours to customize it over the course of 6-12 months to have an almost perfect working environment instead of living with a slower OS without virtual desktops.
The only thing I can think of that needs tinkering is some online banks that require applets to work, but then again even Windows doesn't come with Java preinstalled.
I know lots of people who successfully installed Ubuntu that don't know what a boot sector is but are still happy with it. (After all most people use web apps anyway.)
And man, tell me, where should I have gone to get any information about why my _wired_ Ethernet connection wasn't working on a Mac when it worked everywhere else?
[UPD: I'd give iOS apps 2/5 on the consistency scale, Android, Linux and OSX 3/5, Windows 4/5].
I can't say I've really touched Python at all, but as a relative newcomer to the Ruby world I found getting RVM set up and functioning properly on my Mac to be a complete non-issue.
It's also true that Homebrew (and MacPorts, but not Fink) is doing something fundamentally less ambitious than the main Linux package managers: it offers a simple dependency system and scripting for compiling stuff, rather than directly installing the end product.
Now I don't really expect linux volunteers to do this, but equally I don't really expect any linux distro to provide a coherent business desktop because they are not talking to their "customers" in the same way that their competition is. And the business desktop is a large market, if a bit unsexy.
Well, sure, but (unless I am mistaken) that is no longer on the topic of the Linux desktop.
I know that many of these are trivial complaints and that all of these things can be fixed. I no longer want to have to do this however. In my teens I loved tinkering with the linux desktop, designing 'awesome' Enlightenment themes and enjoyed tracking down and fixing these little problems. Now in my 30s I really just want the OS to get out of the way and let me code. OSX does this well, linux really does not. I say this with a fair bit of sadness as there are many things that I like about linux a great deal - the cli, the excellent package management, the ethos behind the OS and the open source community.
And I've been really, really impressed with the latest Thunderbird/Lightning releases (14-15). Haven't had any problems at all, like I had with the older releases from several years ago. I used to be a Mail guy, but Thunderbird blows it out of the water these days.
I say Mark Shuttleworth is doing this, and no one is paying attention. Unity was a unified vision; it was just different, and no one wanted different. And that is fine for the new target audience he wanted for the OS, which was everyone else in the world. Look at what they are doing now: Ubuntu on tablets, Ubuntu TV. They are pushing a unified vision, and nobody in the Linux community is getting behind it en masse because it isn't everything they wanted. They want Cinnamon, they don't like Gnome 3, where is my Gnome 2, etc. The inherent strife is the problem, not a lack of visionaries.
Unfortunately, I think they kind of snatched defeat from the jaws of victory by attempting to go to a "modern user interface". The big problem is that neither designers nor programmers really enjoy incremental progress (designers want a big canvas and programmers want clean code). Gnome 2 really was/is good enough for most people to use easily AND it was/is simple and powerful enough not to stand in the way of the existing users (programmers, Linux geeks, Linus himself, etc). Whether Unity and Gnome 3 work for average people or not, they certainly alienated the existing Linux desktop users, and that is an important segment.
I used Linux as my only OS, sans VMs, from roughly 2008 till 2011. I finally gave up because it handled monitor switching and multiple monitors so poorly (using an Nvidia GPU). I begrudgingly switched to Windows because I owned the hardware already and it was the only option. Windows sucks; the lack of a usable shell kills it, full stop, nothing else Windows does matters. And no, Cygwin doesn't cut it.
Fast forward to now, the only computer I manually interact with is a macbook air. The old desktop hardware became my file server(running FreeNAS) despite being many times faster than the air.
Through this experience I realized what I actually cared about... I run four 'apps' 99% of the time: Chrome, Sublime Text (everything non-Java), iTerm and IntelliJ (Java). That is really all I care about; anything that prevents interacting with these applications as quickly as possible fails. The support stuff (git, a database, a message queue, etc.) runs anywhere. I _rarely_ interact with the 'OS' in UI terms. I rarely use Finder, and I only use the Dock to empty the trash (I launch everything through Quicksilver). At the end of the day I love OS X not because it has 'awesome' UI paradigms, but because of all the options I see it the least; it may as well not be there.
OS X is the only OS I've ever used that I just don't think or care about. It sleeps, it wakes up, it deals with new screens, it does all the OS shit so I can just use the apps I want to. That's why I am hooked on using it as a dev machine.
It does have some rough edges that may cause you to dismiss it as unusable, but it offers tab completion, colored text output, and a real scripting environment (not DOS batch scripting). Every serious Windows developer I've talked to nowadays uses it exclusively.
No, and I have zero reason to do so. I don't deploy on Windows, and I have no reason to ever deploy on Windows, so why would I bother learning another syntax just to develop on Windows? It really doesn't make any sense. PowerShell only makes sense for IT admins who have to manage Windows networks; it makes no sense at all for developers deploying on *nix.
Can't speak to the seamless suspend/resume support, etc on your choice of hardware, and I acknowledge that if that kind of stuff doesn't work, it'd be a dealbreaker.
Just saying, Gnome3/Shell is at least worth checking out if you like the OSX workflow and use just a few apps day to day for development.
Thanks for the writeup of your experience!
Yup, still no xrandr support which makes any on-the-fly monitor config changes very difficult or impossible.
Sometimes, if you use twinview and nvidia-settings from the start and if the moon is aligned just right, you can dynamically configure monitors / video ports. Sometimes.
So obvious... Linus promised Linux on the desktop around 2000. The reality was different: influential groups just ignored the desktop. Now it's too late; the web is becoming the predominant platform, and operating systems are just a commodity. For me there's no big difference between using Linux and OS X (with coreutils etc. installed). In fact I wouldn't even mind working on Windows, but unfortunately I don't have the patience to set up a reasonable Unix-like dev environment.
The irony is: it has never been less painful to switch to Linux.
This is so true. Linux has a bad rap. I gave up on it a couple of years ago due to the infamous "hidden cost of linux". I switched back a couple months ago and have been pleasantly surprised how smooth things are now. A lot has improved in the last couple years in Linux land.
Which version of which distro?
Is this in Chrome on sites like YouTube? If so, I believe it started with Chrome 20 when Chrome switched to its native Flash driver (see http://productforums.google.com/forum/#!topic/chrome/Mi-YgjN...). This happened to me too, and I still haven't resolved it.
Never underestimate the fear of the consumer that they will break stuff on a computer they are unfamiliar with. The fact that Windows is actually getting better may be a good thing for other possibilities in desktop software.
cletus you are one of the few people with enough karma to always grab and hold a top post spot who _actually says things worth saying_.
As for "someone with the gravitas", never say never. But they might not use the Linux kernel and GNU as their "clay" or "base material".
I've thought about such a person, and one thing I believe is that they would be foolish to try to "sell" such a simple-to-use system to Linux users (many of whom would, alas, include engineers like the ones you describe: set in their ways), or to those engineers who worship Apple, with their belief in some mythical "user experience" [translation: _their_ experience].
In my opinion, a person with the gravitas would also need the vision to see that the target user base that is open to change lies elsewhere. The users need to come with an open mind.
I don't know what will happen to Linux. But the binary blob problem keeps getting worse. Too many Linux users are happy to accept the blobs. They don't want to read source code. They just want working devices. That's understandable. But in my opinion it's not a small problem in the long run.
Apple, on the other hand, is flat out abusing its power. Not only over end users but over developers (who are of course just a particular class of end users). They are standing in the way of people's general education about computers. Keeping everyone dumb may give some developers a warm, fuzzy feeling as they watch their bank accounts grow to new levels, or dream of it, but I do think Apple's conduct, seen for what it is, will trickle down past just us fanatics on HN and elsewhere on the web. People are going to figure it out.
If I was a monopoly-lover, if I loved to see "winner-take-all" in IT, as if that improved the lot for any of us (I am not and it doesn't), then my money would go on Amazon. They seem to be making the fewest mistakes. Great things will come from AWS. It is the democratisation of hosting - a sharing of power, risks be damned (unfortunately, it may also mean the monopolisation of hosting as we've never before seen). In "Dropbox", we are only seeing the very beginning of what's possible.
In my view, Apple is a "disabler" (everything they introduce is restricted) while Amazon is an "enabler" (generally: they open many more doors than they close).
We must be doing different things, because I've been on Linux desktops with zero-to-minimal effort since 1999.
And clearly OS X is an extremely polished Unix and is going to appeal to the more UI-focused of the hacker set. And Miquel is definitely among the most UI-focused of the hacker set. He's also an inconsolate "platform fan". Much of his early work was chasing Microsoft products and technologies, of course; now he's an iPhone nut apparently, and that doesn't really surprise me.
But at the same time the Linux desktop was never really in the game. I use it (Gnome 3 currently) and prefer it. Lots of others do. For many, it really does just work better. But in a world where super-polished products are the norm, a hacker-focused suite of software isn't ever going to amount to more than a curiosity. (And again, I say this as someone who will likely never work in a Windows or OS X desktop.)
So in that light, I think the idea that the Linux desktop got "killed" is sort of missing the point. It's no more moribund now than it was before. It's more fractured in a sense, as the "Gnome" side of the previous desktop war has split into 3+ camps (Unity, Gnome 3 and Gnome2/Xfce, though there are other splinter camps like Mint/Cinnamon too). But it's here and it works, and it's not going anywhere. Try it!
I strongly disagree. It is losing exactly the sort of person that the author is: developers who, all else equal, would rather use Linux. But who eventually get tired of the BS and just want something that works and you can actually get software for. I have a lot of sympathy for that.
I write this from a Linux laptop, but that's more out of mulish stubbornness and 15 years of accumulated irritation with Steve Jobs and his dickish business practices.
The last time I got a new laptop I knew I didn't have time to screw around for days with X configuration files. And so I paid a vendor several hundred dollars over list to give me a laptop that JFW. And despite that the sound is still way too quiet. After a few time-boxed 2-hour excursions into whatever sound system they're using this week, I still can't fix it. I've given up.
The only legitimate reason I have for staying on desktop Linux is that I code for Linux servers, and I think it's impossible to really understand system performance if you're not running the same OS. But even that seems shaky to me; hardware keeps getting cheaper and developers keep getting more expensive, so it just doesn't matter as much.
One day some bright Linux spark is going to "innovate" again in a way that I'm expected to put up with their rough edges for 6 months (hello, Unity!) and I'm going to say fuck it and buy a Mac because I just don't have time to screw around right then. Or maybe I'll just want to watch a Netflix movie without hassle, or play the video game my pals are all talking about. And maybe by then it will be a fancy dock for my 8-core Android phone.
Overall, I agree with his point, except that I don't think Linux on the desktop is so much dying as cutting its own throat.
But for the past two and a half years I've used Ubuntu with Xfce, and those problems have become a distant memory. Nothing breaks, nothing gets changed on me (Xfce has moved a few things around, but nothing too terrible). No one forced me to switch to Unity, so I didn't. For me, Linux is more usable and stable than it's ever been. I also seem to see more people running it on the desktop than ever before, and in fact stats from Net Applications show a 50% rise in market share in 2011: http://www.zdnet.com/blog/open-source/is-the-linux-desktop-a...
Inside the Apple-centric echo chamber of HN, it's easy to believe that all the developers have moved to OS X and desktop Linux is dead, but I disagree. Despite its problems, desktop Linux is secure in its (small) niche.
I don't really believe developers are moving to Apple gear because of HN. I believe it because last time I went on a hiring spree, a lot of people said, "Oh, you're running Linux? Do we have to?"
Fair enough, but why not wait to use it until they get there? Daily reliability problems seem like a lot to put up with just for some cool ideas.
Also, I'm not sure how your interviewing experience supports the conclusion "Linux is dead/dying". What it does seem to suggest is "Linux is unpopular", but that has never not been true (on the desktop).
My previous interviewing experience wasn't like that say, five years ago.
Already installed stock Ubuntu? Install the xubuntu-desktop metapackage (in Synaptic or 'sudo apt-get install xubuntu-desktop'), log out and select "Xubuntu Session" from the login screen. That's it. I did this on my laptop after realizing I didn't want to switch to Unity. (For bonus points you can remove the Unity desktop apps, but that's hardly necessary unless space is at a premium.)
I don't think I'd put up with Linux myself if it was as much of a pain as you seem to think. For me, Xubuntu has been no trouble at all.
All I do is have this line in my .xinitrc:
[[ -x /usr/bin/start-pulseaudio-x11 ]] && start-pulseaudio-x11 &
# pacman -S pulseaudio pulseaudio-alsa
# pacman -S pavucontrol paprefs
I much, much prefer the simplicity of only using what I at least partially understand - the more magic (i.e. a DM), the more shit there is to go wrong that you don't understand.
When I first tried to use Linux about 7 years ago, wireless drivers were a huge problem. Manufacturers didn't provide assistance--not much Linux distros could do about it at the time.
These problems are simply inherent to Linux being a minority platform.
What the OP is complaining about is the same thing Miguel was complaining about, and it's the exact same thing JWZ called out 9 years ago in his CADT rant. It has nothing whatsoever to do with manufacturers failing to provide drivers and everything to do with attention-deficit devs never wanting to knuckle down and do the hard, unglamorous work of long-term maintenance and bug fixing.
Working systems (with known bugs) are thrown out and re-written as new, incompatible systems with even more bugs. Everything breaks every time some idiot decides that they'll rewrite the audio/desktop inter-op/init/logging/whatever subsystem because This Time It'll Be Done Right™. This perpetual treadmill of half-working betas never ends.
It gets old.
I suspect part of the reason people spend so much time with advocacy is that popularity does pay off, long term: popular platforms get more support, more applications, and more other people who can help you with your own problems.
Oh, but it is. Because it has lost a lot of momentum that it had, momentum that was coming from the "we're gonna overtake MS and win over the Desktop" feeling prevalent at the time.
Heck, the guy behind GTK complained recently that he is just one man taking care of the project. The full GUI foundation for Gnome, one that is far from feature-complete at that, and it has only one guy working on it.
It has also lost a lot of people and companies associated with it at the time betting on this possibility [of it winning the desktop]. Most companies nowadays support Linux development only for the server stuff, but it wasn't always so. Miguel moved on, Ximian moved on, Hazel moved on, Rasterman moved on, etc etc. Even Adobe quit developing Flash for it.
And it also lost a lot of "alpha geeks" to OS X, which wasn't even commercially available at the time (1997-2001). This "desktop UNIX" came out of nowhere, and it was the one that did what Linux was supposed (and expected) to do, i.e. eat into MS's market share. Well, even OS X didn't eat that much, but 15% is still a lot.
>He's also an inconsolate "platform fan". Much of his early work was chasing Microsoft products and technologies, of course; now he's an iPhone nut apparently, and that doesn't really surprise me.
You're saying it like it's a bad thing. Gnome (or Linux for that matter) are also platforms.
And to be frank, his early work was not "chasing Microsoft products and technologies". His early work was Midnight Commander, Gnome, Gnumeric, and Evolution.
He indeed liked the component model (not the Windows platform itself) when it was presented to him at a Microsoft visit, though (the story of this visit should be up there somewhere). But what's not to like about it? A good, clean component model of sorts was also needed for FOSS; maybe it still is.
The phase you describe, IIRC, was several years _later_, when he saw the .NET platform and got hooked.
> His early work was Midnight Commander, Gnome, Gnumeric, and Evolution.
Clones of Norton Commander, Windows, Excel, and Outlook. To be fair, Gnome 1 wasn't really a "clone" (though it did mimic more than innovate) and mc was chasing a Symantec product, not a Microsoft one.
But to claim that these were innovative new projects is silly. Miguel's career has been one of seeing something he loves in an existing product and duplicating it in his preferred free software environment. There's no shame there. But it's absolutely the same thinking that drove the Mono project.
Gnumeric was the product of the mood in the early days of Gnome: we need to provide this to have a complete desktop offering and we were talented hackers that could get it done. So I did.
Once I started, I enjoyed every second of writing a spreadsheet. It was a very educational process, and also the one that made me grow fonder of strict compiler warnings and errors. And from this experience, most Gnome software after this was compiled with warn-as-error.
But I am glad you enjoy Gnumeric.
Well, for one, C# and the CLR were not favorite products when he fell in love with them, were they? At the time lots of people thought MS was foolish to try to overtake Java, and it took a couple of years for people to accept and like .NET.
>But to claim that these were innovative new projects is silly.
Sure, they weren't, but it's not like Linux in general has given us much innovative things in the desktop space.
Two off the top of my head: workspaces, app stores.
As for appstores, I assume you're talking about the packaging systems, which are similar only if viewed from 50,000 feet.
The packaging systems were not stores. You couldn't buy packages. They didn't serve as an agent to sell other people's personally submitted packages. They had "apps" but no "store". The applications weren't sandboxed to make it relatively safe for them to sell 3rd-party submitted packages. The applications were expected to be open-source.
The only similarity to the app stores is that they had a repository of software and an automated installation system used to install it.
I didn't mean packaging, I meant app stores, see Lindows and CNR.
I have no experience with desktop app stores, but you wouldn't get Gnome and similar large and complex software from one, would you?
Which makes your comparison pretty unfair.
For instance, cars were not an innovation according to hacker news, because Horses!
Much older than Linux.
You mean package managers. Similar in functionality (getting programs) but not the same thing. Both have nice things going for themselves though.
(And, there are of course Linux-based systems that were built by someone controlling the whole experience, and those work really well. Android and ChromeOS come to mind, though those aren't really desktops per se.)
The other day, someone here was complaining about udev. It has ruined Linux forever, or something. I have a different experience: udev has made my life very easy. I have a rule for each device I care about, and that device is automatically made available at a fixed location when it is plugged in.

For example, I have a rule that detects a microcontroller waiting to be programmed with avrdude in avr109 mode and symlinks the raw device (/dev/ttyUSB<whatever>) to /dev/avr109. I then have a script that waits for inotify to detect the symlink and then calls avrdude to program the microcontroller. A few lines of shell scripting (actually, it's in my Makefile), and I can just plug in a microcontroller, press the programming button on it, and everything just works. No screwing around with figuring out which device address it's assigned to. How do you do that in Windows?
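For reference, the rule I'm describing is tiny. Something like this (the USB IDs below are placeholders; pull the real ones for your device from lsusb):

    # /etc/udev/rules.d/99-avr109.rules
    SUBSYSTEM=="tty", ATTRS{idVendor}=="16c0", ATTRS{idProduct}=="047c", SYMLINK+="avr109"

And the wait-then-program part is a couple of lines of shell (the chip and hex file are illustrative too):

    # block until the symlink shows up, then flash the chip
    while [ ! -e /dev/avr109 ]; do inotifywait -qq -e create /dev; done
    avrdude -c avr109 -P /dev/avr109 -p atmega32u4 -U flash:w:firmware.hex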
Want to log in from a terminal and start X? Well, too bad there's some new infrastructure this month that makes sure you'll only get access to the sound device when you log in from a properly configured gdm. Want to modify the keyboard layout? Whoops, the xmodmap format that had been stable for a couple of decades now changed for the third time in a year. Want to add a tmpfs on /tmp to fstab? Well, too bad. Some implicit and undebuggable circular dependencies in systemd will make the system unbootable.
And the sad part is that the problems this new infrastructure is supposed to solve never actually existed. "Great, now audio doesn't work at all. But if it worked, it would have full support for network transparency". It's easy to understand why the problem exists - creating something new tends to be more rewarding than working on the old stuff. It's much harder to see how to fix this madness.
You can't expect everything to work when you tear out components then fail to configure things properly...
The truly toxic part is that every single transition adds complexity and reduces transparency, making it harder and harder to understand the system. It's just not the breakage alone, or the complexity alone, or even the lack of transparency. It's the combination of all of those.
In the start of this thread jrockway proudly says that all you need is a deep understanding of all the components and the system as a whole. Back in the day this was not actually an unreasonable thing. But it's been getting less and less reasonable for a long time.
(Incidentally the audio example in my original message wasn't even directly related to pulseaudio. It was a few years back, but IIRC it was some daemon tweaking the device permissions, and something else adding users to a special group in the GDM login path but not the console one.)
About the transitions supposedly adding complexity: I don't know about you, but systemd, for example, has greatly simplified configuration and management of services, mountpoints, timers, and all sorts of things.
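To give a feel for it, a complete service definition can be as short as this (a made-up unit, not shipped by any package):

    # /etc/systemd/system/example.service
    [Unit]
    Description=Example daemon
    After=network.target

    [Service]
    ExecStart=/usr/local/bin/exampled
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target

Then 'systemctl enable example.service' and 'systemctl start example.service' replace what used to be a page of init-script boilerplate.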
I use awesome in Fedora 17 on some early-2010 entry-level consumer intel hardware, everything is zippy, easy to configure, and doesn't break on me all the time; I feel bad for you man.
Requiring everyone to be an expert will prevent Linux from being a dominant OS, because most people do not care enough to gain a deep understanding of any OS, so they'll pick one that's easier to learn the easy bits.
Agreed. That is two desktops and four laptops worth in a decade.
"I think the issue is that getting everything working requires a deep understanding of each component and the system as a whole. "
Not so sure. I'm an end user and tend to just shove the CD-ROM in and cross my fingers. Most 'stuff' works. My stuff is simpler than yours however (audio interface, cameras, keyboard (musical) controller)
Last time I used ChromeOS it was still mostly in line with ordinary Linux (though it might have been partially Gentoo-based?), but a rather impractical one, as it's less than simple to run [things that aren't Chrome]. I heard about Xorg getting replaced at some point; I'm unclear on whether that actually happened.
But that's exactly one of the major problems he's complaining about!
Also, part of what killed the Linux desktop was Miguel and his total lack of understanding of the Unix philosophy, which drove him to create abominations like Bonobo. D-Bus is not much better either.
That he fell in love with an iPhone goes to show he didn't fully appreciate the value of open source either.
We were just yesterday commenting with some friends in #cat-v how Evolution is one of the worst pieces of software ever created, and Evolution is supposedly considered by Miguel and co to be the epitome of the Linux desktop.
Yes, there is still Android, but given how modified the usual handset is, you can't buy a free and open Android phone on the market either.
So, where is the problem in using a closed device when "open" just doesn't deliver?
There are many reasons for which a thing can be loved.
I'm not a fanboy, but I did envy my wife's iPhone for years. I like my Android better today, but what choices did you really have when the iPhone came out? It was the only game in town. Being chained to a Blackberry is the only reason I didn't have one.
When I'm not on Linux I run OSX everywhere else (and iOS), because it's Unix-like and because it works so well. I am sure Windows 7 and 8 are great, but I doubt they have gotten rid of C: or \ as the path delimiter or any of the other nonsense that DOS introduced (copied from PIP) back in the dark ages. Why should they? MSFT still runs DOS apps, so they aren't going to change. And choosing between OSX and Linux on a non-work desktop is a no-brainer: Netflix, Photoshop, etc etc etc...
Let me give an example: a few months ago, a new version of Skype was announced for Linux. I was excited, since I used Skype 2 for Linux but then it stopped working for me and I couldn't be bothered to fix it. But if you go to the Skype for Linux download page, you will find a few downloads for specific distros, then some tar files which are, statistically speaking, guaranteed not to work.
Long story short(er), I still don't have Skype working on my desktop, because my distro isn't in the list, I can't get one of the other distro packages to work on my system, and of course none of the statically-linked binaries work.
(I could almost certainly get it to work if I was willing to install 32-bit binary support. But it's 2012. If your app requires me to install 32-bit binary support, I don't need your app that badly.)
Steam for Linux, recently announced by Valve, will run into the same problem. I suspect it will actually be Steam for Ubuntu and Debian, possibly with a version for Fedora, assuming you have the proper libraries installed and are using the right sound daemon and graphical environment.
But if big-name software comes out for Linux, hopefully distros will get in line. Do you want to be that distro which can't run Steam? Doesn't really matter if you think that OSSv4 is superior to ALSA and PulseAudio...if Steam requires the latter, you will toe the freaking line, or disappear into obsolescence.
One of the big thrusts of the Linux desktop wasn't simply dominance itself, but for it to simply not matter what you were using on the desktop. The Linux desktop fought to produce the first cracks in Windows hegemony a decade ago, but the final push came from the rebirth of Apple and the rise of the smartphone.
Today people happily do their normal productive or recreational tasks from a variety of computing environments: Windows, GNOME, Unity, KDE, OS X, iOS, Android, et al. Probably the majority of (Western) web users use at least one non-Windows internet device.
During the golden age of the Linux desktop everything seemed predicated on reaching exactly this point -- that you wouldn't need Windows, and then, by virtue of competing on a leveler playing field, the Linux desktop would ascend.
But the Linux desktops didn't "skate where the puck is going" -- or their attempts at such missed the mark. By the time we reached the post-Windows-dominance era, the Linux desktops weren't positioned to take advantage of the new playing-field dynamics. The rest of the industry isn't even all that concerned with the desktop wars anymore. It stopped mattering very much -- and ironically, that came back around to bite the very projects that first got the ball rolling.
The author seems to forget that some people actually enjoy configuring and hacking their systems in detail. There are also people who hate using the mouse and want to do everything with the console and keyboard.
Having a bigger market share means that hardware/software vendors are more likely to consider supporting Linux to at least some degree.
For example I like to listen to music while programming, so it's nice that Spotify is available on my dev machine.
This is completely missing the point - a statically compiled end-user binary should be compatible across all distributions of Linux, using the same version of the kernel or any newer version.
The only caveats to that are (a) hardware and (b) poorly-packaged software.
(A) is the fault of hardware manufacturers, and is less and less of an issue these days anyway; driver issues are becoming increasingly rare.
(B) is easy to solve for any open-source software, as it is the responsibility of the community for that distribution to provide the appropriate packaging. They prefer to do it themselves. And they're good at it - it gets done!
If you want to ship a closed-source binary on Linux, just make sure you don't dynamically link it against any libraries that you don't also ship with the binary. Problem solved.
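The usual trick is a small launcher script next to the real binary that points the loader at the bundled libraries. A minimal sketch, with the app name and directory layout purely illustrative:

    #!/bin/sh
    # resolve the install directory, then prefer the libs we ship with the app
    HERE="$(dirname "$(readlink -f "$0")")"
    export LD_LIBRARY_PATH="$HERE/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
    exec "$HERE/bin/myapp" "$@"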
Honestly, I can't remember one single instance ever where I have run into end-user software that will run on one distribution of Linux and not another, as long as that principle was followed.
Consider D-Bus: if you statically link but the system changes the format or location of D-Bus configuration files, all of a sudden your app no longer works.
So in theory, yes, this could solve some of the problems. But it requires a massive effort to make the Linux desktop libraries static-linking friendly, which they are not.
Why not just ship a chroot jail to run it in, in case some of those statically linked system libraries read config files which might be under a different path or in a different format?
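As a rough sketch (paths made up, and a GUI app would need more bind mounts than this, e.g. the X socket):

    # unpack the vendor's bundled root filesystem, then run the app inside it
    sudo mount --bind /tmp/.X11-unix /opt/myapp-root/tmp/.X11-unix
    sudo chroot /opt/myapp-root /usr/bin/myapp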
A lot of applications break on newer versions of Mac OS X. That's why there are websites like http://roaringapps.com/apps:table
Also, there are a lot of "transitions" that Apple loves doing: PowerPC -> Intel. Java -> Objective-C. Carbon -> Cocoa. 32-bit -> 64-bit. Access everything -> Sandbox.
See also Cocoa docs: "method X introduced in 10.5. Deprecated in 10.6".
I have a few devices that don't work in 10.8.
Basically, what I'm saying is that OS X is a bad example for backward compatibility. Windows is much better at this. Open source software is much better at this.
Now take the mobile world, for example: Linux on mobile had been around for a decade, but it never really took off until a huge company like Google decided to throw its billions of dollars and its great ingenuity at the task. Getting an OS to be popular is just incredibly difficult, and it needs way more than just good driver support and/or good software. It needs marketing, talking to manufacturers, dedicated and well-paid devs, designers, UI and UX professionals, sales, R&D, and so on and so forth.
Focusing on the technicality of drivers and APIs is typical of us devs, but it has nothing to do with why Linux didn't take off on the desktop. Sure, Linux failed because it couldn't do some or any of that well, but why couldn't it? Because it didn't have a huge and focused company pushing for it. How many popular desktop OSes are there? Only 2; I think that's enough to show that it's incredibly hard to get into that market and that only a huge company can make it. Also, let's not forget that Windows was good enough and there was not much Linux could do to attract users. In fact this is still true, and it's probably why even OS X is still at 5%: Windows is good enough and it's the de facto standard used by 90%+. Having the best UI and UX in the world like OS X doesn't help that much either.
No, Red Hat bet the house on server Linux, never on the desktop. Canonical did bet the house on desktop Linux, but it's a tiny company that has yet to turn a profit (or a significant one).
Some people will probably tell you all kind of reasons why this is bad, but it works.
When I go home, I'll be using my personal laptop running linux. My wife and kids run a netbook with a linux desktop.
The linux desktop may be dead to Miguel, but it works just fine for me, a lot of other people in my life, and a lot of people in the world.
I never consider OS X despite some fanciness; it is limited in choice of hardware, it is totally dead on the server side, and its Unix-ness was/is crappy, despite being developed by the most valuable cathedral in the world. Its support for open source dev tools is miserable, despite some recent improvements; its game support is miserable (compared to Windows); and it gets worse on openness day by day. A crowd of ordinary users waiting to be sold trivial App Store apps may be an appeal to some developers, but to the DIY/Linux desktop user, OS X looks like a toy Casio personal-organizer OS from the '80s.
What killed the Linux desktop? Drivers. Mostly graphics drivers, but some others as well. Who cares if the UI isn't ideal if the damn thing can't sleep and wake up properly, or if it spazzes out every time I plug in an external monitor.
Stupid jokes aside, I always had a feeling that the Linux desktop was actually gaining users in the last couple of years, especially since Ubuntu came along and truth be told it's still the best operating system I've been using but that question is largely open to taste anyway.
For the last couple o' weeks I had to use Windows 7 at work and oh my god, what a pain in the bottom it is! I've had a couple of looks at OSX and never liked it either. Then again my linux desktop isn't exactly standard either - but that's what I like about it: customizability. I'm the kind of guy who likes to tinker with the system until it's just about right for me. The last time I set up my laptop it took me 3 or 4 days to finish but it absolutely paid off. The result is something I find aesthetically pleasing and extremely usable.
Linux has a couple of show-off features and facts 'n' figures, like running most of the Top 500 and second-to-none package management, but its real power is that it can very well be all things to all people. It's been said a million times, but Linux gives the user a choice, and I honestly value that a million times more than any nice OSX-esque UX or Windows-esque games support.
The catch is that I'm definitely not the average user who wants things to "just work" (although they rarely do).
That seems a weird way to phrase it. This is not a new thing, and generally drivers have gotten better over the years. Saying that drivers inhibited the growth of Linux makes sense, saying that they killed it -- not so much.
I don't actually see any problems with Linux. It's just not meant to be.
I'm sure you'll blame this on AMD's drivers or shoot out some other random technical reason. While I understand the technical issues, I just don't care; I have far better things to do than care about this shit. It flat out doesn't work and I don't care why.
I spent 5 years using Linux as my only OS, and I am very well aware of the issues it has, specifically in the area of display management. The reason I stopped using it is that I don't care to mess with it any longer. I've spent hours mucking with the internals of Xorg and writing sleep/hibernate scripts in an attempt to duct-tape something together that works. I've spent hours trying different combinations of kernel + Xorg + driver to figure out which one won't randomly peg a CPU core at 100%. I've spent hours researching parts for compatibility, and I've wasted money on parts just to work around issues with Linux.
What I realized and why I have no interest in linux as a usable desktop anymore is that while this time spent may scratch a nerd itch, it is actually a complete waste of my time and detracts from actually doing something productive, like using the computer to produce, the very reason I own it.
So I don't give a shit at all why it doesn't work anymore. If it doesn't work, it doesn't work, I don't need to do massive research to say "it doesn't work". I don't need to be "well-informed" to say it doesn't work, not working is pretty obvious. In fact doing research goes completely against my goal, which is to spend the least possible amount of time thinking about my computer and the most possible time producing.
However, no way in hell anyone will get me to switch to Mac OS. I am simply too enamored with having an environment that I can hack on if it strikes my fancy, as well as an environment that I can customize how I want it. Despite all its flaws, it still does focus follows mouse pretty well, and not having that would drive me batty.
Also, Apple is an 800-pound gorilla that has always been about Being In Control. The Samsung lawsuit wasn't anything new.
I just don't want to be part of that kind of walled garden.
I don't really see how the Linux desktop is dead. I've been running the same OS on this same laptop since 2007. The only upgrade I've added is an SSD and an extra gig of memory. It's still pretty speedy and I've never had any problems.
I use a Macbook Pro with OS X at work because that's just what I was issued by default. I hate it. I hate the over-reliance on the mouse, on gestures, the abundant and tedious animations; I hate the crappy ecosystem of package repositories and how most of the packages are broken or completely mess with the system; I hate never being able to find where any of the configuration files are or where something is installed; I hate the plethora of ways you can start and stop services; the confusing GUI; the masochistic meta-key layout; the awful full-screen support; and the complete lack of customization options.
I've had much better experiences with the Linux desktop for 95% of the things I do.
Now before some OS X fan-person decides to point out how woefully misguided and ignorant I am, my point is that there are different folks out there who want different things from their desktop experience. Apple gets to decide top-down what that experience is all the way down to the hardware. I prefer a little more flexibility. I like being able to swap out my own battery or adding a new memory module when I need one. I like being able to switch from a GUI desktop to a tiled window manager. Some folks don't -- there are Linux distros that hide as much of that as possible. Either way there are plenty of options and I think that's a good thing. Competition breeds innovation and even though I don't particularly like Unity I am glad to see people trying new things.
The Linux desktop isn't dead. It may just smell funny. You may switch to OS X and wonder why anyone could possibly want anything else. I just gave you a bunch of answers.
As well as Linux's presence in the data centre, witness the success of 'embedded' Linux: many TVs, routers, set-top boxes and other bits of sealed-box electronics all run on it. It's broad in its scope because of the large team of divergent interests working on it, and it's able to support those systems because it's been well made as a direct result of that team's philosophy. Is it really so bad that the average Facebooker doesn't want to use it?
It really is very, very hard indeed to be all things to all men and no single system around today can make that claim. Linux has its place in the world of computing, just like Android, Windows, OSX and everything else.
Did you know that X11 is why we have shared libs (the UNIX version of "dll hell")? If not for having to run X11, shared libs really would not have been needed.
There are many window managers. Maybe too many; that's too much choice for a noob. That selection, or the pre-selections Linux distribution people make, does not equate to "the" Linux desktop. It equates to someone else's configuration and choice of applications. It equates to having to fiddle with X11, whether you are just configuring it or developing programs to run in it. And that has always been extremely frustrating for too many people: constant tweaking that never ends. This is like a brick wall to people coming from Windows who might want to try Linux. You are inheriting a system that's been configured to someone else's preferences. (The same is true with Apple, but they have a knack for making things easy.)
I skipped Linux altogether and went from using Windows to using BSD. I've also been a Mac user. And BSD is way better than OSX, or any of the previous Mac OSes, for doing most everyday things: email, internet and secure web (ramdisk). Moreover it's flexible - you can shape it into what you want - without that being an overwhelming task of undoing someone else's settings.
If you want a citation for the shared libs thing I will track it down, but honestly anyone can do it on their own. The historical research will do you good. Educate yourself.
All of the article's criticism of mainstream workstation distributions is accurate, of course. But it's important to note that those represent nowhere near the sum total of the Linux user experience these days.
Android is only "Linux" when it's convenient for Linux advocates, but it's never "the Linux desktop".
This developer culture DEFINES Linux. A fruit is either an apple or an orange. I couldn't have an OS with wonderful package management, developer tools, endless configurability AND a desktop Miguel de Icaza dreams of.
I'm on Linux now (GNU/Linux, maybe lump BSD in there too, I'm using "Linux"). I know plenty of users on Linux. I know plenty of users of Windows and OS X who run virtual Linux Desktop distributions for testing/development/security. I'm sure some of HN are running Linux.
Does Linux have the potential to enter the market as a third core option for desktop usage? Not really. But why does it matter?
The problem with Linux is that there are too many choices. People who like technical choices and options trend toward Linux (needs citation).
John Q. ComputerUser isn't going to use Linux unless his geeky son or nephew installs it for him AND provides support. He can't get support anywhere else, because there are too many possible configurations for third-party support to be financially viable.
If/When something gets confusing or broken on Windows/OS X, you call JoeBob's SuperDuperPuter and say it's broken. JoeBob asks, "What Windows version?" While he might need to poke and pry a bit to get the user to tell him he's running Millennium Edition, once he gets that data, it's a pretty straightforward troubleshooting effort and fix.
If you call some mythical Computer Service group that actually supports Linux, and say your machine is broken, they would need to know a LOT more about your system just to figure what they need to do to start.
Distribution? Parent Distribution? Shell? Window Manager? Hardware? ...
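To make that concrete, here is a hypothetical first-contact triage such a support tech might walk a caller through. This is only a sketch; the exact commands and variables (lsb_release, XDG_CURRENT_DESKTOP) vary by distro and era, which is rather the point:

    lsb_release -a 2>/dev/null || cat /etc/*release   # which distribution?
    uname -r                                          # which kernel?
    echo "$SHELL"                                     # which login shell?
    echo "$XDG_CURRENT_DESKTOP"                       # which desktop environment (where set)?
    lspci | grep -i vga                               # which graphics hardware?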
I find generic computer service companies to be extremely expensive. To be able to provide even basic service for Linux in general, your techs need to be familiar with far more distributions and their package managers (emerge, apt, yum, zypper, pacman), and with more core applications. Each service effort inherently takes longer. These factors pile up and everything becomes necessarily more expensive. It's downright impractical to support Linux generically. The support costs for one or two issues on Linux would far outweigh the cost of an upfront OS license and cheaper support for the end user.
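As a rough sketch of that divergence, here is the same request, "install Firefox", under each of the package managers named above. Package names are illustrative and vary by distro:

    sudo apt-get install firefox        # Debian/Ubuntu
    sudo yum install firefox            # Red Hat/Fedora (of this era)
    sudo zypper install MozillaFirefox  # openSUSE
    sudo pacman -S firefox              # Arch
    sudo emerge www-client/firefox      # Gentoo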
Linux has (and will likely continue to have) a comfortable hold on the technically-capable DIY market. It may not be on track to step beyond that market in the desktop arena - but that certainly doesn't indicate it's time for a toe tag.
Most of the time all that happens is virus scan -> backup -> reinstall.
Having been a small-scale Mac developer for many years, that really made me chuckle. Not since OS X 10.2 has Apple released a major upgrade that didn't break my apps and leave me scrambling to push out an update fixing the things Apple broke. Apple has heard of deprecation, but they don't seem to really grok the concept.
If I had been developing for Linux, I could have simply tested on pre-release versions of the distros I wanted to support and would have been ready when the new versions were released. On OS X I would have had to pay a prohibitive fee for that privilege.
In any case, this article made me happy. You see, for so many years, I used a Mac, and everybody said "Apple is on its last legs; the Mac will be dead in a few years". Apple had to scramble to compete, and that drove them to provide such a good product. But I knew that situation might not last forever, and I was right. After seeing the turn that Apple had taken over the last few years, I switched to an Ubuntu laptop six months ago.
It's refreshing, once again, to be using an OS that people are calling "dead".
ADDED: jrockaway's comment, added while I was writing this, hits it just right: "I think the issue is that getting everything working requires a deep understanding of each component and the system as a whole." Which is what makes it so frustrating, even to very intelligent people who have other interests than computers in and of themselves.
I've been using Linux for the last decade and every year it gets better, more polished, more integrated, featuring a better design; I hear more & more people talking about it and using it. Linux is more alive than ever on the desktop!
Depending on your needs, Linux can make an exceptional desktop. Yes, true, it is not for _everyone_, but then again neither are Windows or Mac OS X.
Personally, I've been primarily a Mac user since the Mississippian superperiod, but I used an X-11 Windows(™) environment (on top of FreeBSD) for years at work. I don't miss it, even one iota, but I know plenty of smart people who prefer that sort of thing. De gustibus non disputandum est (there's no disputing taste), and all that.
Additionally, OSX is no Linux replacement. The shell may be bash on both, but the userland is BSD rather than GNU: beyond basics like cd, rm, and ls, many commands take completely different flags.
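A couple of concrete examples of the difference, assuming GNU sed/coreutils on Linux versus the BSD userland that OS X ships:

    sed -i 's/foo/bar/' notes.txt      # GNU sed: -i's backup suffix is optional
    sed -i '' 's/foo/bar/' notes.txt   # BSD sed: -i requires a suffix, even an empty one
    ls --color=auto                    # GNU ls: colorized output
    ls -G                              # BSD ls: same effect, different flag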
Linux desktop is an OS for people who are willing to invest time to get an utterly fantastic, fascinating experience. It doesn't suit modern instant-gratification culture, no, and yes, it takes experience and expertise to get the most out of it, but once you get there, it's bloody amazing.
I'd recommend treating it as a small hobby as well as a tool, so you can spend time learning, figuring things out and broadening your knowledge.
Disclaimer: Using Arch Linux with the i3 window manager. No, it's not for everyone, but I absolutely love it.
I also have a Mountain Lion MBP, but AFAICT, I will be leaving the Apple cart after it dies and I'll be 100% Linux.
I need it for work, and I wouldn't have it any other way.
Linux has been for those who like to get their hands dirty, and it does that job quite well. Canonical came a bit late to the party and wasn't large enough to matter. RHEL just went after the servers. For a fair comparison, Linux would have needed a big player backing it strongly on desktops and laptops 10-15 years ago (like Google is doing now with Android). HP and IBM made their half-assed attempts, but they were never really behind it completely.
I have a Mac, and use it for some things, at times. It's nice, for sure, but I love the openness of Linux, even though, of course, there can be many very painful hardware issues (video, sound, etc), all of which I have experienced at one time or another.
I am wondering - I hear Google is working on an "Android desktop". Would that perhaps change things regarding the "Linux desktop" a bit?
An Android desktop will be a proprietary desktop built on top of an open source kernel.
It might be terrific, but it won't fulfil the dream of a free software OS and desktop.
Now if I need to fire up Linux for a project (usually for a microcontroller or similar hardware that needs C), a virtual machine or appliance that I can launch from Windows 7 does the job. This is also how I keep Windows 8 contained, safely in a virtualized box that I don't have to deal with unless I need to... ;)
By the way, in my opinion only a small fraction of people buy Macs because of OS X; it's the hardware. The design and usability of Ubuntu are a lot better than OS X's at the moment.
As I wrote on my blog recently:
"In the [past three years], Linux has grown — albeit slowly — in desktop usage. After nearly 2 years of no growth (2008-2010, lingering around 1% of market), in 2011 Linux saw a significant uptick in desktop adoption (+64% from May 2011 to January 2012). However, Linux’s desktop share still about 1/5 of the share of Apple OS X and 1/50 the share of Microsoft Windows. This despite the fact that Linux continues to dominate Microsoft in the server market."
It may be in third place in a desktop market with primarily three OSes, but usage has never been higher.
As I discussed in this article, most of the original reasons that stopped Windows / Mac users from using Linux years ago are no longer valid. However, the irony is that it's easier than ever to get by with a Free Software desktop, but harder than ever to avoid proprietary software and lock-in, thanks to the rise of SaaS and the personal data cloud.
Modern SaaS applications accessible through the web browsers using open web standards are the modern equivalent of an open source Perl script wrapping calls to a closed-source, statically-compiled binary.
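A caricature of that pattern in a few lines; the vendor path and binary here are made up for illustration:

    #!/bin/sh
    # wrapper.sh: "open source" in only the most superficial sense. You can
    # read and modify this script all you like, but every interesting
    # decision is made inside the closed, statically-compiled binary below.
    exec /opt/vendor/bin/do-everything "$@"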
You can read more about my thoughts on this in "Cloud GNU: where are you?" http://www.pixelmonkey.org/2012/08/18/cloud-gnu
The whole talk is very much worth watching by itself: http://youtu.be/MShbP3OpASA?t=23m45s
I think the Linux desktop simply has more options for experienced users. I see no way I could be more productive with a GUI designed to cater to lusers.
It's getting really irritating when someone who's jumped ship to OSX declares it "dead" because they have a shiny iDevice and an expensive laptop.
This is compounded by most distributions lacking a centralized vision of how everything fits together. They are merely collections of individual parts, rather than parts designed to work well together, and they lack polish as a result. While the lack of centralized vision was fine for SunOS circa 1992, it simply doesn't cut the mustard in 2012.
Ubuntu seems to be trying to push such a centralized vision with Unity, but I fear they lack the clinical editorial willpower to make the hard decisions required to see it through to its ultimate conclusion.
But the GNU/Linux project had a very different objective: fighting for freedom. If freedom is still the driving force, then we should encourage the enthusiasts and get back to work on improving Linux, as has been done for years. By doing so, Linux has already reached excellence in some fields.
If you're just competing on features, you'll be missing some great benefits and enjoyment. And to be honest, in terms of features OSX isn't that good either, since Windows is still what the majority uses, for one reason or another.
But anyway, a more interesting question could be: what does it take to bring an ex-Linux user, and now happy OSX user, back to Linux?
I used Windows for 3 years, then Linux for 2 years. During that time I did a lot of installations (mostly Ubuntu and Debian) on a lot of different devices. While fighting with drivers, minor display problems, and spoiled Windows users, I lost my faith in Linux as a desktop OS and switched to OSX.
I can only speak for myself, but these few points would bring me back to Linux in no time.
Presenting Distribution "Utopia"
1. No X11-based display stack; it is replaced with something conceptually simpler (like Cocoa).
2. 100% accurate (multi-)monitor detection (probably connected to point 1).
3. An audio setup that is not much worse than OSX's.
4. Throw out Gnome and everything based on GLib. It's 2012; there are alternatives to faking OO with C. Qt isn't allowed either.
5. Throw out APT. No more dependency management for a desktop OS, please. Then kill Perl as a requirement for running an OS.
Ahhhhh, I feel better now :-).
This is the opposite of what Miguel demanded; he cares about backward compatibility.
When I think about it, "Utopia" would be similar to Android: no fear of throwing old stuff out.
Android as a foundation for a new desktop linux?
Unsanity ShapeShifter hasn't worked since OS X 10.4,
and I know about
http://magnifique.en.softonic.com/mac - 10.5 only
http://www.marsthemes.com/crystalclear/ - 10.7 support claimed, but it's not very radical. I'd love Xfce's window-look controls or something like Stardock WindowBlinds.
I know Apple don't want anybody to do this. I know they will deliberately introduce changes that break hacks. But as I said, how can it be more effort than Linux?
2. To deter OSx86 hacks, DSMOS.kext uses crypto to keep the system from running essential UI elements like Finder, SystemUIServer, etc. Can't we build our own versions of those parts?
3. Is this true?:
Linux desktop - dying, dead
Windows 8 - trying so hard to copy OSX/iPad/Springboard/Launchpad that everybody is gonna hate its TWO UIs! (dying?)
Mac - winning, won (by default?)
I had run a Linux desktop (a Debian build, mostly with KDE) for a while and kept getting hammered by random stuff breaking for random, and often poorly considered, reasons. I gave up and went back to running a Windows desktop with an X server to pull up windows on my Linux box.
Then I went to work for Google, and they did a really good job of running Ubuntu as an engineering desktop (calling their distro Goobuntu, of course), and I thought, "Wow, this has come quite a ways; perhaps Linux has matured to the point where there is at least one way to run it reliably." And so I installed Ubuntu on my desktop and tried that for a while.
For "using" it, it was for the most part ok if once every few days I did an apt-get update/upgrade cycle. For developing it was a real challenge. Pull in the latest gstreamer? Blam blam blam things fall over dead, update their packages (sometimes pulling git repos and rebuilding from source) to get back working, and now apt-get update/upgrade falls over the next time because you've got a package conflict. It is enough to drive you insane.
I have Windows 7 on the other partition mainly to play games.
There was a minor issue with Ubuntu trying to melt the CPU in my laptop the other day, but it's not so bad since I upgraded, and I found this powertop thing that also helps.
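If anyone else hits that, powertop is worth a look. A quick sketch of typical usage, with flags as of powertop 2.x:

    sudo powertop                      # interactive view of the top power consumers
    sudo powertop --html=report.html   # write a one-shot HTML report
    sudo powertop --auto-tune          # apply every power-saving tunable it suggests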
Now, with all of the above said, I do wish there were a better ecosystem for app development. I mean something like Xcode 3, not 4. Yes, we have Qt; yes, we have Glade; but building an app with an interface designer, bindings, and MVC concepts just helps a lot.
You can do most of it with Vala, granted; it's just more poorly documented and not as "round", and there are no standard concepts to follow, etc. And yes, I do like my Linux customizability, but we have things like the CERT best practices for secure C coding. Why can't we get something like that for Linux GUI programming?
ps. gnome3 can go right where it came from
I switched to OSX for exactly the reasons the author mentioned. The fact that I have an awesome UI + ability to use the shell all day is a huge win for me.
However, people cannot simply switch to OSX. The machine that ran Windows XP and Linux WILL NOT run OSX. So no, you didn't switch to OSX; you purchased new hardware that is strictly controlled (motherboard, video card, etc.).
Then, in an incredible blunder, de Icaza said, "Many hackers moved to OSX... working audio, working video drivers, codecs for watching movies..."
Uhhhh, yeah, if Linux gave up on supporting scores of hardware platforms and hundreds or thousands of hardware components, OF COURSE it would have working audio, video drivers, and codecs.
Needless to say, the sound didn't work. And the wireless didn't work. When I clicked "suspend", it said it was out of swap space. When I closed the lid, it crashed.
And since most things are web-based now, a vanilla KDE setup + Icon-Only Task Manager is awesome enough for me.
That aside, what we have here is a thread apparently devoted to shitting on the work of people who built something for fun and gave it away for free.
Good job folks!