However, lately I am shocked by the "advances" of the Linux desktop: most of them are crap... And that is not just me, but also the feedback from the users I see complaining... I know I will be downvoted for this post, but I must share my experience.
- PulseAudio was half-baked, pushed down our throats by Ubuntu/Fedora, and hated by many users. It has a very strong NIH syndrome, bringing few new features that could not have been done better using the old architecture, with a maintainer team refusing to do releases for long periods of time or favoring some applications over others (which is totally unacceptable), not to mention being not thread-safe and CPU-hungry in many reproducible situations...
- PolicyKit is complex, spawns a large number of processes, is almost never correctly initialized (only gdm3 seems to be able to do it), and breaks many setups, notably NetworkManager... I now have to use the command line on KDE to connect to Wi-Fi... And you cannot install Gnome3 or NM without it anymore...
- KDE 4.x was not usable before 4.3 (I am actually OK with this), but even on 4.6 I have to deactivate the semantic desktop and all of Strigi to stop them from eating a lot of my CPU. NetworkManager still does not work, and I get weird KWin crashes with the nVidia proprietary driver.
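(For anyone who wants to do the same: in my experience the cleanest way is the Nepomuk server config rather than System Settings. The path and key names below are what I recall from 4.6 - treat them as an assumption and check your own install.)

```ini
# ~/.kde/share/config/nepomukserverrc  (may be ~/.kde4 on some distros)

[Basic Settings]
# Disables the Nepomuk semantic desktop entirely
Start Nepomuk=false

[Service-nepomukstrigiservice]
# Keeps the Strigi file indexer from autostarting even if Nepomuk runs
autostart=false
```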
- Less important and less annoying, PackageKit is also a very complex thing, requiring the maintainers of most distros to patch a lot of code; it answers very little real need, yet quite some work has been pushed into it...
- Unity and Gnome3 have huge usability regressions so far, but I will not dwell on that until the next versions are out (KDE 4.0 and 4.1 were no better). But they are also very broken. In both of them, the WM does not correctly support application fullscreen, mixing X11 and OpenGL, or, of course, Xv. Accessibility has been forgotten in Unity. On top of that, Unity crashes a lot and can trigger infinite loops; my family was quite unhappy when they were upgraded.
So yes, when people ask my opinion of systemd or Wayland, I am not optimistic.
However, I have absolutely no problem with printing :)
1. Glorious design
2. Shitty first pass, released anyhow and made a standard
3. Laboriously fixing it over a period of years until it converges on "slightly-but-only-slightly less sucky than what it replaced"
4. Goto 1
has become the standard operating procedure of the Linux desktop world.
I am not convinced that the Gnome and KDE projects are good organizational strategies. Replacing a lumbering behemoth accountable to its many customers with lumbering behemoths accountable to essentially nobody is not an improvement, no matter how "open" the latter may be.
(Oh, and I was happy only because I'm a professional programmer and I still prefer the command line over any "semantic desktop". I freely acknowledge this is not the common case.)
My favourite anecdote about Linux's degradation is this:
Earlier, applications like Nautilus, cdburner, etc. would accurately predict and show the time remaining for operations like a copy to complete. Moreover, copying stuff via the GUI was not significantly slower than the command line.
With the latest versions of KDE (and even Gnome), we've gone back to Windows 98 levels of inconsistency (watching the KDE time-estimate bar during a file copy is just painful).
If you want a distro which doesn't take 1 GB of RAM to run a file manager, the latest one you can try is Ubuntu 7.04 - it runs much faster on my 6-year-old laptop (with 200 MB of RAM) than my modern desktop (with 1 GB of RAM).
Patrick's emphasis on pragmatism & performance means that he simply wouldn't have incorporated stuff that didn't work if he could avoid it.
While the issues mentioned in the article are valid, it's still so much easier for me to maintain all the family (and several friends') PCs and laptops now that I've migrated them from Windows to Kubuntu/Xubuntu. The desktops just work and don't break, which is not something you can say about the Windows tamagotchi thing - you need to care for it constantly, or keep it restricted.
Also, 2-4 cores are standard these days; 8-24 cores would be high end.
Having an opinion about the miserable state of the Linux desktop has nothing to do with open-source advocacy. The success of many open-source projects including Linux itself (on everything but the desktop) is proof that the model works, and can produce quality software, often better than the proprietary alternatives.
I too am an open-source advocate, and I've met insult and disdain when I rant about the desktop "holy cow". There's just something there that makes people blind to the issues, and it's the same thing that makes the issues pile up and not get fixed. It just doesn't happen very often now because I seldom give enough of a crap about this long-lost battle to even enter the discussion.
And a big part of the difference is the environment:
On the server you have a relatively homogeneous hardware platform, many deeply technical users who can and will fix things that are broken, and good connectivity of related environments so that useful innovations spread quickly and the people who made those innovations possible benefit from their spread.
On the desktop you have a much less uniform hardware environment, relatively sparse numbers of technically able users, multiple obstacles to connectivity, and almost no individual recognition for people who make important improvements.
And the problems that are being solved are different, desktop software usually has to put up with a lot of bureaucracy and rules; writing desktop applications requires that developers be aware of the conventions of the system they are operating in and play well with a host of other software for something as simple as drawing a dialog box for the user and capturing input from it. On the server it's relatively easy to partition processes from each other so that they don't interfere with each other; whereas on the desktop you have to support every conceivable action that someone could possibly take.
That said, Linux "winning the desktop" isn't necessarily a worthwhile goal. Don't get me wrong; having Linux desktops available is a good thing all around, it keeps Apple and Microsoft honest. And it's unfortunate that intellectual property concerns have hobbled the development of multimedia in open source to the extent that they have.
Do we really need more WIMP style desktop operating systems that are attempting to be familiar to both Mac and Windows users?
Shouldn't Open Source Desktops be a hotbed of innovation? A carnival of new ideas, metaphors, experimental interaction designs?
But if you look historically at most major open source projects, they aren't innovative. IMHO, most open source projects seem to stem from a desire to have a free (in both senses of the word) version of xyz proprietary/closed source software.
So we end up with lots and lots and lots of copies/equivalents to closed source software, only with a few esoteric additions for whatever itch the author needed to scratch and half the feature set gone.
Sometimes, the OSS version gets better over time, and even supplants the closed source version. But just as often, once the author has scratched that itch, development stalls and very little innovation happens. Or alternately, we'll see dozens of entirely different reinventions of the wheel, where somebody didn't like the particular way the OSS version worked and just rewrote the whole thing again. I mean really, how many half-baked personal finance software tools do we need?
In all of this, what doesn't tend to happen is novel innovation in desktop software. Open source games are rehashes of closed source favorites (I mean really, can anybody point me to a completely novel OSS game?). Desktop software tends to be free (as in beer) copies of expensive proprietary software (GIS tools, image editors, 3D modelers, laboratory tools) or non-user-friendly academic research projects released into the wild. Taking gaming as an example, almost all of the innovation is happening in the closed source, indie gaming scene.
Basically, you can kit yourself out with less polished versions of hundreds of thousands of dollars' worth of proprietary software for free (and you can play with the code a bit), but that's about it.
OSS is sadly trapped in a cycle of non-innovation and has been for years.
iTunes and WMP are bloated crap that force you to sit through DVD warnings. WMP has gotten a lot more stable since Windows 7, but that's actually a case where it has taken Microsoft years to come up with anything approaching the usability of VLC. (And their fealty to the MPAA prevents them from taking the final user-friendly step and allowing us to watch what we want to watch on the DVDs we've purchased.)
Chrome is widely regarded as the front-runner among browsers. And to be clear, Google didn't make Chrome because they wanted a free (in both senses of the word) browser. They built it because they wanted a browser that doesn't suck. (Though I guess you could make the argument that KHTML was just a me-too product.)
I don't think Inkscape has any competitors, open or closed. It's an intuitive, full-featured vector graphics editor. Adobe Illustrator, by contrast, is certainly superior, but it has a learning curve that makes it unusable for people who aren't graphic designers. Inkscape is extremely innovative.
Other open source products I can think of with fantastic UX are Transmission (the BitTorrent client) and Colloquy.
I'm not sure that is a victory of OSS as much as it is that they have more freedom to play fast and loose with patents than a commercial/closed source developer would have if directly selling their software (particularly in the US market).
Most software isn't innovative, and open source samples from the same intellectual gene pool.
EDIT: This is more like what I used to look for back in the day: http://www.osnews.com/topics/1 There's a list of available topics at http://www.osnews.com/topics/
Only if you don't want them to be used. Which is totally okay, but it'll keep most people (myself included) far, far away.
Last time the Linux desktop felt innovative was the release of enlightenment 0.16. I had 9x virtual desktops and a completely skinnable UI with a reasonably robust scripting language running on some POS Dell laptop, and it was speedy.
You are right, and my comment wasn't very clear on this part.
It was just to avoid the comments saying that I am an open-source hater.
The true beauty of Linux is the way that the OS improves over time. Problems are ironed out, and legacy hardware support is often ongoing. I think this is unique to Linux.
Decisions about where to go next re: desktop paradigms are tricky... but I think the main problem is a lack of strong and decisive leadership, largely due to the problems inherent in an open development hierarchy. A paradigm shift often requires a leap of faith. Without that, any change becomes a patchwork or amalgamation of multiple people's ideas - which in my mind isn't optimal.
I reckon Mark Shuttleworth is trying to create a paradigm shift with Ubuntu... and I applaud him. There's so much life left in Linux as a desktop platform. To state otherwise is shortsighted, and ultimately damaging to an incredibly vibrant and worthy platform.
I'm sorry, but the statement that Linux improves over time, and that this is unique to Linux, really has no basis in reality. Both Windows and MacOS improve over time; we could argue about which of the three does the best job, but I don't really think Linux would win that comparison either.
I don't have any experience with other OSes than those 3 mentioned, so somebody else can chip in.
I think an open source model encourages iterative improvement which isn't tied to commercial product releases.
This was the basis for my comment.
Your original comment was a little bit ambiguous: you were primarily thinking of legacy hardware support, as in "legacy hardware support is often ongoing". I read the first part of the sentence, "Problems are ironed out", as if it were only valid for Linux, which I don't think it is.
Now try running Linux 3.0rc on, let's say, an aging SPARC 5 workstation. Just Works.
But hey, I have used Linux for the last two years for all my desktop computing (and longer for servers). I love it and find it superior to Windows. Except my Bluetooth doesn't work - and it did with Windows.
C'est la vie, I guess...
Believe me, they are not. They might be history for you, but not for everyone.
Many non-standard setups (using Jack, using pass-through, using bluetooth headsets) are still broken. Applications using Phonon have very different behaviors with or without PA.
KMix and pavucontrol still behave very differently from each other...
There are also the hurdles to jump through of signing up for a bug tracker account, searching for existing similar bugs, watching my report get ignored because it's not an "average user" use case, etc. For someone like me who's experienced with ALSA, written a sound card driver, and had things "just work" for a very long time, there's greater incentive for me to just rip out PulseAudio and get back to work.
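(For the record, the rip-it-out part is mostly a client.conf tweak - otherwise applications keep respawning the daemon behind your back. The settings below are from memory of PulseAudio's client.conf options; verify them against your version.)

```conf
# ~/.pulse/client.conf
# Don't let libpulse clients respawn the daemon after it's killed
autospawn = no
# Belt and suspenders: point the client library at a no-op binary
daemon-binary = /bin/true
```

After that, `killall pulseaudio` and ALSA applications talk to the hardware directly again.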
Gnome3 and Unity I give the benefit of the doubt, because they are new, but GNOME people are usually the ones pushing these technologies.
And I tried out 'Airfoil' the other day (I think that was the name), which lets me push audio from different apps on my Mac to 'foil speakers' - basically other software running to capture and play back that audio. I can essentially run and manage just the audio portion of iTunes on Mac 1 from software on Mac 2. And it 'just worked' (to reclaim the Ubuntu mantra). Ah - here: http://www.rogueamoeba.com/airfoil/mac/
I hate how it keeps touching the volume level for the system/hardware.
Power management with Linux on laptops has been going downhill for some time, and today I get at least 30% less battery life compared to Win7 (maybe Phoronix is right with its power-regression hullabaloo).
Suspend and resume still sucks (https://bugs.freedesktop.org//show_bug.cgi?id=28739 presumably fixed in 2.6.38 but still gives me problems frequently).
Displayport/HDMI out on my laptop does not send out audio and until I used a custom PPA kernel, multi-monitor out on my laptop used to lockup frequently.
All of these things work seamlessly on Windows 7.
I am willing to pay good money for a seamless Linux desktop (~$100 for something that works seamlessly, with timely updates) - if that's what it takes to get someone to focus on something of value, like TuxOnIce, rather than the placement of close buttons on the left vs. the right.
oh, oh.. jumps around excitedly. I know this one!
sudo apt-get install powertop
P.S. I don't completely trust Phoronix, but I'm beginning to think the power-regression thing is real: http://www.phoronix.com/scan.php?page=article&item=linux...
Apple does it and that's why their hardware mostly works. (But to be fair, I've had plenty of issues with Windows and OS X. Software sucks. Hardware sucks. That's why you get a supercomputer with 15 hours battery life for $500, though. If you wanted it to be perfect, it would cost you $100,000.)
I imagine it's cheaper just to support the hardware you are using for other OSs though?
Dell was shipping Linux-based laptops and presumably chose this route over the "create hardware specifically for Linux" route. Why did Dell fail in this respect?
Say you only support hardware from the last 5 years (drop all pre-2001 compatibility) - 5 major laptop makers, 3-4 hardware/motherboard lines each. Can a software player not raise $20K to buy the hardware and build in full out-of-the-box compatibility?
Diaspora raised $200K for vaporware - can this not be funded?
FWIW, this is somewhat incorrect.
First off, Konqueror is an application, not a browser engine; you're thinking of KHTML.
Now, it's true that KDE is still developing KHTML and KJS (and it has to, considering KDE commits to ABI/API stability within a major release series and the Qt port of WebKit wasn't close to ready by the time KDE 4.0 was released).
(And yes, on my video-playing box, I use PulseAudio, and listen to stuff with Bluetooth headphones. Turn them on, audio switches to Bluetooth without the mplayer session even being affected. Turn them off, back to the TV's speakers. It just works. And I set this up by clicking pretty things in a GUI.)
As it is, the Linux desktop is still working towards "yesterday's experience," since we don't have good enough browsers and there are tons and tons of legacy apps. Unfortunately, supporting yesterday's experience is harder for usability since it's conceptually richer, and most of these concepts are dueling at the bits-and-bytes level, where they're hard to reason about and present in digestible ways. The browser can blackbox a lot of stuff behind the standard APIs, something which the OS does not enjoy.
Another problem, just as large, is that there continues to be lots of new hardware, and correspondingly new drivers to support and specs to reverse engineer (since many manufacturers remain unwilling to assist). This is a problem that can only be solved by maintaining a hardware taskforce roughly proportional to the amount of new hardware released, but it's not particularly interesting or sexy, and it's not a core goal of most of the Linux ecosystem to create premium-grade drivers for all hardware.
Now, I'll admit, I'm a minimalist. I use Arch Linux, run no end-user applications except Emacs and Chrome (and I use w3m as the browser often enough), dwm for window manager, and so on. But really it seems that the enormous complexity of these new solutions makes them near-impossible to support. PulseAudio, PolicyKit --- all of these had working systems that were far, far simpler. Maybe instead of PulseAudio, we needed to look at the ALSA code, clean it up, and add the few features people really cared about (per-application volume control) there. Yes, the semantic desktop might be this great pie-in-the-sky result. But really, too complex. Unity, Gnome3? Same thing. We don't need Wayland, we need to simplify and clean up X.Org (something, admittedly, that /has/ been happening. X.Org is the one of those software packages that /does/ improve monotonically with every release). We don't need the total solution to all our problems. We need to simplify and polish what we have now.
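(For what it's worth, the one feature most people actually wanted from PulseAudio - per-application volume - can be approximated in plain ALSA with the softvol plugin. The device and control names below are made up for illustration:)

```conf
# ~/.asoundrc
# A virtual PCM with its own software volume control; point one
# application at it and you get a per-app slider in alsamixer,
# no sound server required.
pcm.music {
    type softvol
    slave.pcm "default"
    control.name "Music Volume"
    control.card 0
}
```

Start a player with e.g. `mplayer -ao alsa:device=music`, and a "Music Volume" control then shows up in alsamixer.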
And, to be honest, GNOME 2 is still pretty lacking. I know a lot of GNOME team members past and present, and I like and respect most of them for a lot of work put in on a pretty thankless project, but the results aren't there compared to Windows or OS X.
As of December, I'm all Linux. I run Ubuntu at home. I use Linux at work, except when I need to debug OS-specific browsers. I never want to go back; Gnome hits my sweet spot. I want an OS that starts up and becomes invisible until shutdown, and I never want to touch the mouse. Ubuntu's out-of-the-box installation is probably still unusable for people without a tech background, but I love it.
I spend most of my time in Docs, GMail, Google Calendar, lots of Chrome tabs for testing + looking up reference material, Emacs, IntelliJ, and lots of terminals. I almost never need other software, and when I do, it's an apt-get away.
I've had challenges - I've never seen Ubuntu completely work on a fresh install. My wireless failed on my newest desktop install. The sound on the two before that. They never bundle my wireless driver, so I drag my desktop over to my modem. My printer gives baffling error messages. My old webcam arbitrarily rezoomed as it pleased. Gnome 3 overrode the "invert colors" keyboard shortcut, which I use heavily. I needed to steal time on my girlfriend's iMac to run Portal 2. My supercharged Toshiba laptop freezes when running on battery power (may not be Linux's fault).
But these are most of the problems I've faced in the past two years. I lose a few hours of productivity every few months in exchange for massive productivity gains while working.
- Utter inability to consume streaming video, be it Netflix, Hulu or YouTube.
- Complete madness wrestling with the built-in, broken version of Ruby.
- Real difficulty finding any clear, accurate documentation.
- Out-of-date crappy versions of libraries available through apt-get.
- Printing? Ha.
- I can't play Minecraft, which is a Java game.
- GMail freezing on load 3 times out of 5.
Rather than liberating me from the bondage of win32, it has made me feel like a third class citizen of the digital world. The consequence? I am buying a Mac.
Edit: I wouldn't usually complain about a downvote, but this is a list of difficulties I have experienced when using an operating system. What do you object to?
- Netflix, Hulu, YouTube: sounds like you don't have Flash installed, or no 3D acceleration (no video card drivers? What video card are you using?)
- What's broken about the built-in version of Ruby? (I don't use Ubuntu or Ruby, so I'm sincerely curious.)
- Documentation for what? What are you looking for, how can we help?
- Apt-get, Ubuntu has a strict release cycle. To get the latest unstable stuff, you'll probably need to upgrade to the latest beta or alpha.
- Printing, I've had mixed experiences with this depending on the printer. I've given up on paper, can't blame you.
- If you can't play Minecraft, then you may have a weird version of Java installed, or no 3d acceleration (no drivers?).
- GMail freezing, what browser?
When posting complaints, avoid framing it in such a way that it's impossible for anybody to help. If no one can say anything about your experience because there's a lack of information, then people can't relate your experience to their own or debate the applicability of it.
Netflix doesn't work because of no Silverlight.
Hulu and YouTube are unwatchably slow when fullscreened. They work perfectly fine under Windows on an older machine. In addition, having a YouTube chrome tab open slows everything to a halt.
The built in Ruby is an older version, and it conflicts with any new version you install unless you use something magical like RVM. It's very confusing for someone new to the Ruby toolchain.
The installed Java VM doesn't seem to work with Minecraft, and getting it to work involves doing things that remind me of DOS gaming back in the 90s.
Gmail has been freezing in Chrome since updating to the latest version of Ubuntu.
None of these are killers, and the OS is generally fast and awesome. However, I have the choice between spending 10% mental overhead wrestling with my machine and context-switching to deal with bugs, or spending $2k and buying a Mac. Cumulatively, I feel the Mac will be cheaper.
I am 100% certain you can't get Netflix to work on Linux. Also, Flash sucks on Linux. I am tired of Flash on Linux. Asking what video card you're using is a cop-out.
Netflix needs Silverlight's DRM, which Moonlight on Mono doesn't have. On the one hand, it's not really fair to blame Linux for that, but on the other, the end-user experience is that you can't have Netflix on Linux. It sucks, but that's the way it is.
EDIT: Had some issue posting.
I'm sort of curious what version of Ubuntu you installed. I've installed Ubuntu 10.04, the default install, on a number of machines and have had zero problems with streaming video (Hulu or YouTube). I've had issues playing encrypted DVDs, but I think I just need to install a codec or something.
> Out-of-date crappy versions of libraries available through apt-get.
This is an inherent problem with the very concept of a distribution, and doesn't really go away with OS X.
Generally if you want bleeding edge software, you're better off building from source no matter what the distribution, although you might be able to tweak your sources.list to use unstable repositories.
If you generally want the latest release of everything, then go with a rolling release distro. You can build old versions from source or through namespaced legacy packages.
If you generally want year-old packages, then go with an incremental release and build your own version of those things you need the latest release of. I have found this to generally be more effort, but I tend to want the latest compilers and base libraries (not least because I want issues like reduced header promiscuity to cause breakage on my machine first instead of handling the support tickets when it breaks on a user's machine).
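(A middle ground on Debian/Ubuntu is apt pinning: stay on the stable release by default but pull a few packages from a newer repository. The release and package names below are illustrative - substitute your own, and make sure the extra repository is actually in sources.list:)

```conf
# /etc/apt/preferences
# Everything defaults to the stable release...
Package: *
Pin: release a=natty
Pin-Priority: 900

# ...but this one package may come from backports
Package: powertop
Pin: release a=natty-backports
Pin-Priority: 990
```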
- Youtube is horrible for me to watch in Germany, but that's the carrier's problem.
- I'm a ruby user, but not a developer, so I can't tell much about that.
- Documentation depends on what you're searching for. Ubuntu itself provides a great set of documentation and community wikis. The forums are also very helpful, so is askubuntu.com.
- As said: Ubuntu follows a strict release cycle regarding the library versions.
- Printing works perfectly fine for me with any printer I've tried so far. If your printer does not work, complain to your printer's manufacturer.
- Minecraft runs perfectly for me.
- Gmail works perfectly for me.
I think it's probably true that if you're not somewhat 'nix-aware, there are still gotchas here and there -- but I've found a quick google search almost always comes up with some good suggestions, one of which usually fixes up whatever I'm complaining about.
Compared to the downtime I've had with viruses and malware on Windows, the constant need to be vigilant and to teach (or fail to teach) my less technically inclined friends and family how to protect their machine, Ubuntu is clearly the winner.
That's the same reason why I love Ubuntu as my desktop. I'm hardly doing anything outside of the browser, except for coding and writing with Vim.
> I lose a few hours of productivity every few months in exchange for massive productivity gains while working.
Even though upgrades aren't that frequent on other platforms, they do exist. Every two or three releases I take the chance to make a fresh start and it's incredible how fast I can set up the desktop to fit my needs. Ubuntu's/Debian's excellent package management is a huge time saver.
I've been using some flavour of Linux as my primary OS for over three years now. Windows 7 seems like a big improvement over previous versions of Windows, but I have driver issues like you wouldn't believe, whereas Ubuntu works flawlessly on first installation on the same hardware. OSX is nice, but it's just uncomfortable because it offers a lot of what Linux offers, but in a rigid, UI-driven way that you'd expect from Windows.
For technical users, (technical users like me, at least) Linux is still where it's at.
My install experience was pretty flawless. I have moved 2 desktops and 3 laptops over to Ubuntu (10.04, 10.10 and 11.04) and have not had any real problems except for a 3D acceleration issue on a 10-year-old ATI video card. I've been going this way for about 6 months and really find it painful to work in Windows now.
My work desktop is a simple two-monitor setup with xmonad, several urxvt terminals, vim instances and my two browsers of choice (Firefox and Chrome). I love it because there are no distractions: only code is visible, and the browser when I need it. I even use mutt with Exchange (a very perverse setup through DavMail - thank you, corporate policies) so I don't get email notifications that would distract me from programming.
But in defense of Linux, it has come a long way over the last decade, due to a few earth-moving forces:
- The rise of the web and end of the Windows native app lock-in
- The mainstream acceptance of open source and platform independent software engineering
- Canonical's aggressive, financially backed push to make Linux a viable mainstream desktop OS
It's easy to forget how many classic pain points have been more or less cured: building everything from source, recompiling the kernel, font rendering, working with office documents, configuring everything manually by editing text files, the complete absence of tasteful design, no cross-platform apps or games whatsoever, and nobody ever having heard of your OS.
Linux is not there yet. It's still perfectly reasonable to throw your arms up and switch. But at this point, the historical trend suggests that it will get there eventually. It has never really stopped inching forward and we haven't yet seen anything that could stop it.
The Linux ecosystem doesn't burn through capital, so it can't collapse overnight like a commercial platform can. All it needs to sustain it is interest. The interest could dry up, for example if everybody starts caring only about data and not code. But if nobody's interested in it anymore, then it probably doesn't matter if it goes away.
But it does have to adapt to the changes in the technology landscape that are happening right now i.e. "post-PC" devices and "big data". These are the biggest changes Linux has ever seen and, unlike the web, they don't particularly work in its favor.
I just can't see ever moving back to Windows because I couldn't stand to lose the shell. I could see myself moving to MacOS, but I don't understand how a sys admin could stomach doing so if I could not.
"Try _distro_ - everything just works for me!"
"research before you buy!"
The 'it all works for me crowd'. I don't believe most of you most of the time. "Works" is in the eye of the beholder, and most of the time I've found that people who make that claim don't have anywhere near the same work style that I do.
Example: running audio from 4 different apps simultaneously all mixed together is a no-brainer under OSX/Windows, but is something most Linux users don't even dream of, because they've never even been able to try it (whoops, amarok is running, so I can't get AIM sound alerts, etc.) If they've never done it, the "works for me crowd" can't possibly know what 'works' means for them is light years away from what 'works' means to me.
"Research before you buy". Where and how? Show me an up to date list which shows even most of the compatibility aspects of stuff I can go buy in a store against recent distros. Googling around and finding that I can get Ubuntu 5.04 running on hardware from 2004 does me absolutely 0 good. Manufacturers don't always show you the exact chipsets used (and may often change chipset revisions and just ship new windows drivers in the box), and yet even if you do happen to know every single internal spec of hardware before you buy it, no one else will have bought it to determine if it works. Someone's gotta be first, unless the manufacturers would actually start certifying that "distro X" runs on particular configurations (which ain't gonna happen).
The majority of us live in a world where when we want a computer we're going to go to a local store and buy something. If Linux users want their stuff to be more widely adopted, they need to adapt to that reality. I suspect most Linux users don't really care, and are happy being 'cutting edge' and compiling new drivers to test stuff out. Really, it's an exciting and interesting way of life for some people, but one which I and many others eventually outgrow.
Honestly, this has just worked out of the box for several years now (on Ubuntu, Debian and Arch Linux, at least).
The manufacturers have started certifying that Ubuntu runs on particular configurations.
For instance, here is the certification report for the Lenovo X220 that has been mentioned several times in this discussion.
I used it almost exclusively until 2006, when I finally had enough. At the time I moved to Windows and abandoned Linux on the desktop completely, but I've since switched to OS X.
I can relate to the author. I spent years messing around with my systems, which was fun for a while. Every 6 months I'd reinstall my desktop box with the latest version of Red Hat (and then Fedora) and spend the next week or two filing bug reports in Bugzilla and patching some stuff.
But after a few years, what was a learning experience became a permanent wound to your sound engineering sense. It just isn't possible that something with an insane number of bugs in every new release will ever become reliable and usable. Not with so many regressions, never-fixed problems, and unbelievably in-your-face issues.
I've tried to return, to see if any progress was made. About once a year I install the latest Ubuntu or Fedora, but every time it seems the same, or worse in some aspects.
My last test was Ubuntu 11.04 on an old Toshiba laptop from late 2005 that I still have. It used to work apparently fine with just the LCD brightness control not working, and later with the usual application quirks, bugs and crashes.
However, the latest version has "new" drivers for the ATI Radeon X300 graphics card, with non-working 3D acceleration that makes the default WM hang on login; not even "glxgears" works anymore. All attempts to disable hardware acceleration have failed (not only does X no longer ship a configuration file by default, it also segfaults when trying to generate one with "X -configure", and it ignores any option I set manually).
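For anyone stuck in the same spot: since "X -configure" segfaults, you can still hand-write a minimal config to force an unaccelerated driver. This is only a sketch - the choice of the generic "vesa" driver is my assumption, and whether it actually avoids the hang on that particular Radeon is untested:

```
# /etc/X11/xorg.conf - minimal fragment forcing unaccelerated rendering
Section "Device"
    Identifier "Card0"
    Driver     "vesa"    # generic unaccelerated driver; "fbdev" is another fallback
EndSection
```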
Aside from the X problems, stand-by no longer works. The machine just dies never to wake up again.
The problems just go on and on.
It isn't a usable machine for anything other than simple web browsing, and even there it has issues with flash, which stutters playing the simplest of videos, even though it doesn't peg the CPU.
I've used the knowledge I accumulated over these years for my work (where I have a bunch of Linux servers working just fine), but I say that Linux on the desktop won't ever amount to anything, no matter how much that turd is polished.
It's even sadder when you consider that while all this was going on in the Linux desktop, Apple successfully launched a Unix desktop that has been stable and usable for years. So it is possible, and in a short timeframe - which makes Linux on the desktop look like a failed experiment.
You're comparing apples with pears :)
Apple designs an OS for a supremely limited range of hardware. Your comparison isn't fair.
In 1998, with a much more immature Linux and X11, it was in some ways ahead of the competition. I remember looking for a free computer in my university's computer labs: when I had to settle for one of the old, slow 486DX machines, I could run Netscape remotely on one of the Pentiums being used by another student - just fine, fast for me and unnoticeable for the other guy.
Windows NT was a joke back then. It had almost no hardware support. When Windows 2000 came out, there were even graphics cards better supported on Linux.
But from the Windows 2000 lineage we now have Windows 7, even though Microsoft stood still for so many years with Windows XP.
Meanwhile, Linux gained eye candy, when you are lucky enough to get it to work. The desktop evolved, but Linux did not, and the gap is widening.
BTW: today the decent graphics card market is mostly ATI/AMD and Nvidia. OS X supports a large number of models from both vendors (even those that were never shipped by Apple). How does that fit with the notion that Apple has a more limited set of hardware to support than Linux does?
While I agree generally that a lot of boneheaded moves occur on the Linux desktop (like Pidgin->Empathy), I don't think berating hardware support for not being equivalent to Windows is fair. Also, it's usually wrong; I have always had to install drivers every time I have installed Windows in the last five years, but I have only had to install drivers in Linux on a few occasions (MadWifi back before ath9k was around, Broadcom wireless on a friend's Dell laptop, uvcvideo before it was included in the kernel).
Which features of the 'evolved desktop' does Linux currently lack?
EDIT: it seems you've edited your text, and no longer speak of an 'evolved desktop', so perhaps my question isn't relevant anymore.
Reliability and usability. Pretty important features, IMHO.
I have no issues with reliability or usability on my current Linux desktop.
How many more reimplementations of basic functionality on the major desktop environments do we need? It seems like whatever the current window manager is, and however well it works, it'll be thrown out and a new one will take its place, and take a few years to do what the old one did. We'll throw out the audio stack again at some point. X isn't good enough, so we'll throw that out. Since Wayland isn't network transparent, there will be twelve different ways to add it back in and five incompatible ways to run old X applications on top of it. When those aren't sufficient, we'll throw it out again and start over, ad nauseam. And this applies to most things on the system that are "good enough".
Linux on the desktop is stuck in the 90% category, where 90% is implemented (and as we know, this represents 10% of the work). It's always tempting to avoid the unsexy 90% of the work and write from scratch. The end result is throwing increasing amounts of the baby out with the bathwater.
I really feel natty is heading in the right direction.
I want a linux laptop, with good battery life, stable suspend/resume, solid CPU performance, 3d graphics sufficient for light gaming, and a high-quality fit-and-finish. Which one do I buy?
Edit: Installed neverputt on this machine. Works fine with the settings maxed out and the resolution at full screen.
And yeah, it's a complete tank. The only downside I have is that both models seem to have issues with Sandforce SSDs and resuming from sleep, but that's a hardware issue instead of an OS one.
Also, the mobile broadband does not work under Linux, despite the fact that the datasheet for the card says "Linux is fully supported". It will show up as a serial port (well, four of them) after you edit the kernel source (sierra.c) to contain this card's USB vendor/product ids. But without a firmware loader and firmware, you can't actually use the broadband. I feel that this will be fixed soon, though, as the Gobi 2000 is quite popular for Linux and this is the Gobi 3000.
The GPS does work.
If you feel adventurous, burn a copy of Ubuntu on a USB stick and go to a computer store. Last time I did that I promised the salesman I wouldn't harm their display computers and the guy let me boot a lot of them under Linux so I could check what worked and what didn't.
I'll soon replace the Acer with a Vostro v130. It's small, light and has a configuration that comes with Linux preinstalled. I'll reinstall it immediately, but it's good to know it works.
Two months ago, my father and I spent two weeks finding one decent laptop for my sister that was not crippled by a crappy screen, poor battery life, atrocious build quality or weight. And that was for a Windows system. We both have jobs, so if we had billed those hours, we could easily have bought some really expensive gear instead.
Add to that the effort spent researching hardware compatibility and configuration... This is just nuts.
To come back to the point: I want something that just works. I used Linux for several years. Like many others, I ended up with Apple at some point. I simply found that I preferred spending money on something decent over wasting time hunting for a cheap deal.
I recently switched from a Mac and I definitely miss it at times but there is enough in the thinkpad/ubuntu combo to hold me over for now (openbox, good tiling managers, source code that is open to scrutiny etc).
If you do want to make the change, you might as well get used to the disparity now ;)
It is totally fair, because each person has to decide whether they're going to use Linux or MacOS.
Saying it isn't fair is basically saying that MacOS is and always will be a better desktop choice, because Apple picked a solvable problem and the Linux desktop community didn't. I don't know that's true, but that's definitely what "not fair to compare Linux and OS X" means.
That's a big qualification.
I think you're either trolling or are genuinely misinformed. If it's the latter, please accept my apologies.
Both machines are fully supported by Linux and the hackintosh is supported by community kernel extensions. Getting linux up and running for the htpc required a lot of fiddling with config files... adding refresh rates to X manually, custom ALSA configs to enable the audio device to show up correctly, accept the correct sample rates and stream 5.1 properly, adding extra PPAs to get video drivers that support one of the various forms of accelerated video decoding, scripts to shut down gnome when xbmc is running because it interferes with rendering for some unknown reason, messing with launchd because it launched programs that rely on all filesystems being mounted before they were mounted... that's just off the top of my head.
The hackintosh required a boot cd and a package for drivers that weren't natively supported. Once the drivers were installed, it worked. That's it. Everything that took days of tinkering and forum searching on linux just worked.
I have years of experience with linux and this was my first time trying OS X on an unsupported machine. So no, I'm not misinformed and no I'm not trolling, I thought HN was supposed to be civilized :/
I don't see how you can make a categorical statement declaring that as long as drivers are written for OS X, OS X is more reliable than Linux.
Hackintosh != most people's experience of OS X.
Your experience will be particular to the hardware you chose, and the implementations you chose to follow.
In the same sense, I wouldn't declare that Linux is more reliable than 'X' because again, there are far too many variables involved to make a fair comparison.
I think a far more pragmatic approach is to accept that Linux can provide a very robust desktop solution - if the necessary investment in time and research is carried out. Largely, imo OS X is successful because this research has already been carried out on the users' behalf.
(I still have Linux servers and a Windows box as a hybrid gaming box / htpc / NAS but that should be replaced with an Apple TV, Drobo FS, and Xbox 360 or something)
Apple has the huge advantage of controlling their own hardware, which I believe makes all the difference.
It would take years of this attention to detail to catch up to MacOS though.
The laptop has worked well with Ubuntu 7.04 through 8.10 and Fedora 11 through 15. The main problem is that sometimes my keyboard and trackpad do not work after I resume from suspend.
However, every web app that I've deployed (dozens) has always had Linux on the server. Neither I nor the teams I worked with would consider anything else.
However, for the desktop, I gave up on that in '02. Went to Mac, never looked back. Linux server succeeds because it doesn't require any UI pleasantries. Desktop is a completely different story because it does. As much as I support the FOSS movement, I just don't think it has what it takes to make a compelling desktop. Ubuntu and others can copy OS X, but I don't think they'll ever make a comparable, competitive alternative.
What most people probably don't realize is that Apple is a laptop company. No one in the industry can match their industrial design. Set a Brand X laptop and a Mac laptop down in front of anyone who cares about their tools - it's no comparison. For me, software is my livelihood and I care about my tools, and plastic junk is plastic junk.
This stretches out even further though because all those hardware decisions inform the OS decisions. It's not like, "yah, just use this ethernet chip" or "whatever, just get decent battery life, use whatever cells you need". Everything is considered. Every millimeter, every curve, every line. If a custom battery design is needed, then it is and it's made specifically for that device. If they don't have enough room for a typical laptop cpu, they go to Intel and ask them to make them one. No one else in the industry does this.
Ultimately I think these are the critical things. Design 100% matters. I recently worked at one of the most prominent design companies in the world, and I saw it all around me every day - they designed things for a reason, and when you used the prototypes you were like, man, that totally makes sense. As a software developer, I'll never forget that - it reinforced my appreciation for things well thought out, and it made it easy for me to deliver beautiful and well-enjoyed products.
Lenovo, Dell, HP, etc - those companies produce things that are just 'made'. No one at any of those companies says 'make me a machine that looks like a flower' or 'how can we do something completely different?'. Their bottom line is the bottom line, so the cheapest materials go into every machine 'made'. This philosophy permeates the entire organization. So how can the 'greatest' desktop be 'made' for machines like this? It can't.
For servers where design doesn't matter as much, Linux is 'it'. It's fast, stable, gets out of my way and runs all the components I need. On the desktop, though, it's hobbled by hardware people that just don't care. Linux desktop needs someone to care in order for it to be a contender for the hearts and minds of ordinary users. Less is more in this case and a small group of people that can make design decisions for the future is what's needed to make Linux desktop something people would want to use. Something better than Windows.
But, to catch up to OS X? I don't think so. Especially now since it's being influenced by iOS.
Quickly transitioning to a computing appliance company.
But, to catch up to OS X? I don't think so. Especially now since it's being influenced by iOS.
Linux has an opportunity to get the power users and geeks that Apple is seemingly abandoning with the iOS merger. In 4 or 5 years I don't expect Apple to be selling consumer-level hardware that has an accessible command line or filesystem or the ability to run unapproved apps. That would be unacceptable for me, so I'm hoping that desktop Linux will be a viable alternative. I try Ubuntu every few years, and it's not there yet.
As long as individuals continue to develop apps for iOS, I don't see that happening. Maybe they'll be behind an "Expert Mode" toggle.
However, with iOS5 Apple have probably done themselves out of a further Mac sale from me - between the two of them, the iPad and Linux will cover my needs.
I used to enjoy tweaking config to get dual monitors to work etc, but now I'm sick of it.
Lately Ubuntu has started screwing up the panel config every time an update comes. Grrrr!
But I think this is more about me getting old, slow and lazy rather than Linux getting any worse. I mean now, with Ubuntu it really does pretty much work out of the box, even on my laptop (and yes, suspend/resume works).
Also, it's true that as you grow older you value time more. And that's why I use Linux in the first place: I am faster with it at pretty much anything. The only thing that really drives me mad is Skype, but my wife has problems with Skype on her Windows and Mac computers too, so it's not Linux that's to blame.
That's why there's a control panel for this. Or you can type "xrandr --output VGA1 --auto --left-of LVDS1". As long as you don't buy AMD GPUs, it all Just Works :)
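To save retyping that when the output name varies by machine, here's a minimal sketch that pulls the first connected external output out of xrandr's own listing. The output names (VGA1 for external, LVDS1 for the laptop panel) are assumptions - newer drivers report names like HDMI-1 instead:

```shell
# pick_external: read `xrandr` output on stdin and print the first
# connected output that is not the internal panel (assumed to be LVDS1)
pick_external() {
  awk '$2 == "connected" && $1 != "LVDS1" { print $1; exit }'
}

# Typical use (LVDS1 assumed to be the laptop panel):
#   ext=$(xrandr | pick_external)
#   xrandr --output "$ext" --auto --left-of LVDS1
```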
And in 9.04 I had to keep a piece of paper with the commands to reinstall the closed-source NVidia drivers every time the kernel got patched. Apparently I shouldn't have needed to do this, and yet every single update for months would drop me back to the shell after X failed to start.
When I was younger I'd have tried to work out what was wrong. Now I just sigh and am grateful I wrote down the package I needed, and the flag to get it to work from a non-X environment.
But yes, your point is noted.
The irony is that I know it's no use to complain to people still "inside" the linux community about this, because I used to be one of them and I know what I answered to those people that complained: pick better hardware (so much choice), pick better software (even more choice), learn how to use the software (you luser), learn to build your own (with this easy 23 step guide), and finally the conversation-ender "works fine for me" (so your experience is irrelevant).
It was of course all a delusion on my part. As much as I claimed that it worked fine for me, it didn't. Not really, not 100%. So it became a trade-off, which deficiencies were worse? Those of windows, those of the mac, or those of linux? I ended up on the mac, but it's also not 100%. It still sucks just a little bit. All I've ever wanted is for there to be one OS that doesn't suck, and I'll pay top dollar (or euro) to use it, but I've never found it. I always have to pick the least sucky one, and that really sucks.
I bought a new computer past summer and all its hardware worked properly from day zero, ethernet, wifi, 3D acceleration, suspending, hibernating and dual-monitors included. I chose the right hardware. Flash support? The 64-bit flash plugin is not crashing here. I use flashblock to avoid possible security problems, but I use it in Windows too, to avoid annoying ads with sound. The quality of OpenOffice.org or LibreOffice is good enough for desktop users, in my humble opinion.
And the rest of the post is debatable at least. Linux and its software ecosystem are not to blame for poor Skype support, for example.
Desktop Linux is really for people that can/will line up all their ducks in a row (and then sometimes debug those ducks) when upgrading hardware. Even as a technical person I can understand if you don't wish to continue to pour time into that sort of thing.
I saw a reference in a list about a wireless card 'model644b' that worked, so I bought it, then found out that, hell no, you got 'model644br4', and only r1 and r3 really work, as r4 was manufactured in a different facility and they used a different process (BS like this). The box and listing doesn't show the hardware revision number to the public - you have to open it up to find that out. (I made up the model number, but had pretty much that same issue in 2005).
I took an ubuntu boot dvd to several stores wanting to test out compatibility before purchasing a laptop, and was ejected from a couple. TigerDirect staff basically ignored me, whether out of respect for my mad testing skillz or just apathy - I couldn't tell, but I bought a laptop there in 2008 because I think I verified the wireless worked out of the box.
When the answer to 'Linux on the desktop' is 'go research loads of conflicting or hard-to-find info, then order from an online store and piece it together yourself, or pay loads extra for a custom-built laptop' just to get a guarantee - no thanks. That sort of 'freedom' isn't for me.
If vendors provided and honestly advertised Linux support, that would be a much better world. Then compatibility can be checked at the store. It'd be bad, though, if they supported Linux like they do Windows - with a gig of userspace junkware.
While not optimal, virtualization can be a solution in many cases: using a 64-bit Linux server, a 32-bit Linux desktop (with regard to Linux, I don't quite see the benefit of 64-bit at this point) and your host OS of choice is becoming increasingly practical with memory prices getting lower. Better than throwing out Linux (with the many great hacker-friendly features it has) or having to select an older notebook just to run it.
(I don't disagree with much of what you are saying, just wanted to add another angle)
I know that Linux so often makes you put up with compromises, and so often it's at the graphics card hurdle that the annoyance starts, but I think the three things he says he'll miss are what will eventually drive him back.
Firstly, the shell seems irreplaceable. Yes, you could use Cygwin or an equivalent, but nothing fills the void of a shell so beautifully incorporated into the OS - more so than OS X, in my mind, despite them being the same shell. Linux makes no excuses and is proud of its shell.
Transparency: yes, this is important, but something I know I could live without; as he mentions, he doesn't dig into things as often as before, and really its main purpose was fixing driver problems.
Package management - and I would stretch further, to open source and free software in general - will drive him back. I had a stint where I could only use Vista for about 7 days; the reminder of the horrors of "shareware" and 30-day trials had me looking forward to getting back to "apt-get install".
Cinch, SizeUp, and Divvy provide all the tiling things I need for window management and they unobtrusively just enhance my normal environment, it's great. I don't miss xmonad at all anymore. I miss dpkg and apt but homebrew is good enough and I regularly fire up Linux VMs if I just need to, vagrant is awesome for this. (vagrantup.com)
I run NVIDIA graphics and have no problems .. I realise the latest and greatest takes a while to become fully supported on my platform of choice, and often solutions to problems require research and time - but I'd rather accept this compromise than use Windows.
Windows feels like Fisher-Price, Linux feels more like Meccano - in the sense that it presents opportunities rather than fully packaged solutions.
I blame the lockup/windowing issues completely on Unity and I've had more success with the system since I've switched to the classic Ubuntu desktop. My current Linux desktop is running 10.10 and it will stay that way for the foreseeable future (perhaps until 11.10). I have a new sandy bridge desktop coming in next week and I'm undecided as to what I will be installing on that system.
Unity isn't perfect yet, but I'm confident the UI team are heading in a decent direction.
Regarding the upgrade itself, yes - it's as easy as usual (click, click, click, .... done).
Makes as much sense as switching to OS X. The Mac OS is not Linux.
If you're developing for Linux, or using something like GHC where the OS X version is a least-favorite stepchild, or if you need package management as capable as APT or the FreeBSD Ports Collection – then if you aren't running Linux as your desktop OS, you're most likely running it in a VM, regardless of whether your host OS is OS X or Windows. In fact the author explicitly says he's doing as much.
So "somewhat but not quite like Linux" isn't something the author would strictly need to look for in his host OS. Instead he can take other factors into account, such as reliability, performance, available applications, and hardware choice, quality, and cost.
Sure, but it's much closer than Windows. OS X is a Unix and things written for Linux will often compile on OS X without significant hassles. Windows has some optional layers to do this but it's not the same as running a Unix through and through.
The VM route is an appealing one though and one I use myself from OS X. From what I've seen virtualizing Linux under OS X is a more seamless and pleasant experience than on Windows but perhaps things have come along there since I last looked.
But, still no binary packages.
Finally I installed Ubuntu and I've never looked back. Nearly everything just worked, with the exception of plugging in an external monitor. The experience was much smoother than on Windows or Mac. I'm very happy with the Unity desktop.
One thing that has me fed up with proprietary software was that although I had purchased Photoshop CS5 for Mac, it was nearly impossible to activate. I had to call Adobe because I was upgrading from a Windows CS2 to a Mac CS5. Once I had to reinstall everything and the installer asked me to go through the whole process again. With Ubuntu I can download and install software almost effortlessly. It's not worth the hassle to spend lots of money and not even be able to use a piece of software. I'd rather use Gimp, despite some UI weaknesses. The only thing that made me want Photoshop in the first place was better compatibility with other Photoshop users' files. For myself, Gimp is more than adequate.
Since going to Ubuntu, I never seem to have trouble with applications, Internet connectivity, or graphics. I have two computers - a desktop circa 2006 with an old ATI card (X1900, I think) and a Dell Mini 9 - both running Ubuntu 11.04.
I certainly have troubles with Flash, but they are infrequent and usually only necessitate killing Flash from Chrome's task manager.
On the Dell Mini, I occasionally have wifi issues, but since the 10.XX series, the frequency of these problems has dropped significantly.
As far as changing to Unity, I honestly don't care. I've been using Gnome-Do since around '09, so ditching the menus hasn't bothered me at all.
I have friends using Ubuntu as well and they don't seem to have insurmountable problems either, although they have been migrating to OS X in search of a more beautiful user experience.
But going to Windows? I couldn't imagine getting any development done on that OS.
It should be obvious to anyone who's been using Linux for a while that a new dual intel/nvidia gpu mobile card is not going to be sorted out until quite a while after release. If he was a newbie I'd feel some sympathy but he's not. The rest of his rant is about the past.
I've been using Linux for 6 years; the first year I was dual-booting with Windows because I had a project in ASP.NET, but when it was finished I formatted that partition and now use it as backup storage.
I have an HP laptop with everything Intel on it, from the WiFi card to the graphics card, and everything is supported out of the box. With a plain Ubuntu live CD I can have 3D desktops and connect to WiFi networks without a problem. For multimedia, Linux is strong: Flash works without problems, although if you're on 64-bit it may use more CPU than it should.
The only problems that I and the blog post author have are Skype and office suites. Skype works OK but it's not as up to date as on Windows or OS X, and LibreOffice (or its older, dying daddy, OpenOffice.org) is OK but not perfect.
As for "Poor quality of desktop apps" point, I don't use Nautilus but Dolphin, and not Firefox but Opera, and I'm happy with it.
The point of the Linux desktop is that you don't have to stick with the distribution's default choices - you can customize/change/mix everything.
And for hardware, before buying a new machine, check if everything will work out of the box on the distribution you desire to install on it later.
bye bye past 'hardcore Linux user'
P.S. There are numerous points where the author recalls past problems - so they aren't really problems now.
So this guy has tried two advanced distros (Gentoo and Arch), plus one "beginner" distro which (from my experience) has provided nothing but frustration, and he's giving up on Linux because it's "too hard"?
How can you claim Linux to be "too hard" when you haven't even tried Ubuntu?
Blame falls on Intel and AMD, who use Linux really just as leverage against MSFT and have never given 100% commitment to it (e.g. the Intel GMA500), except on servers.
The world has moved to laptops - Microsoft still gets money for every Linux machine sold, and almost none are bundled with a Linux OS that works (I once bought an MSI Wind with Suse that had an incompatible wifi card!!). Even Dell has back-pedaled.
The main failure, I believe, has been the Taiwanese hardware companies, which out of fear of Microsoft, or out of greed, almost never advertise "Works with Linux".
I agree with the other poster - you would never buy Apple equipment without checking compatibility beforehand, so why not do the same for Linux?
I watch kids using Linux desktop and it is all about using browsers. Chrome and opera work very well under Linux. Tablets - are they the future of the linux desktop?
The poster neglects to mention that the Linux desktop has moved to the handheld - Android, MeeGo, Maemo etc. - where the user experience is actually very good. Driver support is better than on Windows.
I found I simply have better things to do than debug and write drivers (yes, I did write drivers as well). I'd much rather expend my energy elsewhere.
Since the switch I never looked back.
My experience over the years brings me to the conclusion that the driver experience will always be lacking. While many experiences on Linux have improved, the driver experience has reached some kind of steady-state. Every time I upgrade my distribution, something improves and something else breaks.
>8 years of experience and you didn't know that you should look for compatibility issues BEFORE buying? Seriously?
I for one gave up hope of "The Year of Desktop Linux", a very long time ago and can agree with the majority of your points. But really now? :| That's something you should well be used to dealing with by now
As for Skype and Flash, what is the Linux community expected to do? They are both proprietary: there is nothing "Linux" can do to make them work better.
Then you have Fedora. It has always been the "experimental" branch. Much like Debian Sid, you don't expect it to work 100% of the time. I am surprised someone experienced with Linux falls in this trap.
I hope he enjoys the life without really knowing what the machine is doing, the life without a decent shell, without a decent development environment and managing software like it was in the dark ages. I wonder how long it will take until we see a rant about how useless Windows is and why he's moving to BSD or OSX.
That's basically it. It's not the drivers, it's not the bugs - it's the culture. The culture of Linux seems to be that every user should be a programmer and should help debug the OS. If you are one, and want to help - great, go ahead! As for me, I have other things to do and I expect the OS, shell and file manager to Just Work.
1. Excitement. Compiling the kernel? Cool! Configure an SMTP server just for fun? Cool! Having "fun" for hours fiddling with configuration files to make something work? Cool! Making a sound card, a video card or a modem work after hours of compilations and configurations feels just like "beating the system".
2. Well, there is more to IT besides editing udev rules and iptables profiles. But now my system works quite well, so I can go on with my life.
3. Update time. Or perhaps I need a new computer: maybe it's that I'm becoming old and impatient, but now I have stuff to do and other interests to nurture. I don't have time and patience to waste on making this damn piece of hardware work.
You want to make Linux a viable desktop operating system "for the masses"? Well, then don't tell me "it's free software, if you don't like it then write some code and change it!". Or "works for me". Thanks George.
Don't get me wrong: I like and support hacking, experimenting, FLOSS, open culture, open hardware, open whatever. I use FLOSS and Linux every single day and (for now) it works fine for me. But let's pretend for a minute that not everyone is a programmer. The truth is: even if I'm actually a professional programmer, I want an OS that "Just Works"(c).
PS: It's the same for... Linus Torvalds. He uses (as far as I know) Fedora. Not certainly Slackware, Gentoo or Arch (3 very good and interesting distributions BTW, I've used them all). His motivations? Paraphrasing: "because I'm familiar with it and I want to get things done".
Of course, work profiles do differ. But... I am so happy working with something that Just Works.
There will be no reason for most people to maintain a separate computer when their phone is powerful enough to do everything they need, except that its form factor is inappropriate. It will be cheaper and simpler just to dock it in a shell that gives a more appropriate form factor.
tldr; Android will disrupt the laptop/desktop market.
I used Linux for 10 years. Bought a MacBook alongside it, ran both for a while, and then switched to the Mac full time a few months ago.
I'm actually starting to miss Linux a great deal, though. I'll probably miss some conveniences of the Mac when I go back, but I can't see anything major. I like KDE and Gnome, and the Linux versions of Firefox, Opera and Chromium are just fine. I don't care in the slightest about Flash or Dolby sound (I listen to music on $15 'Altec Lansings' from WalMart). I don't play games. I don't care about office software. Skype works well enough for me, and will be dead soon anyway.
The ethernet driver complaint is bizarre. I never had a problem finding them in 2004, and why mention it now?
I've had problems with suspend/resume on Windows as well as Linux. You need to get everything set up right video-wise, etc. and I agree it is a pain. Power management, too - it needs to be improved.
A Linux user switching to a Mac makes sense. Using Windows is a bizarre idea - it's simply too unfamiliar and limited an OS for me. It makes me feel like I'm locked in a padded room with a start button - powerless and declawed, and a lot of my favorite software is not available for Windows. No, I'm not installing cygwin or learning PowerShell. The Mac's Unixness is comfortable for me, though it grates on me how some of the flags for the common command-line utils must come before other arguments here in non-GNU land.
Some links to remember:
On each page there is a 'report a bug' link on the upper right.
Or, if you experience a crash, you can report the bug at the command line by typing 'ubuntu-bug' and a series of prompts will walk you through the data collection.
- Unresponsive (in general... win7 on a serious "desktop replacement" class laptop with loads of resources)
- Rebooting almost every day (application updates)
- Microsoft applications constantly crashing (mostly outlook)
- Randomly failing to hibernate
- Third-party support/driver applications which I should never hear about spamming me with questions (you scheduled a backup, do you really want to make a backup now? you scheduled disk maintenance, do you really want to run it? hey, I'd like to update, can I? oh, actually there are no updates available, kthnxbye. look at me, I'm updating the virus database!)
Unfortunately these are exactly the same reasons why I stopped using windows over a decade ago...
Three years ago I bought the latest Asus laptop (M50sv) and lots of stuff didn't work (sound, hibernate, bluetooth, etc.), but within about 12 months, with new installs of Ubuntu, things started to work, to the point where now everything works.
I would much rather use a working version of Ubuntu than Win7 (I used both at work and Win7 really sucks in comparison). But, as I said, the latest hardware always takes a while, and any seasoned Linux user knows that, even this guy.
It seems like what's really going on here is 'nix fanboys are upset at the level of vertical integration necessary in a distro like Ubuntu, to achieve a basic level of operational success at the cost of flexibility and horizontal modularity.
So go spend hundreds of hours per year hand-crafting your favorite versions of everything, and updating drivers manually every week or so, while bitching about how you want your money back from Canonical -- I'll just get on with doing some real work.
So? Use Ubuntu. Problem solved. Who cares what "most distros" are doing?
My main Linux workstations are a three-year-old quad-core Shuttle XPC and a Dell business workstation. I religiously upgrade to each Ubuntu release every six months. Some things have gotten a lot better (like sound), then other stuff breaks. In particular, the Intel graphics drivers (like those mentioned in this article) still are not quite right.
It's a little depressing how the fundamentals still are still not solid.
ZOMG, I forgot about my ASUS 1000 netbook. The wireless gets worse and worse every release. The wireless drivers in 9.10 were awesome and now I can barely connect anywhere.
I started using Linux in the 0.9x days, back when I downloaded a SLS distribution on a 2400 baud modem and put it onto 20-30 5.25" disks. I then switched to Slackware (14.4kbps modem now!) and then Red Hat (in the v2/v3 days up until v6 or so).
Back then I typically just used multiple text consoles (alt-F2, etc to switch). This actually worked really well.
At the time you could run X... if you had the right hardware... if you configured all the right settings... if you had enough CPU/RAM... if (ad nauseum). But you could get it to work.
Thing is, there wasn't actually a lot of reason to use X. Programming was done in vi or emacs. You might want to use Netscape, but the Web was embryonic at best (we're talking 1995 here) and Netscape was a resource hog. I preferred to use lynx when I could.
Thing is, at that time I remember people saying "sure it's not great but it's getting better... give it a year or two and it'll be awesome".
15 years on I still hear the same thing and honestly I don't think it's much better. Sure the problems are different but Gnome still feels slow (not that Gnome existed back then, but you get my point). You still have to dick around with things just to get them to work.
From a combination of age and experience, I no longer value my time so little as to bother with this when I can avoid it, and I no longer believe the lie about it getting better.
Take PulseAudio. I just don't get it. All I want to do is change my system volume and have it react relatively quickly (PA, in my experience, can have latency on volume changes of 100+ ms; yes, this is a problem).
PulseAudio, for me, exemplifies all that's wrong with desktop Linux. As someone described it, it's a solution in search of a problem, over-engineering at its finest. It aims to solve problems I don't have or care about and fails at solving the problems I do have (unrelated side note: if you happen to be on the CSS Working Group, go back and read this paragraph 400-500 times until you get it).
Lots of apologists on this thread say OSX isn't a valid comparison because it only supports limited hardware. Nonsense. OSX is the way it is because the focus is on user experience and working well, not on, say, remembering per-application volume settings or being able to position particular audio streams (because, hey, it's a cool engineering exercise).
I'll use OSX whenever I can. I just wish Apple didn't make it so hard, like your desktop options are the Mac Mini (still Core 2 Duo, no thanks, my laptop had one of those in 2007), the Mac Pro (no thanks, proprietary hardware that is no longer supported 2-3 years later) or the iMac (can you make it any harder to install an SSD?).
I'll happily use Windows 7 too. It's stolen most of the best bits of OSX, enough that it's "good enough" anyway. Sure it's a nightmare of complexity too (policy settings anyone?) but at least using it on the desktop, unlike Linux, doesn't leave me wanting to stab out my eyes with a spoon.
I just wish there was a Mac Mini equivalent that used a Sandy Bridge i5/i7 (the 13" MBP has this, I don't see why the Mini can't) and a second hard drive instead of an optical drive (SSD + 2.5" HD for storage or even RAID1 SSDs for reliability). Integrated GPUs are fine.
Come on Mac Mini refresh, you're long overdue.
Anyway, the story of Linux on the desktop is the ultimate answer to the question of why shoot yourself in the foot when you can shoot yourself in the head.
But there's nothing wrong with that.
I am not blaming Linux. It is free anyway and built by a lot of smart and dedicated volunteers. My mindset now is that I don't mind paying some money so that my computing needs can be met without too much effort on my end. I am using OS X, and frankly I find Windows 7 _very_ usable and quite pleasant to play with. Linux has its place in computing, and my desktop always has a partition for the latest version of Ubuntu. But for daily use, count me out. Thank you very much.
I'm very comfortable with cygwin, emacs, and putty on windows, and can move more or less effortlessly between windows and linux when working. The biggest annoyance is probably the slightly different copy-paste rules between putty and gnome/gnome-terminal. So I generally use Linux and Windows concurrently enough that whenever I encounter something that windows does better, I can just use that.
Because those four positives the original author lists are enormous:
* transparency, control
* package management
* lack of viruses, malware
- Poor modern hardware support. "Linux on the Desktop" is dead in the water without a devoted hardware supplier that writes all drivers and sells computers running Linux. The kernel itself is perfectly modern but we just often don't have proper driver support.
- Some desktop subsystems still suck. Is Audio bulletproof yet? PulseAudio seems too complex, OSSv4 is out of kernel and is not suspend/hibernate aware, Alsa is outdated.
- Desktop software is terrible. No office suite that feels solid, apps written with a mishmash of backends, languages, and toolkits, everything feels designed by committee. No stationary canonical APIs to build proprietary software on. No commercial software suites (CAD, Office, Adobe, Steam, etc.) or free software equivalents.
- GTK and Qt do not cut it. They are not attractive, they are not modern, interface design doesn't appear to be a priority, most devs don't want to write desktop software in C or C++.
The only way for Linux on the desktop to thrive is under the power of a commercial hardware company that designs systems with Linux in mind, gets the support of commercial 3rd party software companies, and completely eschews Gnome and KDE. That is a massive amount of work/money so I don't expect it to ever happen.
I have used Linux on the desktop since 2002 (and will continue to use it) but I'm not delusional. Funnily enough, I've tried going back to Windows 7 (I am dual booting now) but I inexplicably have more driver problems on Win7 than I do on Linux (Thinkpad).
I'm not a typical computer user, though. I could survive with only Emacs and Chrome.
What do they want to write desktop software in?
Linux can be seen as a developer's OS: if you are serious about development, the first thing you do is move to Linux. The amount of diagnosis and control it offers over the network and the OS through simple commands and DIY scripts is just unmatched. Still, the number of people involved in development is extremely small compared to the number of regular home PC users.
The second kind are the people who want to use Linux for web surfing, watching videos, playing some songs and doing some office stuff. Here is the problem: Linux just does not fit this user profile. At least, it does not fit it as well as Windows does. OpenOffice.org and LibreOffice are light years behind MS Office. VLC is the only video player that brings some decency to video playback, and people are torn between music players.
Why did Ubuntu Linux and Linux Mint take off so well? In a world ruled by Windows, Linux has to understand its users and their Windows world view.
Wayland can be the savior for Linux and Unity is already turning some heads.
All my problems with *nix/*BSD and desktops start with "X".
The big wakeup for me is that the bug is known, marked severe, and has existed since 2005 without resolution. It doesn't matter what distro or OS I choose. The core of the desktop has problems. I don't blame Linux, X or Gnome devs for this, nor do I contemplate switching back to Windows or deserting to Apple. The culprit here is crappy software (eg: try reading through the various GDM shell wrappers) and closed binary drivers (nVidia), and the solution is writing better low-level software.
I've had many of the same issues over the years, but it has progressively gotten better. I now have Ubuntu Maverick on my Dell Studio XPS and it runs like a dream. Video, camera, networking, sound, hotkeys, and suspend/resume have worked since first boot, and I didn't have to do anything. Printing surprised me too. When I went in to work one day I was printing after two or three clicks in the control panel. It was never that easy in Windows XP, tho maybe 7 is better. And keeping up to date is 10x easier than Windows, of course.
I did have a problem with Video crashing occasionally on first install but a subsequent kernel update fixed all that.
My tip for Ubuntu users (at least) is, don't upgrade to the newest release, stay back one. And if you do upgrade, wait a full month for the most egregious bugs to get fixed. I'll start looking at Natty when finished with my current project.
That said, OSX still has better application experience. I've used it for the first time as my primary machine at work for the past few months, and coming from a traditional Linux background, I'm sold. Strongly considering a MBP as my next hardware purchase. Idealistically I'd prefer to go the OSS route, throwing Arch or Ubuntu on a Lenovo, but it's really just not there yet.
I'm happy in OSX, but I'd love to see any linux distro partner/associate with a particular hardware maker. Create an install & tested machine list that 'just works' on one line of hardware. Grow it from there.
And since makers of that hardware, and makers of software for that hardware, are so keen to have their stuff work with Windows, they do most of the heavy lifting, leaving Dell etc to package stuff up and ship it out the door.
So while Dell etc might bust its ass getting a few Linux-based SKUs out there, they're going to have to keep up the hard work making sure it runs on the newest hardware, when the hardware manufacturers rarely care if Linux runs on their stuff at all. And as anyone who has installed Linux on brand new hardware knows, hiccups are not uncommon.
The balance of power difference is that with Windows, everyone else works hard to make sure their stuff works with Windows. With Linux, Linux devs work hard to make sure their stuff works with everyone else. And Apple just has to work well with Apple and their few suppliers.
Linux does hit a lot closer to the Windows model with Android, however.
Unless that kind of attitude and culture of blaming the user changes I am afraid Linux won't be winning many desktop users from the mainstream.
However, it sounds to me like this guy had just had enough with the platform for whatever reason - and then proceeded to make a bunch of decisions that put him on a collision course with failure. Not saying he didn't run into valid issues, but someone with that much linux experience should have well known about some of the issues he was about to run into.
Running Fedora 15: This got released a bit more than two weeks ago. Fedora always seems to have a bunch of issues for me in the first month or two until they get a bunch of user submitted problems ironed out. I know this well with F15 right now - I'm running it and have run into power management problems, nfs blocking hibernation, occasional extreme gui lag, an incompatibility with gnome shell and fglrx etc. There are tickets open on these and I'm confident many of them will get resolved. F14 has none of these issues now - if you want a super stable and reliable desktop experience you generally shouldn't run a brand new Fedora release.
Running gnome 3 / gnome shell: This is also brand new and it should be somewhat of a no-brainer that there are going to be some issues. Early KDE 4 had similar issues. But then again, so did Aero when Windows Vista was brand new. Want a stable, polished experience? Run something that's been worked on for a while - this is more or less true on any OS.
Driver issues with new hardware. Maybe it shouldn't be this way, but it's a reality. If you don't want to fuss with drivers, consider buying stuff that's been out for a little longer. My Ironlake GPU is supported 100% perfectly in Fedora 14. Then again, I've seen F15 running on a Sandy Bridge GPU working just great, so I'm not sure what issue he's running into. It seems quite possible that the dual GPUs are confusing the issue. My laptop with dual GPUs (Ironlake and Radeon) has some issues until I ban the radeon driver from loading at boot (it can still be loaded manually before X starts if I want to switch to the AMD GPU).
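For anyone in a similar dual-GPU situation, the usual way to "ban" a module from loading at boot is a modprobe blacklist entry. This is a minimal sketch; the filename is just an example, and note that blacklisting only stops automatic loading - the module can still be loaded explicitly later (e.g. with `modprobe radeon`), which is exactly the behavior described above.

```
# /etc/modprobe.d/blacklist-radeon.conf  (example filename)
# Prevent the radeon kernel module from auto-loading at boot.
# It can still be loaded manually later, e.g.: sudo modprobe radeon
blacklist radeon
```

Any `.conf` file under /etc/modprobe.d/ is read at boot, so the name only needs to be unique and end in .conf.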
A lot of his rants about GPU drivers honestly don't sound very accurate to me. Intel drivers are garbage? Not in my experience - it seems to me that the Intel-maintained open source drivers are some of the best of the open source solutions and generally work out of the box on everything I use them on. Maybe he's running into a problem because Sandy Bridge has really only been in practical use for 4-5 months. AMD drivers are garbage? There is an element of truth here, but it's quite overstated.
If linux doesn't agree with him any longer that's perfectly understandable. Hopefully he'll be happier under Windows. But to me it sounds like this guy has some rage issues and has either warped his own perspective of the situation due to them or is intentionally looking to make the situation look worse to the reader.
"All hardware just works" isn't a quality Linux has ever claimed for itself. But that is hardly rare for an OS. The only OS it applies to is Windows (generally), and that's because it's the only platform all vendors bother to target. You can't buy arbitrary hardware and expect it to work at all under OS X, FreeBSD, Solaris, the Android Open Source Project, Windows Mobile or iOS.
In fact, out of that whole list, the odds that an arbitrary hardware configuration will work out of the box are highest on Linux.
If you want any of those platforms to work well with a new purchase, you have to determine the compatibility beforehand. So Linux is hardly some sort of freak in that way.
But don't let that get in the way of a good rant.
"Hardware just works" is a characteristic that normal people consider important to an operating system. If you don't do it, you are deficient in that area. "But all these other also-rans fail at it too" is not an excuse. The standard, for better or for worse, is Windows, and you either meet the standard or you don't complain that people get tired of dealing with your failures and move on.
And it's certainly not surprising that even a Linux enthusiast gets tired of dealing with substandard, broken hardware after a while. I know, I was one. Eventually I got tired of trying to balance having decent, modern hardware with having hardware that supports an operating system that, on the desktop, is essentially a novelty (it affords me nothing that Windows or OS X doesn't), and I tubed it. Because it wasn't worth the effort.
The only way Linux makes sense is if it is competitive on at least one level, and it currently is not.
I haven't made detailed comparisons, but I expect System76 would be competitive for more mainstream laptop configurations as well.
The reason people pay more for a Lenovo machine is because it's built like a tank. You pay more because you get more. System76, on the other hand? You pay less and you get a lot less.
EDIT: System76's build quality looks even worse than it used to--those are truly hideous plasticky machines. They're also pretty underspecced and crappy, on the low end. The only processor available in their 13.1" model is an i3-330UM running at 1.2GHz. My x201 has an i5, and you can get them with i7's. It skimps in the peripherals, too: it comes with a 5400RPM hard disk and a smallish battery. When you up it to a 320GB 7200RPM disk and a halfway decent battery, you're already up to what I paid for my X201. Adding 8GB of RAM puts it over (although that's somewhat unfair because I bought my RAM off Newegg).
The machine also weighs more than my x201, has one of those weird button-y keyboards, no trackpoint, has a glossy screen, and less pixel density. Fails across the board. So, at least for me, System76 doesn't even begin to compete with Lenovo. Even if you don't consider Linux on the desktop a minus, it doesn't make a lot of sense to buy what looks in most ways to be a cheapie Dell knockoff.
For example, KDE4 caught a lot of flak for shoddiness and flakiness before its first release. That happens to a lot of software, and it's completely unfair. I criticized people for expecting KDE4 to be polished enough for a general audience before the point-0 release. However, when it actually was released, it was still in terrible shape. I first installed 4.1 and gave up on it pretty quickly. Not only was it woefully incomplete, but the aspects where 4.1 caused me the most grief (cosmetic glitches, unpolished UI elements, non-intuitive and non-discoverable customization) were supposed to be major focuses of the KDE4 effort.
Then it was my turn to be criticized: KDE 4.0 and 4.1 were intended for early adopters, and it wasn't until KDE 4.2 that KDE4 was deemed a good replacement for KDE3. Well, I admit I didn't read the release announcements, but what the hell is a major release supposed to mean, then? What does it mean to go from beta or RC to point-zero if the point-zero release isn't ready for most users? There's a rule of thumb that conservative users who absolutely need stability are wise to sit out an x.0 release, but there sure as hell isn't any convention that most users should wait for an x.2 release. And why did my distro suggest 4.1 as the default KDE to install?
Clearly the open-source world needs a more explicit and widely-followed convention for labeling releases. Projects are willing to cheat and release x.0 versions that aren't ready for general consumption, just so they can make an artificial deadline, keep up with their competition, or even just to ensure plenty of users for testing a new version. And they have no shame about it. If a project as high-profile as KDE does it, anyone can defend it based on their example. There needs to be an explicit convention for labeling releases so each user knows which release they should wait for.