Linux is a usability nightmare the second you get out of the fake easy-to-use illusion layer they added with the new GUIs. Unless you’re a coder, don’t even think about it.

Do you think he really believes this? Maybe I'm too far in a bubble or something, but that sounds completely dishonest.


I do not just believe it, I know it. When something does not work in Linuxland the answer is still "open the terminal, enter a bunch of cryptic commands" and/or manually edit text-based config files. I can do that easily, I grew up with DOS. We had to tweak the parameters of the memory manager and make custom boot disks with just the right set of drivers to get games working as kids. It is just that I am not willing to do that stuff anymore. This is the 21st century and my time has value.

Recent examples of my Linux "user experience":

- (Fedora) Internet went down while the package manager (OMG I hate those!) was busy updating the system. This left the package manager in a corrupted state. I could no longer use the GUI to install/update anything; it only gave me a "please fix me somehow" type error message. I could easily google the solution. Four cryptic terminal commands later it was good to go again (roughly the recipe sketched below). But as I said: unacceptable for a modern desktop OS.

- (Ubuntu) Getting my wifi to work. Do I even have to start? I ended up having to download a kernel module tarball from somewhere, had to integrate that into the kernel, and edit some config files by hand. Again, no problem, Google is magic. However it wasted one hour of my life: unacceptable. On Windows getting the wifi to work required... clicking on setup.exe.
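
(For the curious, the recovery recipe for that kind of interrupted Fedora update is usually something like the following; I'm reconstructing from memory, so the exact commands may differ:)

    # finish the interrupted transaction (ships with yum-utils)
    sudo yum-complete-transaction
    # throw away possibly stale package metadata
    sudo yum clean all
    # rebuild the RPM database in case it was left inconsistent
    sudo rpm --rebuilddb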


The Linux community doesn't help either. Some of them want to be inclusive, but the majority of the community thinks that Linux should be an exclusive club where only those with a high level of knowledge should get treated with any kind of respect or help. Just go Google any Linux topic, and I assure you the first page will contain several forum posts where the only responses are "RTFM," "if you cannot figure this out you shouldn't be touching it (audio issues)" (paraphrasing), or even "maybe you'd be better off using Windows teehee."

I think Ubuntu has done much to make Linux more consumer friendly and I congratulate them for that. But realistically Android or Chrome OS (Chromebooks, etc.) are the only two Linux OSes I'd likely use on a daily basis, because in both cases they abstract you away from the ugly Linux innards better than any standard distro is able to.

The fact that Linux depends so heavily on the terminal/console is just pathetic. It reminds me of Windows 9x. Which is kind of depressing when you consider that the Linux kernel is the most advanced kernel currently in existence, but it gets weighed down by the UNIX legacy stuff, the users, and the GNU side of things.

That's why Android is so wonderful. Instead of it being Linux/GNU, it is Linux/Android. When we get a full desktop OS without the GNU gunk and the associated bad attitude ("terminal is the bestest, I am so 1337!!!") I'd happily switch to it. Hell, I'd switch to a Linux/Android OS if Google and friends made one for the PC desktop (with real windows/multi-tasking).

OS X is also a great UNIX OS because there is rarely any need to fall back to the terminal. You can do 90%+ of the things you'd ever need to do on OS X via UIs and tools. Plus the community on OS X is better than the toxic Linux community, even if they're a little defensive when people criticize Apple/OS X.


Not disputing the problem you discuss exists - it certainly does - but be careful of assuming it's the majority of the community. It often happens that the bulk of the visible noise is made by a minority of obnoxious wankers.


That's a very interesting take on the situation. I forget that Android is Linux when I'm not browsing files. Come to think of it, my Android phone has been more useful and friendly than any desktop distro. Regarding legacy problems, I recently read that even the directory layout is a legacy issue.


> The Linux community doesn't help either. Some of them want to be inclusive, but the majority of the community thinks that Linux should be an exclusive club where only those with a high level of knowledge should get treated with any kind of respect or help. Just go Google any Linux topic, and I assure you the first page will contain several forum posts where the only responses are "RTFM," "if you cannot figure this out you shouldn't be touching it (audio issues)" (paraphrasing), or even "maybe you'd be better off using Windows teehee."

Just in case anyone thinks you're making this up... For several years, the top result for a search on "ubuntu add user" was this page:

http://www.howtogeek.com/howto/ubuntu/add-a-user-on-ubuntu-s...

It's an interesting page, because it recommends the use of the low-level "useradd" instead of the easier "adduser". It mentions "adduser" but only way down the page.

Now the Debian/Ubuntu man page for "useradd" says "useradd is a low level utility for adding users. On Debian, administrators should usually use adduser(8) instead."

http://manpages.ubuntu.com/manpages/lucid/man8/useradd.8.htm...
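
To make the difference concrete (Debian/Ubuntu defaults; details vary by distro and version):

    # low-level: creates the account but, by default, without a home
    # directory or password, leaving things half set up
    sudo useradd mike

    # the wrapper Debian recommends: creates the home directory,
    # copies /etc/skel, and prompts for a password and details
    sudo adduser mike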

Of course I made the outrageous mistake of thinking that a "how to geek" site might have accurate information, so I followed along and used "useradd".

When that command left me in a confusing state, I posted a comment on the how to geek page suggesting that they should recommend "adduser" instead of "useradd" just like the Ubuntu man page recommends.

(Click the "show archived reader comments" link on the page to see the discussion.)

This did not go over well. I got these replies:

> Michael Greary you are a D&%K H#$&D. You’re the 1 that messed it up no1 else, and anyone else for that matter. Hopefully you’ve learnt by now that when blindly running linux commands you should read the whole article to make sure it’s what you want first. Blaming it on other ppl isn’t gonna help either. You should’ve said it by admitting your mistake, asking for a way to rectify it. (sigh) I bet you have no friends

And the somewhat more polite:

> Michael Geary, you shouldn’t blame the tutorial because you didn’t read it properly.

And this was after following a "how to" that recommended using the wrong command.


> ...Linux kernel is the most advanced kernel currently in existence,...

I'm not sure what this even means.


I'm fairly sure it doesn't.


To counter your anecdotes, here are some more anecdotes:

- Windows XP - Power went out at a critical stage of updates. Completely hosed the system, and wasted hours of my life having to rebuild (including rebooting multiple times to get updates installed again after the recovery).

- Windows 7 - Cable modem stopped giving my router IP addresses. Called ISP tech support... they had me manually plug the cable in and type cryptic commands at a DOS prompt. Is that unacceptable too?
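
(Presumably the usual suspects; the incantation is typically something like:)

    ipconfig /release
    ipconfig /renew
    ipconfig /flushdns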

Also, I'm really baffled as to how the network going down could kill a Fedora update. The downloads happen before any significant filesystem modification in every version that I have ever used (certainly any recent one). I guess maybe some third party package does something weird during a POSTINSTALL script? I dunno.


The problem here is that the poster above is criticizing the experience of installing Linux on non-vetted hardware, whereas 99% of Windows computers come preinstalled with all the drivers properly configured.

If your Windows system breaks, and that means anything beyond "oh, I'm missing a driver", you have no recourse; you have to reformat. You end up googling and doing the exact same procedures you would under Linux, except that because the OS internals are obscured and proprietary, you don't have detailed whole-stack analysis of whatever problem you have; you have a black box, and you have to hope the tools MS gives you are good enough.

It doesn't change that for 99% of users, regardless of OS, when you encounter a problem you don't try to fix it yourself. Most people aren't going to even try, they are either going to take the system to an expert who would use a terminal no problem or just get another one.


I experienced this myself recently. The Netflix metro app decided that something was wrong with the DRM path on my two-day old install of Windows 8.1. After much googling, I ended up having to re-install the OS. It was a very similar experience to when something breaks on a Linux install, except that I had less ability to go digging into the internals (not that that would be helpful for someone who doesn't know a lot about computers).

I think the primary difference with Linux is that developers are fairly content mucking about with things, so breakage that would be a showstopper for someone who is not intimately familiar with computers goes somewhat unnoticed. It's hard to step back and say "what would I do if I didn't know about <x,y,z>?" in daily life.


I've been intentionally writing KCMs (KDE control modules) for various traditional command-line functionality recently, with the intent of getting them into some early KDE 5 series release. Just proofs of concept, but so far I've got working ones for xrandr, wlrandr, and saned (i.e., network scanners). They aren't usually very complicated, just a bunch of widgets. I intend to finish them with a QML rewrite; most of the work was done before you could embed QML in widgets.

The point is, yeah, you shouldn't have to go to a terminal, but that requires the development of GUIs to overlay the complexity. I use my relatives as test beds for whatever breaks that requires a "control panel entry" to fix.
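
The xrandr one, for instance, is mostly a friendly face over invocations like these (illustrative; output names vary by driver and machine):

    # list connected outputs and their supported modes
    xrandr --query
    # put the external monitor to the right of the laptop panel
    xrandr --output HDMI-1 --mode 1920x1080 --right-of eDP-1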


Yeah, hasn't Fedora used transactional updates for ages now?


A power outage is a harder problem.


> When something does not work in Linuxland the answer is still "open the terminal, enter a bunch of cryptic commands" and/or manually edit text-based config files.

That's true in OSX too, though. After upgrading to 10.9 ("Mavericks") I had a 'systemstats' process taking up 100% CPU and >3GB of RAM at random points. Did some googling and reading through forum threads, and the solution was to muck with some BerkeleyDB files in a terminal. Is OSX an acceptable modern desktop OS?
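
(If I recall correctly, the fix boiled down to something like the following; paths are from memory, so don't take them as gospel:)

    # stop the runaway process, then delete its statistics store,
    # forcing a clean rebuild on the next run
    sudo killall systemstats
    sudo rm -rf /private/var/db/systemstats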

Windows in my experience has been even worse, in that for OSX there at least usually exists some arcane way of fixing your problem, while in Windows it's just "wipe and reinstall", because the problems are totally opaque. (I don't have great memories of the registry editor, either.)


Nah, the new solution in Windows is "do a system restore", which is way easier than reinstalling everything. Works pretty well, too...


Been managing a large Windows network for ~8 years; never once seen system restore fix a problem. It also seems to frequently introduce new ones.


I haven't had wifi issues with Ubuntu for over 5 years. What computer did you have these problems with, where you needed to install a new kernel module?

You hate package managers? I don't have experience with Fedora's and I don't know why it entered an inconsistent state, but OSX/iOS use a package manager. Android uses a package manager. What problem could you possibly have with package managers? Do you enjoy manually updating software one app at a time?

The answer in "Linuxland" may still be to open the terminal and enter commands but the times a regular user needs to do that are becoming few and very far between. I have the majority of our (albeit small) office running Ubuntu 13.10 and I've yet to encounter anything like what you've described.


I think he is not angry at the package manager; he is angry that the package manager failed and he had to use the terminal to fix it.

And it is NOT acceptable. I use Fedora myself, and every time this happened I got REALLY, REALLY upset. If I am using a GUI package manager, why the hell should I need to use the console to fix it when it breaks? Why is there no option in the GUI to do that?

Also, I had a machine where I had to install a kernel module just to get sound working... Linux still has some serious issues with some hardware (mostly audio and radio-based communications; I've heard printers are problematic too, but I haven't owned a printer in years).


>I haven't had wifi issues with Ubuntu for over 5 years.

No surprise there. Some wifi hardware is supported out of the box and will just work. That is not the problem. The pain starts when something does not just work.

>What computer did you have these problems with, where you needed to install a new kernel module?

Given that I am a major nerd I have multiple ones and administer even more. I think that particular episode happened with an Asus Eee Box B202 and an older version of Ubuntu.

> but OSX/iOS use a package manager.

I do not use anything Apple but if I recall correctly OS X uses self-contained .app bundles, i.e. the apps usually include all their dependencies except those present on a particular OS X version by default. It is the "dependency management" done by Linux package managers which I hate. Namely that every app becomes a tangled web of dependencies with unlimited potential for pain. Apps should be self-contained; the OS should be a stable, well-defined platform to run them. I.e. "requires Windows XP or newer" or "requires Mac OS X 10.6 or newer" is okay, "requires [long list of packages]" is not. That model will cause problems sooner or later. "I updated X and suddenly Y no longer works" etc.

As a programmer I find the concept of simply automatically updating libfoo 2.1.x to libfoo 2.1.y because "the developers say it does not break compatibility" scary. Software is written by people, developers accidentally break compatibility all the time, or write software which depends on buggy behavior, behavior which later gets fixed by other developers, thus breaking said software. Thus all common Linux package managers are themselves broken by design. They are only acceptable if you accept that occasionally some things will just stop working after a system update.
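
You can see how tangled the web gets by inspecting any binary's shared-library dependencies; every line of output is something a system update could change out from under the app:

    # list the shared libraries a binary is dynamically linked against
    ldd /bin/ls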

>Do you enjoy manually updating software one app at a time?

I do not often have to update any software, and when I do, I can usually try the new version easily without uninstalling the old one, and I know that its installation will not affect any other apps on my system. Yes, I enjoy that.

>the times a regular user needs to do that are becoming few and very far between.

No disagreement there, but as I said, personally I am not willing to put up with an OS where this happens, no matter how rarely it happens - unless I have to.


Main issue, it seems to me, is that Linux distros use the same mechanisms for installing/updating userland apps/code as for installing/updating system-level libraries and kernel stuff. It's never quite felt right to me, but it has allowed, and seemed to encourage, the weird dependency trees that end up wanting to replace entire GUI toolkits because a desktop app was compiled against a certain version of libfoo, which then blithely tries to update 900 packages on your system when you just wanted one little app.

This fear/anger/resistance towards static bundling in the Linux world is the main culprit, it seems, and I don't think anything short of a cultural revolution in the Linux world will change that. The last cultural revolution seemed to be a mass exodus of desktop Linux folks migrating to OS X a decade ago.


I understand your concern about "dependency hell", but to be frank, your arguments come off as uninformed to me.

    Thus all common Linux package managers are themselves broken by 
    design.
I can't really see how you'd debate your way out of that. If you mean package management is inherently broken because different software may depend on different versions of libraries then that sounds like an issue you'd have with the software developer. I know that in itself could be a gripe with Linux software, but show me one regular computer user who has ever experienced broken package dependencies. The vast, VAST majority of regular Ubuntu user software is installed through the Software Center and the only times I've ever gotten broken dependencies were when I was specifically trying to install an older version of something.
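
For what it's worth, even when dependencies do break on Debian/Ubuntu, the standard recovery is two commands:

    # finish configuring any half-installed packages
    sudo dpkg --configure -a
    # then let apt resolve and repair broken dependencies
    sudo apt-get install -f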

    I do not use anything Apple but if I recall correctly OS X uses
    self-contained .app bundles i.e. the apps usually include all 
    their dependencies except those present on a particular OS X version 
    by default.
OSX has .dmg files for installing applications. Debian's .deb packages are the exact same thing. Bundling all dependencies can be done in Linux the same way -- again the app developer can statically link to libraries.
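
(A developer who wants the OS X-style self-contained behavior can get most of it at link time; a sketch, with libfoo as a stand-in name:)

    # link libfoo statically while keeping everything else dynamic
    gcc main.c -Wl,-Bstatic -lfoo -Wl,-Bdynamic -o myapp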

The fact of the matter is that Linux is NOT a usability nightmare and it's dishonest to say that. Experiencing some arcane error with an old version of Ubuntu on a six-year-old computer doesn't afford anyone the right to claim that the modern Linux OS is a usability nightmare.


I'd be careful accusing people of being uninformed and dishonest, especially when you make errors in your own post.

.dmg files are disk images. In the case you are discussing, the disk image contains an .app package that is copied to the user's (or more generally, the system's) Applications folder. A closer equivalent of the .deb file would be the .pkg file, which works in a similar manner to what you describe.

As to whether 'Linux' is a usability nightmare or not, I'd argue that is purely subjective. It is disingenuous to suggest that there isn't a steeper learning curve, however slight and for whatever reason it exists.


We're talking about dishonesty in saying that "Linux is a usability nightmare" from an article about improving Windows' usability. I'm not talking about learning curve here! He cited 2 examples regarding Linux's usability that I don't think are relevant at all.


>The fact of the matter is that Linux is NOT a usability nightmare and it's dishonest to say that.

This is not a fact at all. It's a matter of opinion. The author of TFA is equally wrong in asserting that it is in the way that he did, but dishonest? No more than the dishonesty of claiming that it factually isn't.


>As a programmer I find the concept of simply automatically updating libfoo 2.1.x to libfoo 2.1.y because "the developers say it does not break compatibility" scary. Software is written by people, developers accidentally break compatibility all the time, or write software which depends on buggy behavior, behavior which later gets fixed by other developers, thus breaking said software. Thus all common Linux package managers are themselves broken by design. They are only acceptable if you accept that occasionally some things will just stop working after a system update.

Sounds like you'd be interested in NixOS.


>but the times a regular user needs to do that are becoming few and very far between.

How much is a few?


In my experience, Google is only mostly magic. This is probably the fault of the Linux community. Too often the process for fixing something in Linux is listed on a website as a few cryptic commands, then "and you've got it from there, right?" when you're not done yet and there are still several steps to go that they don't explain. It's infuriating.


Not to mention that lots of the top Google answers date back to forum posts from freaking 2005 or so, with answers that are worse than useless.


"When something does not work in Linuxland the answer is still "open the terminal, enter a bunch of cryptic commands" and/or manually edit text-based config files. I can do that easily, I grew up with DOS."

I can install CentOS on a laptop, add repositories and install multimedia codecs/applications, change the language, and set periodic updates, all without any use of the terminal.

I suspect that part of the reason that terminal commands are given on support fora is simply that it is easier to provide the text command than it is to describe a series of steps in words that you have to execute on a GUI.
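
Compare "open the Software Center, search for the package, click through the prompts..." with a single copy-pasteable line such as:

    sudo apt-get install ubuntu-restricted-extras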

I also suspect that another part of the reason is that people who really know about Unix like OSes tend to be from a sys admin or programmer background and are more comfortable with the terminal.

Agreed that this is going to have to change soon!

"- (Ubuntu) Getting my wifi to work. Do I even have to start? I ended up having to download a kernel module .tarball from somewhere, had to integrate that into the kernel, and edit some config files by hand. Again, no problem, Google is magic. However it wasted one hour of my life: unacceptable. On Windows getting the wifi to work required.. clicking on setup.exe."

Well, of course Windows is easier to find drivers for; it is the operating system for which the cheap commodity hardware you can buy is designed. Having said that, I'm surprised that Ubuntu's restricted driver identification was unable to find the WiFi kernel module you needed. What was the module you installed and which version of Ubuntu did you install it on? My reason for asking is that having 'manually' installed a kernel module, you may have to repeat the exercise for each kernel update.
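
(If you do end up installing an out-of-tree module by hand, DKMS is the usual way to make it survive kernel updates; roughly, with hypothetical names:)

    # register, build and install the module source so it gets
    # rebuilt automatically on every kernel upgrade
    sudo dkms add ./some-wifi-module-1.0
    sudo dkms build some-wifi-module/1.0
    sudo dkms install some-wifi-module/1.0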


Have you ever done any troubleshooting in Windows? I spent over a year as tech support for Dell supporting Windows, and I honestly can't say that Windows is any easier to fix. Most of the time (outside of PEBKAC errors), I ended up a) entering obscure commands in cmd, b) making an obscure edit to the registry, or c) reinstalling Windows. No operating system is easy for end users to fix when something isn't working right, and at least in Linux you very rarely have to reinstall.

Your example of double-clicking on setup.exe to fix the wifi is pretty wrong. First of all, you have to find the .exe file, which is pretty hard for Joe Average, and you have to make sure it's the right one, etc.


I would say the "unless you're a coder" part is wrong, but otherwise it has the ring of truth, based on my experience.

Configuration when not using UIs is hell because each application has its own format, its own settings file location (if it uses a settings file at all -- you might be required to run a curses console program), and generally its own idiosyncrasies. For a single logical concept (e.g. sound, ditto for display), a plethora of layers are involved and they all clash heavily, both in terms of settings and abstractions.
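
Sound is the canonical example; on a typical distro the settings for one logical thing are scattered across at least:

    /etc/asound.conf         # ALSA, system-wide
    ~/.asoundrc              # ALSA, per-user
    /etc/pulse/default.pa    # PulseAudio, its own script-like syntax
    ~/.config/pulse/         # PulseAudio, per-user overrides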


I can understand him saying that if he used, say, Arch. But distros like Mint are arguably easier to use than Windows.

And then they have the advantage that they don't have to be 'fixed' like Windows 8, which destroys the point of the post.


I'm sure someone still hates Unity, but I use it every day. It's a conservative design that evolves slowly. I wish the Ubuntu Touch UI had already been integrated with mainstream desktop Ubuntu so I could get a PC tablet as my day to day coding machine.

It is very difficult to evolve a non-touch UI into a back-compatible touch UI. But the conservative approach Ubuntu Touch is taking might turn out to work better than what Microsoft did to Windows.


I have a Windows 8 tablet/laptop hybrid as my hobby coding machine (Asus T100), and I have to say that the bad rep Windows 8 got does not match my experience. I've found the touch UI easy to use, and I even configured it to launch my IDE in full-screen mode, so I can go back and forth between that and my browser without seeing a desktop in between. I've also noticed that I'm touching the screen quite often even while coding, and that's with an IDE completely not optimized for touch (WebStorm). The whole thing has convinced me that touch is an added value for any use case, even for programming.


As far as I can tell, the "bad rap" is mostly from people who haven't used it at all, or haven't used it for very long. (Typically XP users who are extremely averse to change.) The Windows 8 users I know personally like it a lot, though that doesn't stop them from complaining about the parts they don't like....


I've been using linux for literally 20 years now. The first ten I did all my own sysadminning, the next ten I used it exclusively in organizations with professional administrators to complain to, and now in my current job I'm back to having to do things myself, and I can assure you it's just as much of a pain to do things in 2014 as it was back in 2002.

They gave me a new monitor at work and it took me literally six hours to figure out the right magic to get the resolution in a mode that wouldn't cause instant headaches, and I still can't run KDE applications without the X server segfaulting.
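
(The "magic" for forcing a resolution is typically some variation of the cvt/xrandr dance; the modeline numbers come straight from cvt, and the output name varies per machine:)

    # generate a modeline for the panel's native resolution...
    cvt 1920 1080 60
    # ...then teach X about it and switch to it
    xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
    xrandr --addmode DVI-0 1920x1080_60.00
    xrandr --output DVI-0 --mode 1920x1080_60.00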


What distribution are you using? And what graphics card? I remember some issues with the first version of KDE 4, but since then I have run it on Intel, AMD and NVIDIA cards, with open and proprietary drivers, without big problems (AMD sometimes has glitches...).


Ubuntu (13.04 and 13.10) and a recent nvidia, don't have the model number here.

My best guess is that it's some kind of font issue, plotting in R also segfaults the server but I can see the non-text part of the plot come up. I try to leave well enough alone because (a) I get paid to do work not to troubleshoot linux and (b) there's no guarantee that anything I do will result in a usable system.


Yep, that's pretty much my experience with Linux. By the way, I really like Linux (I used to use Ubuntu before they added Unity); however, the constant fiddling with drivers and all kinds of stuff is just annoying.


What graphics card are you using?



