What you seem to be saying is that in order to use a personal computer, you should be an expert.
I disagree, and I'm glad that most people disagree as well. User-friendliness is important regardless of what system it is, even Linux. Have you noticed that the most successful distributions of Linux are the ones that are most friendly to beginners?
This elitism is destructive and serves only to make the "experts" smug and keep everyone else in the dark. I've found that the real experts are the ones who are writing guides for solving issues so that beginners can solve them too. That way, everyone benefits... and the beginners get to learn too. Almost all of my experience with config files has come from having a problem, Googling it, copy-pasting some commands into the terminal, and then becoming interested and messing around with it.
Ubuntu has two commands for adding users, useradd and adduser: adduser is the easier one for newbies, while useradd is more flexible but has a lot of ways you can get it wrong. In fact, 'man useradd' recommends using adduser instead.
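For illustration, here's a sketch of the difference on a Debian/Ubuntu system (the exact defaults vary by distro, and these commands need root in practice):

```shell
# A bare useradd creates only the account entry: by default there is no
# home directory and no password, and the login shell may be left unset.
useradd alice

# What a typical desktop user actually needs, spelled out by hand:
useradd --create-home --shell /bin/bash --groups sudo alice
passwd alice

# adduser (a friendlier interactive wrapper) prompts for all of this:
adduser alice
```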
Unfortunately, that how-to page starts with useradd, and its first example is a bare useradd command.
It isn't until later on the page that it mentions that there are some options you must add to the command if you want it to set up everything correctly for a typical user. And if you leave these options out, the page doesn't explain how to fix it after the fact.
Finally, near the bottom of the page it mentions the adduser command that would have worked with no special options and no hassles.
So, being at the time a Linux newbie, I worked through the how-to and tried the useradd command it suggested, got myself into a situation where I wasn't sure exactly what I'd really done and hadn't done, then finally discovered the adduser command I should have used in the first place. I posted a reasonably polite complaint that the how-to had led me down the wrong path and it ought to start with the recommended adduser command instead of just mentioning it in passing at the end.
The responses I got were fairly eye-opening. (Click the "show archived reader comments (36)" link to see them.)
This one was my favorite:
> Michael Greary [sic] you are a D&%K H#$&D. You’re the 1 that messed it up no1 else, and anyone else for that matter.
> Hopefully you’ve learnt by now that when blindly running linux commands you should read the whole article to make sure it’s what you want first.
> Blaming it on other ppl isn’t gonna help either. You should’ve said it by admitting your mistake, asking for a way to rectify it.
> (sigh) I bet you have no friends
I have to admit he was right. Clearly, I am not worthy of using Ubuntu!
Of course, you tacitly skipped over the other helpful comments and instead chose to concentrate on the worst comment in that thread. However, that doesn’t render the message delivered there wrong in any way – read the whole article/how-to before starting, just as you read the whole recipe before starting to cook.
So, yeah, you
a) didn’t read the manuals supplied with your distribution
b) instead went directly to a random third party site via Google
c) blindly trusted said site without fully reading it
d) messed up while doing so.
You should have gotten a computer-literate friend to help you, just as you don't fix your car by looking for a how-to online and dismissing the manual you can usually find in the glove compartment.
The thing is, of course I know I screwed up. I don't need you or anyone else to tell me that. I already know it.
But that has nothing to do with my complaint: the top match for a Google search for "ubuntu add user" is a site that gives out poor quality information, doesn't improve that information as a result of user feedback, and has a community that is positively hostile to feedback. Yes, as you point out, there were also helpful comments in that thread. There were also several other people who had the same problem I did, and the site never did anything to improve its content.
Isn't it obvious that this was a lousy tutorial? Doesn't the author of a site like this have any responsibility to put out useful information? Does the fact that I screwed up make my feedback any less valuable? If you're writing tutorials, that's exactly the kind of feedback you ought to be looking for, so you can improve your tutorials.
Contrast it with Arch Linux's page on user management:
Here they do use the same useradd command that I was complaining about (probably because Arch doesn't include adduser by default?), but they give a complete example with all the options required, followed by an explanation of those options.
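From memory, the example on that page looks something like this (treat the exact flags as illustrative rather than a quote):

```shell
# -m            create the user's home directory
# -G wheel      add the user to the wheel group (used for sudo on Arch)
# -s /bin/bash  set the login shell explicitly
useradd -m -G wheel -s /bin/bash archie
passwd archie
```

A complete invocation plus a per-flag explanation: a newbie can run it safely and still learn what each piece does.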
Now that's how you do it.
But we're all newbies at something, some of the time. In my case it was basic Linux user setup.
Maybe you even screwed something up once, as a result of somebody giving you bad information that you didn't thoroughly check out. (Not saying you did, just possible.) If that ever happened, which would be a better response from the source of that information: chewing you out thoroughly, or correcting the bad information?
And this is what's so frustrating. I love Linux. I truly love the idea of making everything open from the kernel to the desktop environment to all of the applications you can run on it. I truly love the idea of customizing everything... and if you don't like what's there, you can modify it to your heart's content if you learn how. That's just freaking cool to me.
Except... we have douchebags in the community who talk shit to newbies and then cry, "Why does everyone use Windows? How come Linux isn't more widely used?"
Just check your post history and read the number of times you wrote "RTFM." That's why.
Well, there's shit-talking the newbies, and then there's refusing to lower your system to the lowest common denominator by dumbing it down.
Not being careful of that second one gets us things like the GNOME 3/Unity/Windows 8 design-by-committee that seems to be loathed so much by power users.
Is it so wrong to expect some basic minimum level of competence in a system before someone uses it? And then suggest that someone take the most basic of steps to acquire that competence before asking questions? (By R'ing TFM?)
Systems should as a rule be somewhat intuitive (IMNSHO), but you can only take that so far before you start hamstringing yourself. Devs would never get anything done if they spent all their time training people how to use systems.
You're definitely right that there is a fine line between user-friendliness and "The user is an idiot" in UI choices. Right now I'm using Lubuntu; I dislike Unity immensely.
Remember that everyone starts out incompetent. They ask dumb questions, and in some cases entitled newbies expect the experts to bend over backwards for them. But one thing that's nice is that a lot of questions have been answered before by patient experts.
Say there's a guy on the Ubuntu forums who asks the following question:
"Hey, how do I change my desktop resolution?"
An asshole answer is "Fucking Google it, idiot. It's not hard."
A decent answer is, "Here's a link to the answer. In the future, I suggest Googling your issue. You don't have to wait for us to answer it! Often, questions have been asked and answered before. We're here if you get stuck, though."
This has a few benefits. The first (obvious) benefit is that the newbie doesn't immediately think "Wow. What a dick, I was just asking a simple question."
The second benefit is that future newbies who Google the same question get a link to the answer.
Of course, there are lost causes ("Where's the Start button? This isn't like Windows at all. I don't like it.") and asshole newbies who simply refuse to try to learn anything by themselves ("OK, you answered my question on desktop resolution. How do I change my desktop background?"), but I think these people are the exception rather than the norm. And they can be dealt with in a way that is constructive rather than crass and uncouth.
Try giving the ‘decent’ answer to the thousandth person asking that very question without any sign whatsoever of the work they put in. Or to the person who obviously did not read the rules for the mailing list they signed up to and instead ‘just asks a simple question’. There are, unfortunately, many more entitled people than you think.
Fortunately, you can easily killfile people in mailing lists, and doing so quickly (i.e. after one wrong post) helped me greatly.
> What you seem to be saying is that in order to use a personal computer, you should be an expert.
No. What I am saying is that a) using a device is different from administering it, and b) in order to administer a device you have to be an expert. You don't expect to buy a car and never ever go to a workshop with it, because, you know, you shouldn't have to be an expert to service your car. Expecting to be able to service a PC without being an expert is just as futile.
> Have you noticed that the most successful distributions of Linux are the ones that are most friendly to beginners?
What metric/definition of successful are you using? 
> (third paragraph)
Exactly. Your point being?
Hint: Number of users is not a measure of quality, and hence not necessarily a measure of success.
I would argue that popularity is actually a great indicator of the quality of an OS and your experience with it.
I am what you would consider an "expert" (or at least, I can competently administer a Linux computer), but I don't use Linux distributions for day-to-day computing. The reason is that Linux computers, even ones running "popular distros" such as Ubuntu, are considerably more prone to interoperability foibles and installation/updating problems.
The reason popularity matters, even though I have the technical skill to administer any OS competently, is that popularity is the single best incentive to encourage third parties to iron out the kinks on a particular platform. The reason AMD drivers work better on Windows than they do on Linux is not that Windows is "better" by any of the metrics of quality you espouse; it's that Windows is more popular. The reason Steam for Linux is only supported on Ubuntu is not that Ubuntu is "better"; it's that Ubuntu is the most popular distro. The reason to use popular platforms is that all of the little things that require a half-hour of expert time to work around would have already been solved by the vendor, who recognized that such issues are worth testing for and fixing on a platform with 100 million users, but not on one with 20,000. That's distasteful to some because it doesn't seem meritocratic and it's out of the control of the creators of the OS, but it's a fact of business, and it does affect the quality of an OS in tangible ways.
As for the "expert" thing, the costs really aren't that different for various kinds of users. The reason I care about this stuff is because the opportunity cost of my time spent solving menial issues that would have been solved by the vendor if I were on a more popular platform is too high. The reason my mother cares about this stuff is because the transaction costs of finding someone who will spend the time to figure out what's going on and fix her computer is too high.
> The reason popularity matters, even though I have the technical skill to administer any OS competently, is that popularity is the single best incentive to encourage third parties to iron out the kinks on a particular platform.
I prefer a working platform that forces me to read the manual beforehand over a platform where I don’t have to read the manual but random third parties iron out kinks.
Popularity is a measure of quality, but only of quality with respect to the use cases of the people among whom the product is popular. My computer usage is very different from my mum's, and likely from 95% of the population's; hence, the popularity of a product among 95% of the population is rather irrelevant to me.
If you’re a single person with a tight budget looking for a car, you don’t look at car popularity among large families owning oil fields, but among people similar (in the relevant ways) to yourself.
Hence, popularity among a large user base is an indicator that the OS is optimised for use by a large part of the population. If you find that this is indicative of the quality you’re looking for, fine.
I was specifically using "successful" to mean "number of users." People use what they like, and Ubuntu has gotten a lot more users than, say, Gentoo. Is Ubuntu superior to Gentoo? That's a matter of opinion; I'm sure that there are plenty of applications where people have decided that Gentoo is the perfect fit for what they want to accomplish. But for personal computing needs, people tend to use what is most user-friendly.
I don't mean that it's okay to be completely illiterate in how a computer works, especially if you're using Linux. To take your car example, I don't think that a driver should know how to rebuild his engine. I do expect the driver to know how to change his oil and brake pads and know about scheduled maintenance.
The same is true for computers. A Linux user should know basic bash commands, how to install software, how to research problems, and so on. But he shouldn't be expected to know how to troubleshoot driver problems or know the ins and outs of xorg.conf. That's what Google and the aforementioned benevolent experts writing guides are for.
While I agree that we should do everything we can to be user-friendly and not allow what I'm about to suggest to take the form of "elitism", I think we underplay the importance of helping the user understand the system. I think your typical user should understand SSL before they do online banking and shared libraries before they let their work rely entirely on their computer, in the same sense that I think you should know how to change a tire and change your oil before you buy a car or go on a long road trip.
Yes, they all should just "work out of the box", but in a more and more literal sense we trust a lot of machines with our lives. We ought to understand all of these machines better before we trust them with our lives or vote on the leaders who will regulate their use.
We have abstraction layers precisely so that you can use things without understanding them. It's not just computers, every day we use all kinds of technology without properly knowing how it works.
Of course, sometimes those abstractions are leaky, and we have to change a tire or apply an update. But suggesting that typical users should read up on SSL (and therefore asymmetric cryptography) before they use online banking is as ridiculous as saying that every new driver should study the detailed workings of the internal combustion engine. Someone needs to know how it works, but my mother really doesn't.
I've personally seen people click through the most alarming-looking certificate errors and start entering their bank login credentials. I've driven in cars with people who had put up with that knocking sound in the engine for months because it wasn't getting any worse. If your mother did any of these things, I would be very worried about her financial and physical safety.
You don't have to know how either works to recognise something is wrong with them. My mother, happily, is quite cautious and would go and find someone who understands the machine better. In the computer case, I'd likely be getting a phone call.
The certificate errors are kind of a design failure. Users are so used to clicking through messages they don't really understand, and this is just one more. Even the most alarming certificate errors are usually innocuous, in my experience - someone forgets to renew a certificate, or fails to get a new one for a domain change. That doesn't mean it's OK to ignore them, but we really need systems that don't cry wolf unless there's a really good chance that there's a wolf.
> Even the most alarming certificate errors are usually innocuous, in my experience - someone forgets to renew a certificate, or fails to get a new one for a domain change. That doesn't mean it's OK to ignore them, but we really need systems that don't cry wolf unless there's a really good chance that there's a wolf.
How do you propose to algorithmically determine whether a certificate mismatch caused by a domain change is innocuous? This is a bit like suggesting a "check engine" light that only comes on when there's an immediate risk of an engine fire.
For starters, there are a whole load of conditions that are no less secure than a standard http connection, such as self-signed certificates. But in those cases, the browser throws up a huge and misleading warning about it being insecure. Insecure is the default on the web: we should focus on positive signs that a site is secure.
Secondly, the system should make it harder for the administrator to make mistakes. For instance, the web server could refuse to serve https on a domain that it didn't have a certificate for. Or when a certificate nears its expiry date, the administrator should be getting plenty of reminders about it.
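As a sketch of how cheap that kind of reminder tooling is to build, openssl's -checkend flag does exactly this (the file names here are placeholders, and a real site would use a CA-issued certificate rather than a self-signed one):

```shell
# Generate a throwaway self-signed certificate valid for 90 days:
openssl req -x509 -newkey rsa:2048 -nodes -subj '/CN=example.test' \
    -days 90 -keyout key.pem -out cert.pem 2>/dev/null

# Warn (and exit non-zero) if the cert expires within 30 days --
# trivial to wire into cron or a monitoring system:
if ! openssl x509 -in cert.pem -noout -checkend $((30 * 24 * 3600)); then
    echo "certificate expires within 30 days -- renew it"
fi
```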
Thinking bigger, what if we tied it closer to the sensitive UI? What if only pages loaded securely could show a password field? What if rather than typing credit card numbers into boxes on a webpage, we were used to using a special browser interface that would only light up on secure pages?
"Multi-level insecure operating systems may have special levels for
attack programs; the evil bit MUST be set by default on packets
emanating from programs running at such levels. However, the system
MAY provide an API to allow it to be cleared for non-malicious
activity by users who normally engage in attack behavior."
This should say that the system MUST provide such an API. Otherwise, you can find a situation where the receiver may have to allow evil packets to pass, because they were sent by a user who had no way of setting the evil bit to 0 for non-malicious activity. Kind of defeats the point.
"Fragments that by themselves are dangerous MUST have the evil bit
set. If a packet with the evil bit set is fragmented by an
intermediate router and the fragments themselves are not dangerous,
the evil bit MUST be cleared in the fragments, and MUST be turned
back on in the reassembled packet."
This places an undue burden on intermediate routers, as they now have to parse the packets and determine if a fragment is evil.
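For concreteness: RFC 3514 puts the evil bit in the reserved high-order bit of the 16-bit flags/fragment-offset word of the IPv4 header, so the set/clear dance itself is mechanically trivial; the undue burden is the parsing and judging, not the bit-twiddling:

```shell
# Flags and fragment offset share one 16-bit word; the reserved high
# bit (0x8000) is the evil bit.
word=0x2000                          # MF set, offset 0, evil bit clear

evil=$(( word | 0x8000 ))            # what a sender of attack packets
printf 'evil set:   0x%04X\n' "$evil"        # must do

clear=$(( evil & 0x7FFF ))           # what a router must do on benign
printf 'evil clear: 0x%04X\n' "$clear"       # fragments
```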
"Intermediate systems are sometimes used to launder attack
connections. Packets to such systems that are intended to be relayed
to a target SHOULD have the evil bit set."
These packets are not 'evil' in any technical sense. They are only a request for another computer to send evil packets.
"In networks protected by firewalls, it is axiomatic that all
attackers are on the outside of the firewall. Therefore, hosts
inside the firewall MUST NOT set the evil bit on any packets."
This is just stupid. If I am attempting to send malicious packets from within the firewall, I am already required to set the evil bit. Now I am also required not to. And if I am sending a non-malicious packet, I am already required to set the evil bit to 0.
Also, if I am on a computer behind a firewall and attempt to send malicious packets to a computer outside of my firewall, I am still forbidden by this to set the evil bit. This will force attackers to either break the spec or give up their own firewalls.
"Because NAT [RFC3022] boxes modify packets, they SHOULD set the evil
bit on such packets. "Transparent" http and email proxies SHOULD set
the evil bit on their reply packets to the innocent client host."
I must be misunderstanding this one, because none of those uses seems evil.
"Devices such as firewalls MUST drop all inbound packets that have the
evil bit set. Packets with the evil bit off MUST NOT be dropped.
Dropped packets SHOULD be noted in the appropriate MIB variable."
Dropping packets is a technical inevitability, and a crucial part of TCP's anti-congestion strategy, so requiring that packets with the evil bit off MUST NOT be dropped is unworkable.
Just because the evil bit sounds like a good idea does not mean we should go ahead and implement it without carefully looking at the technical implications of the spec.
The problem with the car analogy, as TallGuyShort points out, is that cars are not capable of having as strong an influence over our lives as computers are. When I think about how this used to be done before, finances were usually handled over the phone, where people got to talk to other people who explained the things they did not understand.
I agree that learning about SSL is probably overkill; but we do need more computer literacy among the general population. Think of it like this: finance is something that is hard, yet people learn about the basics so that they can keep their money safe. Basic computer literacy should have a similar status as a skill that is required to, say, keep your information safe.
I think the analogy is better than you are giving credit. I would say cars can in fact have much stronger influences in our lives than online banking can. For example, cars are much more likely to kill you. It's a much more important skill to recognize when your brake pedal is getting squishy so that it doesn't fail when you are careening down a mountain road, or recognize that an oncoming car is swerving erratically, than it is to realize you are logging in to an insecure phishing site. In the former case you end up dead, in the latter case your checking account gets wiped.
To clarify, when I say "learn about SSL", I mean people should know how to tell whether the connection is "secure" or not, and that "secure" means there is an extremely high probability that the domain they are talking to is who it claims to be, as verified by some people that Google / Microsoft / Mozilla decided were trustworthy. Similarly, I think people should understand that emails can travel unencrypted between sites and are easily spoofed. I'm not saying they need a working knowledge of the RFCs for SSL and SMTP. I agree with the way you worded it - perhaps I made it sound too involved.
edit: another, perhaps more humorous example: how many times have you heard about people getting their Facebook accounts "hacked" or picking up "viruses" on Facebook? I've been asked by multiple people to fix their computers, when the real problem was that they had no idea what they were doing on the Internet.