
The truth about Unix: The user interface is horrid (1981) [pdf] - adamnemecek
http://www.ceri.memphis.edu/people/smalley/ESCI7205F2009/misc_files/The_truth_about_Unix_cleaned.pdf
======
ColinDabritz
This is a specific critique of the usability of the Unix shell, and I think
most of it is still valid. The inside jokes (less is more) and overloaded
meanings of command names are still an issue today, for example. Some of the
visibility has improved by various means, but overall there is still a fair
amount of 'black box' that takes arcane knowledge to peek into.

Naming is one area I appreciated Microsoft's approach in Powershell. They have
command names with a Verb-Noun structure, and full clear words, sometimes
quite long. Then, after establishing clear canonical forms for the commands,
they add a few well chosen aliases for short invocation and memorization.

Of all the critiques though, the Cognitive Engineering objection, that the
system is not well designed to be used by human capacities, is still true
across many platforms, especially in esoteric areas like the command line.

Also interestingly, I am guessing that this is the same Donald A. Norman of
'The Design Of Everyday Things'. It's fascinating to see these ideas in flight
in 1981.

How do we address these things? How does one "redesign Unix" today?

~~~
integraton
_> ...Powershell. They have command names..._

Except that PowerShell commands, 'cmdlets', are really just .NET classes that
run within PowerShell, like Ruby and Python classes, none of which are the
same as working with GNU or BSD utilities and other executables in POSIX
shells. That PowerShell's interface obfuscates its true nature is a huge
violation of honest design, a massive violation of what's described in "Design
of Everyday Things," leads to false comparisons, and misleads people into
thinking it is providing the same functionality as working with executables in
POSIX shells when it's actually providing chaining functionality like that in
other scripting languages.

~~~
UnoriginalGuy
> Except that PowerShell commands, 'cmdlets', are really just .NET classes
> that run within PowerShell, like Ruby and Python classes, none of which are
> the same as working with GNU or BSD utilities and other executables in POSIX
> shells.

That's categorically incorrect and shows a lack of knowledge of both
Powershell and .Net classes. But instead of me showing you that you're wrong,
let me teach you how to prove to yourself that you're wrong.

Open up Powershell, type in

    
    
         ( get-command get-date ).dll     
    

This will find the dll on your system for the get-date cmdlet, but any will do
(C:\Windows\Microsoft.NET\assembly\GAC_MSIL\Microsoft.PowerShell.Commands.Utility
on my box). Now spin up ILSpy or any .Net decompiler.

Let's look at the GetDateCommand class. That's a 400-line class which,
according to you, shouldn't exist (as cmdlets are "really just .Net classes").
In fact this entire DLL shouldn't exist.

But GetDateCommand is one of PS's simplest commands since it wraps DateTime
(in CorLib), so "wait! See!" you say. But what you need to understand is that
Powershell is built on top of .Net; .Net is effectively the "kernel" for
Powershell cmdlets, so just like UNIX commands communicate with the actual
kernel, cmdlets are going to leverage functionality in their "kernel" (.Net)
whenever possible.

My point is that, no, Powershell cmdlets are NOT just .Net classes. However
you CAN use .Net classes directly in Powershell. For example:

    
    
         [System.DateTime]::Now

~~~
integraton
_> cmdlets are NOT just .Net classes_

Yes they are.

The fact that you (and others I've seen here and elsewhere) continue to
believe otherwise further demonstrates how deceptive PowerShell's design is.

What's interesting is that the cmdlet documentation is very clear:

 _" Cmdlets differ from commands in other command-shell environments in the
following ways:_

 _" Cmdlets are instances of .NET Framework classes; they are not stand-alone
executables...._

" _Cmdlets do not generally do their own parsing, error presentation, or
output formatting. Parsing, error presentation, and output formatting are
handled by the Windows PowerShell runtime._ "

[https://msdn.microsoft.com/en-
us/library/ms714395%28v=vs.85%...](https://msdn.microsoft.com/en-
us/library/ms714395%28v=vs.85%29.aspx)

~~~
UnoriginalGuy
> Yes they are.

I literally just spoon-fed you exactly how to go see for yourself how cmdlets
work and how they're distinct from the .Net classes they represent. I
honestly don't know what more I can do.

> The fact that you (and others I've seen here and elsewhere) continue to
> believe otherwise further demonstrates how deceptive PowerShell's design is.

You realise I understand how Powershell works top to bottom, right? Where are
they "deceiving me"? You yourself can go learn about Powershell's
architecture (you have the tools, I've given them to you, and you clearly have
access to the documentation).

> What's interesting is that the cmdlet documentation is very clear:

Wait, is your biggest issue that cmdlets are held within DLLs of classes
instead of standalone binaries? Because based on the snippets you posted, I
can only assume that is your issue.

Wait, but hold on, how many internal commands do most UNIX shells have?
Dozens? Hundreds? Here's a list of them for bash:
[http://www.gnu.org/software/bash/manual/html_node/Bash-Builtins.html](http://www.gnu.org/software/bash/manual/html_node/Bash-Builtins.html)

So why one rule for UNIX and another for Powershell? Why does it matter that
you can store multiple commands (cmdlets) inside of a single DLL?
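For what it's worth, bash itself makes that distinction easy to inspect; a quick check with bash's `type` builtin (assuming a bash shell with grep on the PATH):

```shell
# "type -t" reports how bash will resolve a name:
# builtins run inside the shell process, external commands are files on disk.
type -t cd      # a builtin: it must change the shell's own working directory
type -t grep    # a file: a standalone executable found via $PATH
```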

~~~
weland
I'm only passingly familiar with PowerShell but really, it's Microsoft, in the
bloody documentation, that _literally_ states precisely that cmdlets are
instances of .NET Framework classes:

"Cmdlets differ from commands in other command-shell environments in the
following ways:

 __Cmdlets are instances of .NET Framework classes __; they are not stand-
alone executables.

[https://msdn.microsoft.com/en-
us/library/ms714395%28v=vs.85%...](https://msdn.microsoft.com/en-
us/library/ms714395%28v=vs.85%29.aspx)

We can have an ontological debate about how an instance of an-object-as-a-
concept is different from an instance of an-object-as-memory-content but I
think that's best left to amateur philosophers.

------
userbinator
I saw the majority of his complaints as basically "it's too hard/unfamiliar",
a sentiment I don't really agree with: I've noticed that the steepness of a
piece of software's learning curve tends to be correlated with how powerful
it is - those who give up early might not see that.

Inconsistent command names shouldn't be a difficulty - shell commands are like
any other language, whose vocabulary is quickly learned with repeated use.

_The manual doesn't bother warning against this either, although it does
warn of another, related infelicity: "Beware of 'cat a b > a' and 'cat b a >
a', which destroy the input files before reading them."_

I'd consider it generous to see a mention of shell redirection in cat's
manual, since that's done before it ever gets executed; in general,
interactions between commands can't be exhaustively documented because they
are numerous, and Unix assumes you can put two and two together.
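The failure mode itself is easy to reproduce; a minimal demonstration (assuming GNU cat, which at least warns on stderr):

```shell
# The shell performs the "> a" truncation before cat is even started,
# so a's original contents are lost regardless of what cat does next.
cd "$(mktemp -d)"
printf 'AAA\n' > a
printf 'BBB\n' > b
cat a b > a 2>/dev/null || true   # GNU cat warns: "a: input file is output file"
cat a                             # only b's contents survive
```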

...or maybe I'm just old and accustomed to it, after having used various OSs
over the years, but that's sort of the point: Thanks to this "lowest-common-
denominator" type of interface design, ease-of-use seems to have massively
replaced learning, dissuading and distancing users from having control of
their machines at a time when such control is becoming increasingly important.

~~~
lukeschlather
The result of `cat a b > a` couldn't possibly be the result that the user
intends, so it seems like poor design by definition. I can see how it would be
tricky to handle it properly, but I do think it's possible to handle it
properly without compromising the expressiveness of the shell.

~~~
pdkl95
In modern bash, this case is already protected with the "noclobber" option
(enabled with "set -C" or "set -o noclobber"). It is described in the
"REDIRECTION" section of bash(1), subsection "Redirecting Output".

It fails ">" redirections with an error message when it would clobber an
existing file (you probably wanted to append to the file wiht ">>"). If you
really intended this behavior, the new redirection operator '>|' has the old
behavior.

    
    
        $ set -o noclobber
        $ echo foo > bar
        $ cat bar
        foo
        $ cat bar > bar
        bash: bar: cannot overwrite existing file
        $ echo baz > bar
        bash: bar: cannot overwrite existing file
        $ echo baz >| bar
        $ cat bar
        baz
    

edit: fixed typo

~~~
girvo
That's a nice feature, but it should definitely be turned on as a default
rather than hidden behind an option.

~~~
pdkl95
That would break a very large number of existing scripts that depend on being
able to replace a file with a traditional ">" redirection.

You can set it as the default for your own interactive shell if you want by
adding

    
    
        set -o noclobber
    

to your ~/.bashrc file. It is probably a good idea to add it _after_ the line
that checks whether the shell is being used non-interactively, which probably
looks something like this

    
    
        if [[ $- != *i* ]] ; then
            # Shell is non-interactive
            return
        fi
        # add interactive-shell-only settings here
    

While you're editing .bashrc, you may want to add these other options,
available in modern bash, that I find make the experience a lot nicer.

    
    
        # support new glob patterns like !(foo|bar|...)
        # for example, this moves everything EXCEPT the .h files
        #     mv !(*.h) /some/dir/
        shopt -s extglob
    
        # support ** in globs to match zero-or-more-recursive-directories
        shopt -s globstar
    
        # if you're annoyed at having to type "nohup" to prevent
        # programs from closing when you close the shell window
        shopt -u huponexit

~~~
girvo
Right, it definitely would break a lot of existing scripts, so I understand
why it's not a default. The thing is, though, that really proves the argument
that some parts of Unix have a bad out-of-the-box user experience! I
personally love Unix and find it amazingly powerful, but I've made the same
mistakes as everyone and killed entire boxes, and some simple UX changes
could fix that in the bulk of cases.

Edit: I forgot to add, thanks for the detailed answer!

------
ChuckMcM
I remember this rant! I was chuckling about how superior Tenex (and later
TOPS-20) was to UNIX, and the article in Datamation really spoke to me. No
Esc command completion, no typing a '?' and seeing all the possible
completions, no ^T to give you an update on what your process was currently
doing. UNIX totally sucks at all of that.

Of course once you have trained your fingers you don't think about it any
more. I missed that the first time around. This author does too, of course,
and so did a huge part of the industry. So many things were written with all
sorts of discoverable features and helper stuff, and the truth was that a lot
of them couldn't get out of your way once your fingers knew what to type! You
can get so far, and then you hit a wall: you have to walk through the
"feature" that is no longer useful to you as an experienced user, because
there is no way to get rid of it.

That is when you start to appreciate the minimal design. And loathe the very
things that this article holds up as grand design truths. That is the secret
of the success of the simple shell syntax.

~~~
juliangregorian
One of the features of the command line ftp client is that (most of) its
internal commands are full words; but as long as they remain unambiguous, you
can abbreviate them as much as you want. Why more systems don't adopt this
approach, I don't understand.
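The scheme is simple to sketch in shell; here is a toy unambiguous-prefix resolver over a hypothetical command table (not ftp's actual one):

```shell
# Resolve a prefix against a list of commands, ftp-style:
# a unique prefix expands to the full command; otherwise complain.
resolve() {
    prefix=$1; shift
    matches=""
    for cmd in "$@"; do
        case $cmd in
            "$prefix"*) matches="$matches $cmd" ;;
        esac
    done
    set -- $matches     # re-split the accumulated matches into arguments
    if [ $# -eq 1 ]; then
        printf '%s\n' "$1"
    elif [ $# -eq 0 ]; then
        printf '?Invalid command\n' >&2; return 1
    else
        printf '?Ambiguous command\n' >&2; return 1
    fi
}

resolve del delete debug dir                # unique prefix: expands to "delete"
resolve d   delete debug dir 2>&1 || true   # "d" is ambiguous in this table
```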

~~~
ChuckMcM
The downside of this is that later adding a command which forces another
letter to be typed breaks previous scripts and finger memory.

~~~
juliangregorian
It's an inconvenience, but it's hardly an exotic situation. If it breaks
backwards compatibility just save it for a major version release. It wouldn't
even do the wrong thing, just complain about a certain command being
ambiguous, and you'd have to fix it.

------
emmelaich
Trivia: _cat_ is actually an abbreviation of _catenate_, a now lesser-used
synonym for concatenate.

Unix is of course horrible for beginning users but if you think you can do
better (as in fixing the problems without doing collateral damage) you are
_probably_ wrong.

Another aside: The UNIX Hater's Handbook should be read by every Unix sysadmin
and programmer.

~~~
snogglethorpe
The problem with the UNIX Hater's Handbook is that it's 10% spot-on
observations about real problems with Unix(-type) systems, and 90% pointless
whining and regurgitating of ancient tribal positions. Unfortunately a lot of
the people posting complaints were playing to the crowd more than anything
else.

Great idea for a mailing list ("let off some steam! complain to like-minded
souls!"), maybe not such a good idea for a book...

~~~
stonogo
The linked article in question is the same thing. The author complains it
won't run on his PDP-11.

I beg you, however, to reconsider "not such a good idea for a book." It is a
_perfect_ idea for a book. It is precisely the kind of information that we're
going to lose when Google shuts down Groups, or when the last fidonet archive
goes offline. Books like the Unix Hater's Handbook are the only historical
context we have for the rise of UNIX, and the death of TWENEX and the LispM.

------
vezzy-fnord
cat(1)'s versatility is mostly a side effect of the essential nature of file
descriptors, which I believe is good.

Criticisms of command, standard library and FHS naming are evidently quite
ancient. I have little sympathy for them, given that other mainstream systems
are usually worse. However, one can see GoboLinux and NixOS for alternative
takes. I also will grant that creat(2) is embarrassing.

Modern versions of rm(1) (or at least GNU rm and I believe Solaris rm, as
well) have a default policy of refusing certain glob patterns without explicit
approval so as to prevent user error. I do not favor this much in principle,
but this criticism is now dated. That you aren't asked if you're sure every
time you do an operation that is potentially dangerous (which can be so many
things) can be seen as a good thing.

Configuring your shell isn't particularly elegant, though understanding
general distinctions like sessions isn't all that difficult.

ed(1) is seldom used these days, obviously. The functionality of fgrep(1) and
egrep(1) has largely been merged into more monolithic grep(1)
implementations, though the former two remain as aliases.
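For example, the old split survives as option flags on a single binary (both GNU and BSD grep support these):

```shell
# grep -F is the old fgrep (fixed strings); grep -E is the old egrep (EREs)
printf 'a.b\naxb\n' | grep -F 'a.b'   # literal match: only the "a.b" line
printf 'a.b\naxb\n' | grep -E 'a.b'   # regex match: "." is a wildcard, both lines
```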

Overall, modern Unix-like UIs are actually pretty good as far as these things
go. It could be much better, but whenever people try to fix things they're
quite ironically met with resistance from people too conformant to the status
quo, and the better ways are inevitably relegated to either academic
curiosities or cult/heterodox practices that are espoused by a loyal few.

On the other hand, whenever a new solution _does_ make it, it's rarely truly
better because it's also based on some paradigm that is relatively well
ingrained in the computing popular culture. When someone criticizes the FHS
these days, it's almost always a concealed way of saying "Why isn't this like
the directory structure of Windows or OS X?", rather than any interest in
having the most ergonomic solution.

------
pekk
As a cautionary note, this article isn't really about anything people are
using today. "Unix" doesn't today mean anything very similar to what it meant
in 1981. The Commodore PET, Apple II and MS-DOS 1.10 of that time did not have
great user interfaces either. And a lot has happened since then.

------
ape4
Unix is the worst operating system. Except for all the others.

~~~
botnet2366
what is the most effective program for hacking..

------
redwards510
This reminds me of that great essay by Neal Stephenson called "In the
Beginning was the Command Line".[1] In it, he compares Unix to an industrial
strength screwdriver called the "Hole Hawg".

"But I never blamed the Hole Hawg; I blamed myself. The Hole Hawg is dangerous
because it does exactly what you tell it to. It is not bound by the physical
limitations that are inherent in a cheap drill, and neither is it limited by
safety interlocks that might be built into a homeowner's product by a
liability-conscious manufacturer. The danger lies not in the machine itself
but in the user's failure to envision the full consequences of the
instructions he gives to it."

[1]
[http://www.cryptonomicon.com/command.zip](http://www.cryptonomicon.com/command.zip)

~~~
sp332
Plain text version
[http://garote.bdmonkeys.net/commandline/index.html](http://garote.bdmonkeys.net/commandline/index.html)

------
alexbecker
> [Casual users] are apt to expect more intelligence from the system than the
> designer knows is there.

A million-fold increase in processing power later, and this is still one of
the biggest problems for novice programmers or others just starting to look
under the hood.

------
mjcohen
Looks like a predecessor of "The Unix-Haters Handbook":

[http://web.mit.edu/~simsong/www/ugh.pdf](http://web.mit.edu/~simsong/www/ugh.pdf)

Actually, I learned a lot about Unix when I read this in the 1980s.

------
verelo
The reality is Unix isn't easy to get started in, but there are lots of
resources once you know how to find them.

I learned by having a close friend and co-worker guide me through the early
stages of using Unix; eventually I knew how to find solutions to my own
problems (it was more a matter of where to look and some basic terminology).

The different flavours of Unix make this a bit of a double-edged sword: while
solutions are similar on most platforms, the location of config files and the
different ways to install packages are often well explained, but not always
for the platform you're working with.

------
emmelaich
Contribute to _libexplain_ if you want to help. Created by the late Peter
Miller.

[http://libexplain.sourceforge.net/](http://libexplain.sourceforge.net/)

------
thrownaway122
And still today cut uses -d for the delimiter and awk uses -F (and they'd
better not change this, otherwise many of my scripts will break)...

Is there a logic behind how Unix utilities' options work, other than
backwards compatibility?
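A concrete case of the inconsistency; the same field extraction spelled two ways:

```shell
# cut names its field separator -d; awk names the same idea -F
echo 'a:b:c' | cut -d: -f2             # prints "b"
echo 'a:b:c' | awk -F: '{print $2}'    # also prints "b"
```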

------
wz1000
Question: If you had to redesign Unix from scratch(ignoring hardware
limitations), how would you do it?

Personally, I feel an ELisp like approach would be much better as a way for
interacting with computers.

------
jiballer
I just find it funny that the favicon is the old Sun Microsystems logo.

~~~
justincormack
You see quite a few of those around the old corners of the web.

------
MichaelCrawford
    
    
        $ cp foo /real/long/path/name/to/destination/
        /real/long/path/name/to/destination/: no such file or directory
    

Why doesn't it tell me which component was wrong?

~~~
pwg
What version of cp is this?

Because I get this:

    
    
        $ cp foo /real/long/path/name/to/destination/ /real/long/path/name/to/destination/
        cp: target '/real/long/path/name/to/destination/' is not a directory
    

With this version:

    
    
        $ cp --version 
        cp (GNU coreutils) 8.21
        ...

~~~
heinrich5991
They complained that it doesn't show

    
    
        cp: target '/real/' is not a directory
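The diagnostic being asked for upthread is straightforward to sketch in shell; `first_missing` below is a hypothetical helper (not a real cp feature) that walks an absolute path and names the first component that doesn't exist:

```shell
# Walk an absolute path component by component and report the first
# missing one, the way the commenters wish cp's error message worked.
first_missing() {
    path=$1 cur=""
    oldifs=$IFS; IFS=/
    set -f                          # no globbing while splitting on "/"
    for part in $path; do
        [ -n "$part" ] || continue  # skip empty fields from leading "/"
        cur="$cur/$part"
        if [ ! -e "$cur" ]; then
            printf '%s: no such file or directory\n' "$cur"
            IFS=$oldifs; set +f
            return 1
        fi
    done
    IFS=$oldifs; set +f
}

first_missing /real/long/path/name/to/destination || true
```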

