
The Unix-Haters Handbook (1994) [pdf] - arpa
http://web.mit.edu/~simsong/www/ugh.pdf
======
dsmithatx
When I read this 20 years ago I would never have believed I'd be typing this
on a Mac laptop running yet another Unix variant. This line is now so funny...

As for me? I switched to the Mac. No more grep, no more piping, no more SED
scripts. Just a simple, elegant life: “Your application has unexpectedly quit
due to error number –1. OK?”

~~~
dahart
Ha! I switched to Mac so I could have pipes, grep and bash. I personally
believe a big reason for the Mac resurgence has to do with the switch to UNIX.
I was on Windows using Cygwin for years before anyone I knew was using a Mac.
Then OSX came out, and all of a sudden all the academics I knew switched to
Mac, and a couple years later most of the professional programmers I knew had
switched.

The great thing about a Mac is that I get to have my user level UNIX without
having to know anything about the system level UNIX. I don't need to be a
sysadmin to run the thing. I get grep and pipes without having to know all the
crazy commands to tweak network settings and display timings and boot
scripts. When OSX came out, that was the state of Linux, you couldn't be just
a user of it, you had to know way too much to get it running.

~~~
andrewbinstock
> I personally believe a big reason for the Mac resurgence

Mac resurgence is more perception than reality. In 2016, Apple sold ~18.5M
Macs, down from each of the previous two years [1] and off more than 10% from
last year.

Granted, Mac numbers are falling at a slower pace than the PC market as a
whole, but "resurgence" is probably an over-characterization.

[1] [http://finance.yahoo.com/news/state-apples-device-sales-1940...](http://finance.yahoo.com/news/state-apples-device-sales-194040374.html)

~~~
dahart
The other comments here are absolutely correct -- I was referring to OS9 vs
OSX, on a 10-20 year time frame. OSX ushered in new waves of adoption that
outshine previous numbers by an order of magnitude.

Given that Mac has gone from well under half a percent market share to over 7%
today, I'd say "resurgence" is an under-characterization, relative to what I
was talking about. ;) OS9 never had the market share that OSX has, so it's
less coming back, and more dominating like it never did before.

------
Philipp__
I like to pose myself crazy questions and think about them. For example: what
would something have to look like to make people go "oh, this looks good
enough to replace UNIX"?

Don't get me wrong, I am a big fan of UNIX, but I hope I will be alive (though
I doubt it) when we see some new thing that makes UNIX feel dated. Now, some
of you might jump in and say "Oh, but UNIX already feels dated", and that
would make a conversation on its own, but I think people say that more because
they are bored with UNIX, or because they dislike certain segments of it.

And what breaks my heart is the general disinterest in operating systems among
young developers/students (I am a student too, but I find these things the
most interesting of all university courses). I see very few people doing OS
work today. I wasn't there to see how it was in the 80s & 90s, but from what
I've read, you had much, much more choice, though the quality was debatable.
Why do we always consider operating systems a "solved" thing? Is it because of
the way the Von Neumann architecture works: we tend to abstract the computer
as an entity in the way that gave us UNIX, and we won't be able to discover
and make something different but as capable as UNIX without changing our way
of thinking about what a computer is and how it works? Did we get used to
computers as they are, especially newer generations, taking things for granted
and just going forward with what they inherited?

~~~
gingerbread-man
Google's Fuschia OS project is intriguing. It's open source, but I haven't
been able to find any whitepapers or conference talks about its design or
Google's plans for it. The speculation is that they wanted an ultra-
lightweight OS for future low-latency augmented-reality applications.

([https://news.ycombinator.com/item?id=12271354](https://news.ycombinator.com/item?id=12271354))

~~~
magentaboy
For decades, I have challenged people with a dollar bet that they cannot
correctly spell "fuchsia" given five tries.

Everybody thinks they can spell it, but I have never lost the dollar.
Sometimes I pull out the same bet six months later and still win it.

~~~
zrth
Don't bet against German speakers. "Fuchs" means "fox" in German, so "fuchsia"
basically means/sounds like "fox-ia" to them. I'm guessing they'd be able to
spell it correctly, since it's hard for them to misspell.

------
open-source-ux
It's interesting that no one bats an eye at the thought of using a decades-old
operating system - it's "mature", "battle tested", "proven over time" or some
other phrase.

On the other hand, programming languages that are decades old are looked upon
as antiquated and not fit for modern problems, so we create new languages and
discard the old ones, including forgetting useful ideas those languages
contained. Sometimes, we come back to some of those ideas. So a dynamic
language decides that types are in fact quite useful, and a single exe file is
so much simpler to distribute than endless tiny script files, and maybe speed
does matter after all.

Not passing judgement, just saying that's how it is. And it is rather odd
isn't it?

~~~
tbrake
I think most programming languages that are decades old aren't looked at as
'not fit for modern problems' so much as 'inefficient for modern*
development'.

Is it possible to do the same things in C that we can in Ruby, PHP or Node?
Absolutely. But can it be written in the same time frame?

While I'm sure there are HNers out there who can - or who at least claim they
can ;) - let's ask ourselves whether that's really broadly true outside of
trivial example applications. I'd wager 'no'.

Those higher-level languages might 'forget' something from the past and
'rediscover' it, but I feel that's a very minor sub-plot in the larger story
of the usefulness of their abstractions and their ability to represent more
powerful ideas in fewer lines of code.

* - read 'web'. Old and low level languages are still obviously used in a wide variety of projects and industries.

~~~
di4na
Meanwhile, erlang...

------
raverbashing
Unix is weird because it evolved organically and without a unified direction.
But it remains because power and familiarity beat user experience.

Yes, the "pure" Unix tools are awful, GNU improved on their usability a lot.
But they're still a simple command that does something.

Except Autotools. Those should burn in eternal damnation.

~~~
mi100hael
"Autotools is the worst form of build system, except for all those other forms
that have been tried from time to time." -Churchill, probably

In all seriousness, what's your preferred alternative? Seems like Autotools is
a pain, but it gets the job done and is widely available. I haven't found a
build system for C projects that is:

\- Less complicated

\- Available from default package repos so others who clone don't have to
track down some esoteric package themselves

\- Free software that runs standalone on a shell, not in an IDE or whatever

~~~
jff
You should be able to build well-written code on a variety of modern systems
using nothing more than a Makefile, or even just a shell script. See plan9port
for an example of the latter.

~~~
ori_b
Plan9port ships with, and uses, mk.

~~~
jff
You're right, but first the INSTALL script makes a few customizations based on
what OS you're running (Solaris, OS X, Linux, *BSD), then bootstraps mk and
finally runs mk on all the application source.

The script takes into account differences between operating systems, without
including thousands of lines of weird little leftovers from 30 years ago, when
you needed to check whether you were actually running on a 16-bit Data General
machine or whatever (I'm looking at you, autoconf).

------
dwheeler
Funny book, I have it in my hands (with the barf bag!). The anti-forward by
Dennis Ritchie is awesome all by itself.

Some of it is long obsolete. For example, Usenet/NNTP is rarely used today.
And many of the specific implementation problems they mention are long-fixed
on modern systems.

Some of the problems they note are still valid. Some of the challenges of
dealing with filenames are absolutely still true; because filenames are
sequences of bytes (NOT sequences of characters), and allow stuff like leading
dash, control characters, and non-characters, you can have a lot of problems.
See the stuff on pages 168-171. I talked about this in
[https://www.dwheeler.com/essays/fixing-unix-linux-filenames....](https://www.dwheeler.com/essays/fixing-unix-linux-filenames.html)
and [https://www.dwheeler.com/essays/filenames-in-shell.html](https://www.dwheeler.com/essays/filenames-in-shell.html)
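
The filenames-are-bytes problem is easy to demonstrate. Here's a minimal
Python sketch (directory and names are made up for illustration) showing that
Unix happily accepts names that break naive scripts:

```python
import os
import tempfile

# Unix filenames are byte sequences: anything except '/' and NUL is legal,
# including a leading dash and an embedded newline.
d = tempfile.mkdtemp()
for name in ("-rf", "innocent\nlooking.txt"):
    # Prefixing the directory keeps open() from ever seeing a bare "-rf".
    open(os.path.join(d, name), "w").close()

# A naive shell loop doing `rm $f` would treat "-rf" as options;
# robust scripts must write `rm -- "$f"` or `rm ./-rf` instead.
for entry in sorted(os.listdir(d)):
    print(repr(entry))
```

This is why defensive shell scripting conventions like the `--` end-of-options
marker and `find -print0` exist at all.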

That said, there are _reasons_ that Unix-like systems took over everything.
Many of their complaints are from lack of consistency. But some loss of
consistency is inevitable when you have a marketplace of many people with many
ideas. Many of the other systems they remember fondly were often tightly
controlled by a small number of people - they were more consistent, sure, but
they were also slow to implement newer ideas. When you're running a race, the
one who runs faster usually wins.

~~~
pvg
Stop saying that! It's a foreword. Like 'foreplay' but with 'word'.

------
omginternets
While entertaining, I'm left with the following question: if not UNIX, then
what?

Are there any successful non-UNIX-y OSes that are worth checking out?

I may embarrass myself here, but I was under the impression that BSD, Plan 9,
Solaris and HP-UX were all UNIX-y ...

~~~
grabcocque
Basically only two paradigms survived the 90s. The Unix way (via Linux, BSD,
Darwin) and the VMS way (via Windows NT). Everything else is either ultra-
niche, dying, a mainframe so ancient and terrifying nobody will go near it, or
only of marginal historical import.

Bear in mind that when this book was written, the average desktop PC was
running MS-DOS, and possibly Windows 3.1 if it was new and powerful enough.

What UNIX is being compared to is the other mainframe OSes of the 1980s, from
DEC's VAX/VMS to IBM's OS/360 and OS/400.

~~~
pjmlp
The Xerox way also influenced Windows, OS X, iOS and Android, and to a certain
extent ChromeOS - in regards to IDEs, frameworks, and programming-language
culture.

~~~
waz0wski
There's a good book about this, called 'Dealers of Lightning: Xerox PARC and
the Dawn of the Computer Age'

[http://a.co/9Gp3tQ7](http://a.co/9Gp3tQ7)

------
krylon
I think quite a bit of this book is outdated by now, but it's still a fun
read, even as a Unix person.

Especially the chapter on NFS was a fun read. When I first set up my home
server, I had my desktop lock up completely a few times when the server became
unavailable, because of NFS. (Then I found out one can mount NFS shares in
interruptible mode so the requests will eventually time out... but it was so
annoying up to that point.)
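
For anyone hitting the same hang today, the relevant mount options look
roughly like this (server name and paths are placeholders, and exact behavior
depends on your kernel). On older Linux kernels the `intr` option made NFS
calls interruptible; modern kernels ignore `intr` (a stuck process can always
be killed with SIGKILL), and `soft` with a timeout bounds how long requests
retry before failing:

```shell
# Placeholder server/export/mountpoint. "hard" (the default) retries
# forever and can wedge clients when the server goes away; "soft" gives
# up after retrans retries of timeo deciseconds each. Note that soft
# mounts can surface I/O errors to applications, so use with care.
mount -t nfs -o soft,timeo=100,retrans=3 server:/export /mnt/nfs
```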

~~~
disconnected
Many parts of the book are outdated, and some parts are just pedantic and
outright trollish, but other parts are still painfully up to date.

For example, mandatory file locking. I bumped into this in a project once.

We wanted to prevent concurrent read/write access to a particular file (a
serial device, actually). Unfortunately there was no simple and reliable way
to do this on Linux.

The "traditional way" of doing this is via lock files: you create a file
"somewhere", and other applications are supposed to check if this file exists
before operating on the locked file.

But you have three problems here: 1) the location of the lock files is not
standard 2) there are race conditions with checking for the existence of the
lock files and creating lock files and, most critically, 3) applications can
still ignore the lock file at their leisure.

There is also flock() (not mentioned in the book, IIRC, I think it was added
to Linux at a later date). But it still blows because it only solves (1) and
(2). You can flock() all you want, if applications don't care for it, they can
still do whatever they want.

We just ended up accepting our losses, used flock() for our applications, and
hoped for the best.
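
Point (3) is easy to show. Here's a minimal Python sketch of flock()'s
advisory nature (the file path is a throwaway temp file): a process that
simply never calls flock() writes straight through someone else's "exclusive"
lock, while a cooperating process sees the lock and backs off:

```python
import fcntl
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "shared.dat")

# Cooperating holder: takes an exclusive advisory lock.
holder = open(path, "w")
fcntl.flock(holder.fileno(), fcntl.LOCK_EX)

# Non-cooperating writer: never calls flock(), so the kernel lets it
# truncate and write the "locked" file without complaint.
rogue = open(path, "w")
rogue.write("ignored your lock\n")
rogue.close()

# A second *cooperating* opener does see the lock and fails fast.
probe = open(path, "r")
try:
    fcntl.flock(probe.fileno(), fcntl.LOCK_EX | fcntl.LOCK_NB)
    probe_got_lock = True
except BlockingIOError:
    probe_got_lock = False

print(probe_got_lock)  # False: flock() only restrains programs that opt in
```

For what it's worth, Linux's mandatory-locking mode (the `mand` mount option)
was always considered unreliable and has been removed from recent kernels
entirely, so advisory locking plus hope really is the state of the art.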

------
andai
Any thoughts on UNIX security? I've been wondering what the most secure OS is
for a long time, and the answer seems to be "systems that stopped being
developed before you were born."

~~~
Ajedi32
Android, iOS, and Chrome OS all have security models that are fundamentally
superior to what you get on a base *nix system: Fine-grained, application-
level permissions which the user can grant or deny on a case-by-case basis.

And before you say that all those operating systems are based on Unix: that's
true, but it's mostly an implementation detail. There's no reason, for
example, that Chrome OS couldn't be based on an entirely different kernel and
still have the same permissions model, and most Unix-like systems don't have
fine-grained permissions like this out of the box.

~~~
FroggyMan
Windows Phone 7 did fine-grained security much earlier, and with full device
encryption on by default.

It's now also standard in WinRT/UWP on Windows 8/10.

Android was very late to the party regarding fine-grained security. Google
knowingly put billions of consumers at risk for a long time. And even today
its security is a nightmare.

------
shakencrew
Previous discussion on Hacker News:
[https://news.ycombinator.com/item?id=7726115](https://news.ycombinator.com/item?id=7726115)

~~~
dang
The current post is also a follow-up to
[https://news.ycombinator.com/item?id=13777077](https://news.ycombinator.com/item?id=13777077)
from yesterday.

------
bb88
I just pulled the book off my shelf and it still has the UNIX barf bag in the
back.

[https://goo.gl/photos/gHvGwHpC3z5KxLv59](https://goo.gl/photos/gHvGwHpC3z5KxLv59)

(edited link so that people don't need to login to google to see it.)

------
fnordfnordfnord
see also: The Unix haters' UNIXUX server. It's pretty funny.
[http://www.art.net/~hopkins/Don/unix-haters/login.html](http://www.art.net/~hopkins/Don/unix-haters/login.html)

~~~
DonHopkins
It was more realistic back when the <BLINK> tag worked:

    
    
        <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">
        <HTML>
        <HEAD>
           <TITLE>UNIXUX: Click on the cursor.</TITLE>
           <META NAME="Author" CONTENT="Don Hopkins">
           <META NAME="GENERATOR" CONTENT="User-Agent: Mozilla/3.0Gold (Macintosh; I; PPC)">
        </HEAD>
        <BODY TEXT="#7CFE5A" BGCOLOR="#191919" LINK="#FF0000" VLINK="#FF3333" ALINK="#00FEFA">
        <P><B><TT><FONT SIZE=+1>UNIX HATERS Release 2.0 (unixux)</FONT></TT></B></P>
        <P><B><TT><FONT SIZE=+1>login: <BLINK><A HREF="password.html">_</A></BLINK></FONT></TT></B></P>
        </BODY>
        </HTML>

~~~
jcl
CSS to the rescue! :)
[https://jsfiddle.net/54bpngL3/](https://jsfiddle.net/54bpngL3/)

------
ebbv
I remember when this was new and I've never found it as amusing as other
people seem to. It's super easy to shit on software that's almost 50 years
old. But where's the replacement?

Windows is still worse as a web server platform than various Unix descendants
(Linux and BSD) despite Microsoft's best efforts, and it's not free.

Unix has its flaws but being a "Unix hater" is just dumb.

~~~
Gracana
UHH absolutely does not attack UNIX for being old. See "Who We Are" in the
preface. The authors worked with systems they felt were superior in many ways,
until industry/economic forces pushed them to UNIX.

"This book is about people who are in abusive relationships with UNIX, woven
around the threads in the UNIX-HATERS mailing list."

They're not a bunch of angry outsiders, they're UNIX insiders who reject the
idea of "Worse is Better." Worse is Worse, but we're stuck with it.

~~~
xorblurb
"Worse is Worse" ignores the fact that purity in a vacuum is very pure and
very useless. There is (extreme) value in network effects, and network effects
also exist in technology, not only in social networks... (and, well, in
technical fields the two are related). It's also extremely easy to keep only
good memories of extinct systems and to stay focused on past defects that have
for the most part been corrected, or at least compensated for, by dozens of
years of refinement in the systems still alive.

I don't deny that some technical choices are better than others in some
contexts, or even in some cases in every respect. But e.g. the PC loser-ing
problem has in practice hardly caused much trouble (and is even, in some
cases, the superior approach for userspace), and judging systems by that kind
of detail is like a hypothetical crazy mechanical engineer refusing to buy a
car because of an obscure technical detail in the engine - one that has been
made to work perfectly well by accounting for its peculiarities, but that he
dislikes on principle and in an absolute way.

I'm also very aware of the enormously costly impact of some de facto industry
choices. For example: the C language and its derivatives are shit and have
arguably cost humankind an absurd amount -- compared to a hypothetical world
where safer languages had been used. Does that mean this is a case of "Worse
is Better"? Maybe, but then what would it mean? We have processors with MMUs,
and even IOMMUs are becoming more and more common. Given some existing
software stacks this does not need to be an intrinsic requirement -- but IMO
it is vastly better. Can Worse sometimes win, and Better also win on other
subjects? What should we deduce from that, then? What insight can we extract?
If none, this whole categorization is meaningless.

So in some cases, "Worse" is actually just better, the "worse" in question
existing only in the eyes of the opponent. Or you need to be extremely precise
about the criteria you are using.

I prefer to consider all the advantages and drawbacks of whatever I'm talking
about, and the situation I want to apply it to, and to avoid binary
categorizations when they have no predictive value.

------
valarauca1
This is a good read. Especially if you use Linux/BSD/OSX on a fairly regular
basis.

While it is mostly long Usenet rants about the _failure_ of Unix, there are
grains of truth buried in the 80's-esque proto-shit-posting.

------
kowdermeister
I've just found this subreddit looking for what else if not UNIX:

[https://www.reddit.com/r/EsotericOS/](https://www.reddit.com/r/EsotericOS/)

------
mcguire
" _We have tried to avoid paragraph-length footnotes in this book, but X has
defeated us by switching the meaning of client and server. In all other client
/server relation- ships, the server is the remote machine that runs the
application (i.e., the server pro- vides services, such a database service or
computation service). For some perverse reason that’s better left to the
imagination, X insists on calling the program running on the remote machine
“the client.” This program displays its windows on the “window server.” We’re
going to follow X terminology when discussing graphical client/servers. So
when you see “client” think “the remote machine where the appli- cation is
running,” and when you see “server” think “the local machine that dis- plays
output and accepts user input.”_"

Yes, Garfinkel, et al., don't know what a "client" and "server" are.

~~~
DonHopkins
I'm the "et al." who wrote that chapter, and I don't understand what you're
trying to say.

What do you think "client" and "server" mean, and what do you think is wrong
with that paragraph-length footnote (besides its length)?

Could you please try to rewrite it more accurately or in fewer words?

~~~
mcguire
Hi, Al!

The X use of "client" and "server" is correct. The "remote machine"/"local
machine" thing isn't.

See my other reply:
[https://news.ycombinator.com/item?id=13783802](https://news.ycombinator.com/item?id=13783802)

Edit: Forgot to refer to you as Al. (Dang it, this is why I can't have nice
jokes.) Apologies!

------
moomin
I love this book. Yes, it's now out of date, but it's a) still funny and b) a
great book for learning to think about the design of systems.

The second point is sadly neglected. If you read it and your principal
reaction is "Unix isn't like that anymore" you've completely missed the point.

------
INTPenis
Hate all you want but I regularly have moments where I solve something using a
string of commands and think to myself; gosh darn I love Unix!

It's not really Unix, sometimes it's Macintosh with GNU tools installed but
mostly it's CentOS or Debian servers where all this magic takes place.

Sure, I need to keep a lot of stuff in that thick head of mine to do magic,
but the joy of stringing together a series of commands to solve a problem that
my Windows-using co-workers would have to spend license money to solve is just
amazing.

Hating Linux is definitely fair if you're looking at it from the perspective
of universal user-friendliness. But for true power users - the programmers,
the scientists and the sysadmins - I can't picture life without it.

------
_pmf_
The question is: what would the ideal non-POSIX API look like?

------
gingerbread-man
For all of the usability flaws pointed out here, to what would you ascribe
Unix's proliferation and modern ubiquity?

\- Giving it away to universities?

\- The C programming language?

What am I missing?

~~~
falsedan
The common explanation is "worse is better" (see _The Rise of Worse is Better_
[0])

[0]:
[http://dreamsongs.com/RiseOfWorseIsBetter.html](http://dreamsongs.com/RiseOfWorseIsBetter.html)

~~~
jasonsync
tldr.

The lesson to be learned from this is that it is often undesirable to go for
the right thing first. It is better to get half of the right thing available
so that it spreads like a virus. Once people are hooked on it, take the time
to improve it to 90% of the right thing.

------
gerbilly
My copy came with a barf bag.

------
verifex
Maybe there really wasn't such a thing as a "good" operating system long ago,
just one that you could tolerate more than another. I remember using OS/2 Warp
as a kid and thinking it was pretty neat, although I couldn't really do much
with it, since I wasn't interested in writing my own apps for OS/2.

------
appleflaxen
can anyone give a first-hand historical perspective on this, regarding what
was going on and how this was seen in 1994?

~~~
ddebernardy
It was written in an age when there were still plenty of computer options to
choose from.

On the consumer and small business end, there were DOS/Windows boxes,
WindowsNT boxes, and Macs, of course. But OS/2 was still a thing, Win95 was
around the corner, and, while on the decline by 1994, Atari and Amiga devices
were still somewhat sensible options for e.g. music or gaming a few years
earlier.

On the corporate end, and more saliently for this specific book, you still had
plenty of mainframes and such around, and an equally rich variety of options.
Some of these computers were called Lisp machines, and built around Lisp
rather than Unix/C. They had started to lose traction after the onset of AI
Winter:

[https://en.wikipedia.org/wiki/Lisp_machine](https://en.wikipedia.org/wiki/Lisp_machine)

[https://en.wikipedia.org/wiki/AI_winter](https://en.wikipedia.org/wiki/AI_winter)

The authors of the book basically are Lisp aficionados who mourn the good old
days when they could get by using elegant Lisp machines instead of clunky,
user hostile Unix boxes.

The book itself is an entertaining read insofar as I can remember. (My memory
might be playing tricks on me though; I read it 10-15 years ago.)

The authors obviously cherry pick to maximize Unix bashing. The authors make a
number of very valid points - some of which still apply today. The tone is
usually playful. But then, at times, contempt with a pinch of self-importance
emerges, and you're left wondering why you're bothering to continue reading a
300+ page long rant... only to realize that the points made are so obnoxiously
valid (or, at least, were when I read it) that you continue.

Dennis Ritchie's anti-foreword returns the favor, too:

> Your book is a pudding stuffed with apposite observations, many well-
> conceived. Like excrement, it contains enough undigested nuggets of
> nutrition to sustain life for some. But it is not a tasty pie: it reeks too
> much of contempt and of envy.

~~~
appleflaxen
I saw the anti-foreword, but missed that quote. Thanks for pointing it out,
and for the thoughtful blurb. Great context!

------
fdupoo
Make sure to read the preface. It is the best preface I've ever read. Pay
special attention to who wrote it.

~~~
scandox
I cannot find a distinct credit for the Preface. Where is it?

~~~
falsedan
I assume the OP was referring to the Anti-Foreword by Dennis Ritchie.

------
ryanmarsh
OMG the anti-foreword by Dennis Ritchie. Pure gold.

~~~
pawadu
_" Your judgments are not keen, they are intoxicated by metaphor. In the
Preface you suffer first from heat, lice, and malnourishment, then become
prisoners in a Gulag."_

Harsh words from Dennis, RIP.

------
kensai
I have to admit, I laughed at this: "C++ Is to C as Lung Cancer Is to Lung" :D

------
0x264
This book is one of my top three books on how to become a good software dev :)

~~~
zingplex
Could you name the other two?

~~~
fao_
They're for good developers to know and everyone else to find out /jk

Seriously though, I think almost everyone has a pick of different books that
influenced them the most and helped them become the best developer they could
be. While those often do come from a single set of books, it's kind of
fruitless to buy books you might not find interesting on the promise that
"they will make you a good developer". I have a fair number of books I find
difficult to read because of the style, etc., that I bought on that promise,
and none of them improved my programming ability significantly more than a
bunch of coding plus an internship has.

[What I would recommend is finding authors that you like reading and/or tend
to get a lot from]

------
grabcocque
If I ever need to feel good about myself as a software developer, I only ever
need to read this book's chapter about the X window system.

~~~
tragomaskhalos
An office I worked in in the early '90s had a shelf with a ten(?)-volume set
of X Windows books - and each individual book was a thick bugger. I'd only had
fleeting experience with it, but I remember thinking "how complex can this
thing _be_?!"

~~~
pmontra
I remember them. They were both the guides and the references for the whole
X11 ecosystem. I guess many modern pieces of software would be that large if
properly typeset and printed on paper. Luckily they now fit on the web, spread
across thousands of blog posts, Stack Overflow, and more.

