
A Requiem for a Dying Operating System (1994) - aphrax
https://user.eng.umd.edu/~blj/funny/requium.html
======
reagent_finder
Pretty much every point raised in this post(?) is correct, current, and
relevant even 26 years later.

POSIX is a monolith and really deserves to be improved. It's been around
forever, yes. It will probably keep on being around forever, yes.

    
    
        Take the tar command (please!), which is already a nightmare where lower-case `a' means "check first" and upper-case `A' means "delete all my disk files without asking" (or something like that --- I may not have the details exactly right). In some versions of tar these meanings are reversed. This is a virtue?
    

Raise your hand if you've never broken grep because the flags you gave it
didn't work. Anyone? Congratulations, you've worked on a single version of
grep your entire life. Have a cookie.

Pretty much the only consistent grep flag I know is -i. There's never been a
standard for naming and abbreviating flags, which means that for EACH program
you will have to learn new flags.
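
As a small illustration of the divergence (a sketch; the exact behaviour
depends on which grep you have installed, and notes.txt is just a made-up
file name):

    
    
        $ grep -i posix notes.txt         # case-insensitive search: works essentially everywhere
        $ grep -E 'foo|bar' notes.txt     # extended regexes: POSIX, though older systems wanted egrep
        $ grep -P 'foo(?=bar)' notes.txt  # Perl-style regexes: a GNU extension not every grep supports
    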

This becomes truly terrible when you get around to, say, git and iptables.
Have you ever tried to read git documentation? It is the most useless godawful
piece of nonsense this side of the Moon.

There's Google now, which means that the fundamental design issues of POSIX
will probably never get addressed. "Just google it and paste in from
stackoverflow" is already standard, and people are already doing that for
5-10-year-old code/shell commands. What about 10 years from now: will googling
best DHCP practices still find that stupid post from 2008 that never actually
got resolved? How about 20 years?

I have honestly no idea how to even start fixing the problem. A proper
documentation system would be a start.

~~~
lioeters
> Have you ever tried to read git documentation? It is the most useless
> godawful piece of nonsense

I ran "man git" for the first time ever.

[https://www.man7.org/linux/man-pages/man1/git.1.html](https://www.man7.org/linux/man-pages/man1/git.1.html)

Heey, that's actually pretty good! I don't think it's "godawful". In the
second sentence it recommends starting with gittutorial and giteveryday, for a
"useful minimum set of commands".

[https://www.man7.org/linux/man-pages/man7/gittutorial.7.html](https://www.man7.org/linux/man-pages/man7/gittutorial.7.html)

[https://www.man7.org/linux/man-pages/man7/giteveryday.7.html](https://www.man7.org/linux/man-pages/man7/giteveryday.7.html)

I must admit, I still occasionally (regularly?) search for "magic
incantations", particular combinations of flags for sed, git, rsync, etc. But
the man pages are my first go-to, and they usually do the job as a proper
documentation system. It's better than most software I've worked with outside
(or on top) of the OS, with their ad-hoc, incomplete or outdated docs.

~~~
FartyMcFarter
The issue with git is that no matter how well documented it is, the user
interface is horribly designed. For starters, how many different things does
"git checkout" do, and how many of them actually reflect an intuitive meaning
of "checking out"?

~~~
Animats
Some time ago, a UI designer asked on HN what open source program they should
build a UI for to establish their reputation. I suggested "git". That was
rejected as too hard. They just wanted to put eye candy on a command, not have
to rethink its relationship with the user.

~~~
cxr
A few years back, a designer waded into the middle of the echo chamber on some
HN thread about bringing in people from other disciplines. They wrote that "as
a designer" they did not consider Git (GitHub?) to be thoughtfully put
together or well-suited for the kind of work they do, or something like that.
It was a short comment, about as long as that, and there was no flaw or faux
pas or even anything incendiary about it. HN wasn't having it, though, and
downvoted it mercilessly. (There were no responses to say _why_ it had been
downvoted; the subthread dead-ended there.) It's things like this that remind
me of the now-infamous comment in the Dropbox thread.

I didn't think at the time to bookmark it with my "hn sucks" tag, and over the
past year or two I've tried several times to find it again, for reasons
similar to [1], but I've been unable to.

1\.
[https://news.ycombinator.com/item?id=22991033](https://news.ycombinator.com/item?id=22991033)

------
ptero
Those articles from the mid-1990s are great to read; they are both funny and
informative. While many of the points they make are still valid today, even
more valuable is the ability to look at successes and failures with 20+ years
of hindsight.

I feel the pain of the user in this particular case. But I also understand the
frustration of the people who wanted to write their own smaller programs with
fewer restrictions than the well-architected but highly constrained VMS
environment allowed. And ignoring such users can topple a better technology.
That is why (a technically horrible) DOS spread like wildfire on personal
computers, super unreliable Windows (Win 95 had to be rebooted daily) killed a
much more robust OS/2, etc.

We can call such users, the ones who want capabilities quickly even if they
are not fully reliable, "ignorant lemmings" or whatever, but ignoring them
when a competitor does not is very risky. My 2c.

~~~
tcbawo
A similar argument has also been made about JavaScript. When it comes to
market share I guess most users don't care how elegant the solution is under
the hood.

~~~
FpUser
I was about to post the same. It was the first thing that came to my mind
after reading the "technically horrible" part of the comment.

------
rcarmo
We had to use VAX/VMS when I was a freshman, decades ago. The “Computer
Center” had dozens of VT220s hooked up to it via serial “hubs” in the main
building basement (the very stereotype of a nerd dungeon).

It wasn’t half bad. As multi-user systems went, it was actually quite good and
we ran a number of projects on it before moving off to PCs, Sun Workstations
and Linux in general.

I remember all the staples of the era: using Kermit to upload our assignments
to the thing, dialing in from home at 2400bps, hacking our way “out” to the
Internet, running out of our 50MB quota due to mailing-lists and uuencoded
files fetched via mail gateways.

It was a lot of fun, and AFAIK there are some working VMS emulators around
that I could install on a Raspberry Pi (and likely get a faster multi-user
system than what we had then for hundreds of students).

I say it was an experience worth having, but largely (functionally)
indistinguishable from a UNIX machine when accessing it via teletype (glass or
otherwise).

~~~
m463
> largely (functionally) indistinguishable from a UNIX machine

Although they were functionally similar, there were some practical
considerations...

I will say that path names in unix were simple and elegant compared to what
VMS used.

I recall VMS paths were something like [foo.bar.bletch]something.txt

At the time this was a little cumbersome, but looking back it is much worse.

~~~
TheOtherHobbes
That's a cosmetic difference.

The real difference is when you try to work out whether your binaries are
(supposed to be) in /bin, /usr/local/bin, /sbin, etc, whether settings for a
specific application and/or daemon are in $config or $.cfg or $.conf or $.cf
or $d_config or .ssh and .bashrc in your personal directory, and where your web
server and mail logs are.

Because they might be in /var/log - or equally they might not.

Unix was "designed" by hyperactive comedy racoons with ADHD. There's no reason
- beyond lack of attention span and professionalism - why basic features and
expectations couldn't have been standardised. But the Unix way is to get
something sort of working without paying much attention to what other people
are doing, lose interest in it, and move on.

Or to hammer away at something for decades adding more and more obscure edge
case config options in a text file, all of which need to be set carefully
because otherwise the application fails - probably silently, maybe leaving a
log message somewhere completely unexpected - and most of which are irrelevant
to 90% of users who just want Something That Works.

~~~
PaulDavisThe1st
It's not a cosmetic difference. Unix has a single-rooted filesystem. VMS did
not (like DOS). That doesn't invalidate your complaints about file naming and
location, but it isn't a cosmetic difference.

------
codesections
> [the name] Grep suggests to me that the author of this one had been reading
> too much Robert Heinlein (you grok?), or possibly --- and this is in fact
> quite likely --- was under the influence of psychotropic substances at the
> time.

As funny as this is, the _actual_ origin for "grep" is even more interesting –
and, at least to me, quite mnemonic. "grep" comes from ed, and stands for the
command "g/re/p", that is

    
    
      global/regular expression/print
    

[https://en.wikipedia.org/wiki/Grep](https://en.wikipedia.org/wiki/Grep)
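
A tiny demonstration of the lineage (a sketch; notes.txt is a made-up file,
and ed's -s flag merely suppresses the byte-count chatter). The ed command and
the grep invocation print the same matching lines:

    
    
        $ printf 'alpha\nbeta\nalphabet\n' > notes.txt
        $ grep alpha notes.txt
        alpha
        alphabet
        $ printf 'g/alpha/p\nq\n' | ed -s notes.txt
        alpha
        alphabet
    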

~~~
throw0101a
"Where GREP Came From" by Brian Kernighan from Computerphile:

* [https://www.youtube.com/watch?v=NTfOnGZUZDk](https://www.youtube.com/watch?v=NTfOnGZUZDk)

For those unfamiliar, Kernighan is the "K" in K&R C and the "K" in AWK:

* [https://en.wikipedia.org/wiki/Brian_Kernighan](https://en.wikipedia.org/wiki/Brian_Kernighan)

~~~
jcims
Great interview with Brian on Lex Fridman's podcast -
[https://www.youtube.com/watch?v=O9upVbGSBFo](https://www.youtube.com/watch?v=O9upVbGSBFo)

~~~
mercer
I listened to that one last night, and then a few more. He's been in rotation
for a while now.

I don't know how I ended up subscribing to Lex Fridman's podcast, but it's
both wonderful and somewhat bizarre to me.

On the one hand, he comes across as the kind of character you'd expect in a
horror film. Meticulous, well-dressed, friendly, but affectless in his speech
and oddly emotionless and formal.

But then the actual questions he asks, and the observations he makes, are IMO
a step above most interviewers. I can think of a number of 'greater'
interviewers, but he's definitely well in the 'very good' range.

My apologies if Fridman reads this comment. I definitely don't want you to
stop doing things the way you do :). It's just somewhat different, in a
strangely 'boring' way, that I'm not used to from most good podcast hosts that
I'm familiar with. Most are, sometimes to the point of irritation, exceedingly
affable and chatty.

~~~
throw0101a
> _On the one hand, he comes across as the kind of character you 'd expect in
> a horror film. Meticulous, well-dressed, friendly, but affectless in his
> speech and oddly emotionless and formal._

Are you aware of _The Report Of The Week_ channel run by 'Review Brah'?

* [https://www.youtube.com/user/TheReportOfTheWeek](https://www.youtube.com/user/TheReportOfTheWeek)

------
dhosek
I really miss VMS. Adding a command wasn't simply a matter of throwing an
executable on the path; it had to be declared, with two options: you could
declare a command that did its own argument parsing, or (the better option)
you could configure the arguments externally using (iirc) a .cld file, which
provided a way of specifying all the arguments and options for the command in
a straightforward way. I did this for all the TeX programs around 1990 (I
think this might have been the first TeX version which didn't use separate
binaries for iniTeX vs TeX). It was nice because the OS automatically managed
things like abbreviations, so you only needed to type as much of a verbose
command or option as was necessary for it to be unique. So while the command
might have been `DIRECTORY /FULL` one could just type `DIR /FU` to get the
same result. The VMS help system was also fantastic and allowed for easy
browsing of available commands and their instructions in a way that
man+apropos only approximates (not to mention that the tradition of using
relatively verbose but understandable commands and options made it supremely
usable).

~~~
laksdjfkasljdf
I never understood why so many people complain about the short commands, yet
nobody contributes a central list of aliases with verbose names for them.

alias copy_files_from_one_place_to_another=cp

This tells me short names aren't a problem to begin with. And everyone would
have the same discoverability problems with longer ones.
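
A sketch of what such a central list might look like as a file sourced from
your shell startup (all of these verbose names are made up):

    
    
        # ~/.verbose_aliases -- hypothetical verbose names layered over the short commands
        alias copy-files=cp
        alias remove-files=rm
        alias search-text-in-files=grep
        alias list-directory-contents='ls -l'
    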

~~~
dhosek
Verbose is a relative thing. `copy` is an intuitive name, `cp` not so much.
Likewise, it's a lot easier to remember how to do something like `copy
/exclude=*.bak [.work] [.prod]` than the equivalent command in a Unix shell,
apparently:

$ shopt -s extglob # to enable extglob

$ cp work/!(*.bak) prod/

~~~
laksdjfkasljdf
You realize all that comes with a huge IF, namely that you are a native
English speaker, right?

The glob stuff is a case of power vs convenience, but you can do something
very similar with bash [] or {} syntax.

Not to be an apologist, but I like the Unix approach of blurring the line
between computer user and programmer. It makes everyone up their game both
ways: users become more demanding, and code becomes more accessible.

------
somesortofsystm
I remain amazed that the #1 shipper of Unix systems today is .. Apple.

I grew up on Unix in the 80's, cut my teeth on MIPS RISC/os and then Irix and
SunOS and all the joys of the very first days of Linux, oh my .. and I was
fully prepared to be an SGI fanboy for the rest of my life - and then, they
abandoned Irix and shipped NT. _sadface_

So when the tiBook came out, and it was promised to have a Unix on it, I
jumped off my Indy and O2 workstation onto Apple - a company I'd never
imagined, in my wildest 80's and 90's fever dreams, would become the one
company still standing in the Unix wars. The tiBook was just _soo_ good, and
despite all of its warts in the early days, MacOS X's underpinnings with
Darwin were just good enough to swing the decision to use it as a Unix
developer's machine. And it has been solid for 20 years as a platform in that
regard,
although the writing is definitely on the wall for us Unix geeks who
nevertheless carry a Macbook.

If only SGI had made a laptop, and not been wooed away from sanity by the NT
freaks. Can you imagine if SGI (Nvidia) had made that laptop before Apple did
.. ? I sure can.

~~~
kristopolous
They made a rather strange o2 laptop that never made it to production.

A number of legendary companies with great potential misstepped: GRiD,
Blackberry, Be, MasPar, Thinking Machines, GO Corp, Intergraph. Heck, go back
to the Evans & Sutherland LDS-1 in 1969, or when BBN decided not to get into
hardware after making the first internet hardware ever, the IMP. Or how about
how SRI fumbled the ball after Engelbart's work, or SDS, who made a bunch of
the NASA Apollo hardware.

The world is littered with great technology companies that didn't stick around
because we determine success and failure by handshakes on the golf course.

~~~
somesortofsystm
Yeah, I saw the Indy-laptop once during a demo in Hollywood .. was definitely
a clunky pile of junk.

You're right about that golfing handshake.

~~~
kristopolous
SGI hardware wasn't that great from about the O2 onwards. I used a lot of
those machines and they were kinda dogs.

------
lalalandland
"Anyway, have you ever tried to use man? It's fine as long as you know what
you are looking for. How would you ever find out the name of command given
just the function you wanted to execute? You can't. "

One of my gripes with UNIX systems is how opaque they are.

~~~
jiggawatts
PowerShell takes the UNIX philosophy, cranks it up to 11, and makes commands
trivially discoverable.

~~~
yabones
There are things that PowerShell does well, but I wouldn't say that it's
really all that unixey. It relies far too heavily on the user having an
understanding of the Windows object model, which frankly a lot of sysadmins
don't have and refuse to learn. In unices the pipeline is just that, a way to
send a stream of bytes from one place to another. In pwsh it's more
complicated than that, since it passes an object along with the preceding
command's metadata.

In my personal experience, powershell is fantastic for writing scripts and
tooling, but not really so great for actual use as a shell. What makes a good
shell is speed and 'muscle memory', imo.

~~~
majkinetor
That's nonsense really. What Windows object model???

Your 'simple' pipeline becomes hard core once you take into account all other
things you require like grep, awk, sed, xargs, mount etc. Heck, even basic
boolean stuff is from another dimension, with executables like `[` or
`true`/`false` (yeah, I know, mostly builtins nowadays)
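
(For what it's worth, `[` really is an on-disk executable on many systems, in
addition to the shell builtin; a quick way to check, with the caveat that
paths vary by system:)

    
    
        $ type [
        [ is a shell builtin
        $ ls -l /usr/bin/[
    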

~~~
enriquto
> your 'simple' pipeline becomes hard core once you take into account all
> other things you require like grep, awk, sed, xargs, mount etc.

That's the whole point of unix. Non-integrated tools that talk to each other
using plain text.

~~~
Arnavion
It's exactly the same for PS.

Just as you have to look at a text output of a unix command to figure out how
to parse it and extract the subset of information you need from it, so do you
have to look at the help metadata of a PS command to figure out how to extract
the subset of information you need from its output.

The advantage of having objects instead of text is that if you thought you
could parse the filename out of grep matches by splitting each line of `grep
-H`'s output on `:`, you failed to account for filenames with colons. With PS's
sls, its help metadata tells you it outputs
`Microsoft.PowerShell.Commands.MatchInfo` objects, and the documentation for
that type tells you it has a `Filename` property of type string.
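
A concrete illustration of the colon pitfall on the Unix side (a minimal
sketch; the file name is made up):

    
    
        $ printf 'hello\n' > 'notes:2020.txt'
        $ grep -H hello notes:2020.txt | cut -d: -f1   # hoping field 1 is the filename...
        notes
    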

One PS command is not "integrated" with another PS command; they're all
integrated to .Net and a bunch of built-in PS types.

~~~
enriquto
Thanks for your answer, and I see where you are coming from. But your unix is
not the same as my unix. For me, unix means programs that can be easily
written using only the _getchar_ and _putchar_ functions. If you have "types"
and whatnot, it's not unix.

~~~
jiggawatts
It's not often you hear someone criticise a tool for not having enough
unnecessary encoding and parsing steps involved.

~~~
enriquto
Are you talking about people who promote JSON-RPC and whatnot? Sure, they are
nuts, but what does it have to do with unix?

------
brazzy
Are we going to completely ignore the fact that this purported Requiem for VMS
tells us in great detail why Unix suxx - but literally (and I do mean
literally) _not a single thing_ about why VMS should be mourned?

If this is what VMS advocacy looks like, I'm not surprised it disappeared.

~~~
pierrebai
Isn't it?

What's more, and this keeps being repeated and emulated, I do not understand
how being arrogant, condescending, insulting, and full of bad-faith scorn and
hatred is supposed to make me want to be part of a group or use a tech, API,
stack, framework or what not.

"I'm a VMS user", they seem to say, "watch me be a nasty person."

------
thenoblesunfish
The complaints here don't seem to be much about how Unix works in a deep way,
but rather the particular language/syntax which is used to interact with it
("rm", etc.). While that's certainly annoying in many ways, so are almost all
languages in common use. You could write a lot about how horrible English is
(or German, as Mark Twain famously did). But English is a useful standard -
politics, inertia, and the value of a common language are forces far stronger
than the fact that the language is more annoying than it could be.

Am I being trolled? Probably :(

~~~
alxlaz
The back story to this is that the DIGITAL Command Language (more or less the
equivalent of a Unix shell), with its excellent filesystem-level features and
the environment around it (e.g. the well-written and extremely thorough
documentation) was very much light-years ahead of anything you could get on
most Unix environments at the time. Going back to the Unix shell felt a bit
like a step back.

FWIW, there are plenty of complaints about things other than syntax after the
fifth paragraph or so, too ;).

~~~
CaptainZapp
Ah yes!

Documentation for VMS and the whole shebang of layered products.

To this day it remains, in my book, the absolute gold standard when it comes
to documentation.

Having arguably the best tool suite for development (LSE, the debugger, what
have you) also didn't hurt.

I still believe it's the best operating system I ever worked with.

Ironically the article describes quite precisely why DEC failed.

Most of the company had a visceral hate of anything Unix and anybody involved
with anything that even smelled faintly of Unix was a second class citizen.

Well, looking at Ultrix may have been the main reason for that. Ironically,
decades later they had one of the best Unix offerings on the market with Tru64
Unix.

DEC was in many, many respects an awesome company. Shame that Compaq (and
later HP) never really had a clue about what they actually got there.

Source: I worked for DEC from 90 - 94.

~~~
kyuudou
> Most of the company had a visceral hate of anything Unix and anybody
> involved with anything that even smelled faintly of Unix was a second class
> citizen.

Reminds me of ye olde UNIX-HATERS Handbook[0]

0:[http://web.mit.edu/~simsong/www/ugh.pdf](http://web.mit.edu/~simsong/www/ugh.pdf)

------
0-_-0
Quote from the article:

Douglas Adams may well have had Unix in mind when he described the products of
the Sirius Cybernetics Corporation thus: "It is very easy to be blinded to the
essential uselessness of them by the sense of achievement you get from getting
them to work at all. In other words --- and this is the rock solid principle
on which the whole of [its] Galaxy-wide success is founded --- their
fundamental design flaws are completely hidden by their superficial design
flaws."

------
fredsmith42
I just finished a contract at a client that still has a live OpenVMS system
running a mission critical application. It was interesting to compare my 25
year old fond memories of the OS with the practical experience of using it on
a daily basis. No command line history, crazy long file paths, and case
insensitive passwords were a shock. On the other hand, the built in DCL
programming language was a saving grace. My wife, amazingly, found my decades-
old copy of the "Writing Real Programs in DCL" book. Because of that, I looked
like a VMS superhero.

------
gcc_programmer
I think the author is arrogant and closed-minded. He underestimates the levels
of complexity of software systems and strives for some utopian "elegance of
design" which I think doesn't exist. The only thing which matters is to
deliver results and move forward. Requirements change: back in 1990 the
internet existed in a much, much smaller form, and the WWW was just about to
be invented. Linux had yet to be created by Linus. Unix and Linux survived and
adapted; VMS (or whatever it was named) didn't. Also, the argument about
"speaking English" is again so arrogant and closed-minded: not all of us are
native English speakers. To me rm or delete or del or banana doesn't really
matter...

------
normanmatrix
I have a feeling that Windows Nano Server w/ PowerShell + .NET Core might make
this text relevant again soon. Having a consistent OS that scales from
containers to servers and desktops is a big benefit in corporate environments.
Now, if it matched the performance of Alpine, played well with WSL, and
Microsoft managed to push the major open source stuff for compatibility, I'd
give Linux 5 years. But only time will tell...

~~~
3np
I really, really hope that you're wrong.

I optimistically hope a not-too-future (within 10y?) major version update of
Windows is in reality a *nix distribution. As long as they can keep runtime
compatibility with older versions of Windows software, I think it'd be a big
win for Microsoft for various reasons.

The only real challenge would be to get driver vendors in line.

~~~
ThinkBeat
Why would you want that? That is the last thing I want.

It is like saying that in ten years Pepsi Cola will be the only soft drink we
have. You can get Pepsi Cola Mint, Pepsi Cola Cherry, Pepsi Cola regular, etc.

But no matter what it will always be Pepsi Cola.

I want more, a lot more, viable operating systems than we have now. Now is a
sad place to be.

Linux was never created to be a modern operating system.

Parts of Windows NT and its successors were derived from VMS, though more and
more of it has been removed. Some parts were removed and had to be reinvented
(WSL).

Back in the day, you could pick different hardware and you could pick
different OSes (often tied to specific hardware).

I liked Atari ST TOS/GEM. I thought it was way ahead of its time.

I did not think that the PCs of the early Atari ST era were even comparable.
Lots of people loved the Amiga, great machine. Some people had the Archimedes
(ARM). You had Macs with PowerPC. Lots of choice and lots of competition.

Now you buy a computer of a specific design with little direct competition,
though that is improving now.

And you can pick between Linux or Windows.

A single computer architecture. A single choice +1 for operating systems.

I would really want to own a POWER-powered Linux machine but they are
tragically expensive. For the most part they do share the same architecture
still.

On mobile, we have Android or iOS. Android has a lot of shared architecture
for obvious reasons and iOS most certainly does.

It is like the American election: which white geriatric misogynist would you
like?

Linux is fully geriatric. Windows NT is getting there too.

Can we please have a couple of teenage operating systems? Some new viable
babies?

~~~
3np
Some that you forget: BSD (Net, Open, Free, Dragon), macOS, OpenVMS, Haiku.

And hey, I’m sure if you work on implementing one maybe others will be
interested in writing software for it. And that’s kind of the point, I guess:
it’s a huge undertaking with little financial value for a business as opposed
to extending on existing work. Standards give us a common language.

------
tromp
> There is no effective way to find out how to use Unix, other than sitting
> next to an adept and asking for help

That's a rather unfair claim. Hundreds of books and tutorials have been
written to introduce people to Unix...

~~~
TheOtherHobbes
Now.

This was written decades ago, when computers mostly lived in universities and
the only way to learn Unix was by sitting next to an adept, etc.

~~~
tialaramex
Well it's written in about 1994 by the look of it. So, this is the point where
I was learning Unix and it's _after_ the point where a Finnish university
student's "hobby project" is building one from scratch, a pretty good one it
turns out.

It's also coming either after, or at the same time as books like "The Magic
Garden Explained" which explains SVR4 in considerable detail.

You should read stuff like this as at least 90% sour grapes. Is there value
from some of the criticisms along the way? Sure, but mostly they're just angry
because the thing they liked is clearly going away.

------
ubermonkey
My first corporate job in computing, in 1994, was at TeleCheck. They used an
all VMS environment, though by then the machines themselves were about 30%
Alphas and not actual Vaxes.

The denial about the platform's future was SUPER strong. TeleCheck IT and
software development, at least in those days, was staffed by people who would
either move on quickly or stay forever. They mostly hired right out of
college, too, so the lifers were people who had never worked anywhere else.
(This kind of employment monoculture is pretty destructive, IMO -- get some
new ideas in there!)

This created a super weird environment. Everywhere else I worked back in those
days was rife with industry publications, curiosity about how other systems
worked, excitement about developments in software or networking even if they
were on stacks or platforms other than whatever the site used, etc.

Not so there. I think this may have been mostly because to read, say,
InfoWorld in 1994 would have made it much harder to avoid how narrow a niche
they were occupying. The nature of the systems and software there meant people
who stayed were gaining skills not useful anywhere else; pretty much
everything (even the database system) was built in-house.

People were out the door constantly, going to other big tech employers in the
area to either (a) pick up more marketable skills or (b) pick up a huge bump
in pay. That's what I did after 2 years. (Turnover in the dev group was
something like 35% a year, which is HELL on institutional knowledge.)

All that said, you could see SOME of the appeal of staying on VMS from their
POV. Clustering was a big deal, because downtime cost dollars. File versioning
in the OS -- which I _still_ haven't seen implemented the same way anywhere
else -- was fucking genius and made rolling back a bad release almost trivial.

But when you decide to ignore where the market is going, and stay on a doomed
platform, there are real costs to pay.

------
soco
I remember I was upset when we moved off OpenVMS (because of market pressure)
but it was so long ago that I don't even remember why I was upset. But I
definitely resented the incredible gamble they took with Itanium, even before
the gamble crashed.

------
lixtra
The last modified header suggests (2000)

    
    
        Last-Modified: Fri, 08 Dec 2000 14:02:09 GMT

~~~
xioxox
The letter is from 1994. See Starlink Bulletin #13, pg :
[http://starlink.eao.hawaii.edu/starlink/Bulletins](http://starlink.eao.hawaii.edu/starlink/Bulletins)

~~~
Macha
It also mentions 1969 as being 25 years prior.

Which also makes this article closer to 1969 than the present day.

------
ninefathom
Meh... I think the author of this article was cherry-picking a bit. VMS is no
cake-walk either, and has its fair share of idiosyncrasies.

So you're a system administrator, and you want to change a user's password?

$ set default sys$system

$ mcr authorize

UAF> modify jimbob /pass="whatever"

UAF> exit

...while on just about any *nix, one might simply:

# passwd jimbob

<< enter the new pw twice when prompted >>

For every example of the original author complaining about *nix, I could
probably find a counter-example of VMS being awful. Really, I think it reduces
down to this: people prefer to stick with what they're used to, and what they
were trained on. Anything else is "awful" and "inferior."

------
squibbles
VMS - Dying, dead, or reborn? For those who lament the loss of VMS, know that
Windows NT carried on. [0] (By the way, did you know that WNT is a Caesar
cipher for VMS?)

Long ago I had my personal trials and tribulations with Unix. Fortunately,
OS-9 [1] was there to help with that journey of discovery.

[0] [https://www.itprotoday.com/compute-engines/windows-nt-and-vm...](https://www.itprotoday.com/compute-engines/windows-nt-and-vms-rest-story)

[1] [https://en.wikipedia.org/wiki/OS-9](https://en.wikipedia.org/wiki/OS-9)

------
gnufx
Some context might be useful.

Starlink was one of two somewhat similar subject-specific networks set up in a
period of enlightenment by the UK Science Research Council (as I think it was
then) operating in the 1980s, particularly for largely interactive analysis.
The other was an un-named and ill-publicized one for nuclear structure as
opposed to astronomy. I don't know about Starlink, not being an astronomer,
but I guess it was also rather ahead of its time, like the nuclear structure
one.

Obviously it had changed in astronomy by then, but I didn't see the attitude
that physicists (and later, structural biologists) shouldn't write software or
build the necessary hardware before or around that time. We had people and job
titles like "physicist-programmer", and did what was necessary, and we did fit
the facilities to the problem (when not working at foreign labs). The software
systems were designed to be user-extensible anyhow, in our case.

Personally I was glad to have the lightning-fast interactive graphics system
on the nuclear structure GEC systems, and not the stodgy performance -- even
when they weren't running something like Macsyma -- of all the VAXen I used.
VMS was somewhat inscrutable to a physics hacker anyhow, so I don't understand
why Unix made it more difficult, though I hold no particular affection for
Unix.

------
rakah
Came for a discussion of OS/2, stayed for the comparison of Ken Thompson to E.
E. Cummings.

------
scroot
> One can only conclude that the makers of Unix held, and still hold, the
> ordinary computer user in total contempt, and this viewpoint seems to me to
> be mirrored in the attitudes of the people who are inflicting this awful
> system on the rest of us. Does Unix's enormously steep learning curve have
> any function other than to deter the faint-hearted, those who may want to
> use computers without necessarily dedicating their lives to them? It does
> not seem fanciful to suggest that Unix is primarily about separating out the
> elite from the proletariat, the real programmers from the quiche-eaters.

I think it's worth pointing out that in the 80s/90s (and in the research in
personal computing that preceded that period), there was a very different
attitude about what it could mean to "use a computer." The demarcation between
a "user" and a "programmer" was not so clear. Today it is stark. The author is
sort of joking with this line, but it's likely that the unixification of
computing has contributed to this divide.

~~~
rbanffy
> but it's likely that the unixification of computing has contributed to this
> divide.

It took a long time until a meaningful part of humanity got access to
computers. The fraction that had access to timeshared machines via serial
terminals was tiny.

In the age of the PET, the Apple II, the C64, it was usual to boot up your
home computer and be greeted by a BASIC prompt (REPL, if you allow me). It was
an immediate introduction to programming - a language was there, built into
the computer's firmware, and you could just start writing code one second
after powering the thing up. You could write programs that looked comparable
to what you could buy at a store or type in from a magazine.

CP/M and then MS-DOS were a bit different. You were dropped into an OS shell
rather than a programming environment. In order to program, you needed to
explicitly start an interpreter or a text editor. You still could write
programs that looked like professional apps on the platform, but you had to
get a language, which was not usually provided.

Then the GUIs came. Now, in order to write programs for Windows, or the Mac
(oh boy!), or anything else graphical, you needed to get developer tools.
Microsoft's came in a crate. A Hello World app would be hundreds of lines. In
order to make something useful that looked decent, you'd need to learn a lot.
By then, most timeshared system users were left behind, as terminals couldn't
cope with user expectations.

What was once a small crack is now a vast chasm.

~~~
scroot
I'm with you, but want to add that this transition isn't the fault of GUIs.
The group at PARC that invented the GUI that everyone else commercialized
_had_ invented the GUI equivalent of "booting into a BASIC" -- Smalltalk.

Which brings me to another point: this divide has gone hand in hand with the
idea that parts of a computing system should be universally swappable. But
that prevents any "holistic" nature to a computing system, and therefore we
get caught up in the idea of individual languages rather than environment and
how everything works together.

------
mimixco
Unix is the ultimate example of "worse is better." VMS, which is what the
article is about, was definitely more elegant, easier to grok, and far better
documented. So was IBM's VM which had what we now call "containers" working
simply and reliably 30 years ago.

~~~
michaelcampbell
> VMS, which is what the article is about,

Interesting then that it barely mentions VMS.

~~~
mimixco
Yes. It's poorly written and spends too much time talking about Unix without
explaining VMS, which purports to be the topic.

------
codezero
This brings back memories (of ten years ago).

I worked in astrophysics for a while, and maintained a large codebase that
included a ton of Fortran, and had a lot of references to Vax VMS systems
within the comments as that's what the team I worked with used mostly through
the 80s and early 90s before moving to Linux.

One of the things that didn't seem to happen was a move to C - there was some
code here and there (I wrote a module for IDL - IDL is a proprietary scripting
language that mimics and interoperates with Fortran in a lot of ways). Fortran
was able to stick around because of a lot of work on great compilers, and I
think, in general, the great speedup of commercial hardware has made hyper
optimization less necessary for a lot of day-to-day physics work.

~~~
dfalzone
Funny, today one of my coworkers was talking about how she used IDL at one of
her previous jobs, and later I happened to see this comment. Neat coincidence
:)

~~~
codezero
I think there is a bias for that, but I always love hearing of others who’ve
used IDL. You should ask her if she knows of the coyote guide to IDL - I lived
by it around 2008. Not sure if it was niche to my field or more general, or if
it wasn’t a great resource before or after it was great for me :)

------
mcswell
I started at a company in 1984 where VMS was pretty standard, running on 80x25
terminals. (They also had Xerox Stars, some people ran 1980s-era Macs, and
there were a few LISP machines.)

Then someone introduced me to a sort of minimal unix that was running on top
of VMS. Although I had never seen Unix before then, I immediately saw the
advantages and switched over. If I had to do something on VMS, it was painful.

Later the company got first generation Sun workstations, running unix on
metal, with large (for the time) graphic displays. SO advanced...and so much
fun (except when I had to do stuff in C--mainly our team used LISP or Prolog).
Eventually we even got an ARPAnet connection. But I digress.

------
nsajko
Unix was at first a quite different experience, I think. The Bell Labs Unix
had only a few utilities, each one was simple, and the documentation was good.
There's a nice relevant quote: "Cat went to Berkeley, came back waving flags."
It's about how the commands (even _cat_ ) got many additional options after
third parties got their hands on them (or reimplemented them).

There are manuals for old Unix versions here, maybe start with V7 if
interested: [http://man.cat-v.org](http://man.cat-v.org)

The paper "Program design in the UNIX environment" or the book "The UNIX
Programming Environment", both by Rob Pike and Brian Kernighan, might be of
interest.

------
icedchai
One of the first multiuser systems I had access to, back in the 90's, was a
VMS box. Many years later, I bought an Alpha off of ebay, and have a VMS box
of my own! I rarely boot it up because it sounds like a jet engine. It's a
whole different world.

~~~
UncleSlacky
You can run it on a Raspberry Pi these days:

[https://blog.poggs.com/2020/04/21/openvms-on-a-raspberry-pi/](https://blog.poggs.com/2020/04/21/openvms-on-a-raspberry-pi/)

~~~
icedchai
Yes, I know. There is something about having the actual bare metal machine
that I enjoy. I am a bit of a retrocomputing enthusiast. I am also running
Alpha, not VAX.

------
ginko
Well, maybe more people would have used VAX/VMS or OpenVMS if it had been open
source.

~~~
pjmlp
Well, that is the only reason why UNIX won and we got stuck with C: it is hard
to win against free beer with source code available.

The prices that Bell Labs was allowed to charge for UNIX licenses were
symbolic, a gift when compared against traditional commercial OS prices in the
70's.

~~~
imglorp
One more advantage was that C and UNIX were designed for portability. They
started on PDP-7 and spread onwards from there.

VMS was designed to sell mainframes so your choices were limited outside of
that until the pedestal and workstation market arose around 88 (Alpha),
MicroVAX, etc. Even then it was only DEC hardware. That's how you kill an OS.

~~~
pjmlp
There were already portable systems outside Bell Labs back then, and anyone
that has written UNIX software knows how "portable" C actually was back then.

Had UNIX been sold with a price tag similar to VMS, without source available
for universities to play around with, it would have been long dead by now.

------
jgtrosh
_à propos_ means apt/appropriate as a noun in French and in English, but the
phrase « à propos de … » means “about …” ; which is much more (wait for it) à
propos. It is kind of weird that it was used in Unix wherein only the noun
form made sense ; and even with French in mind the “apropos <command>” syntax
(without « de ») makes it sound weird. In practice I don't think I've ever
used it, and when I was a noob I often lamented the lack of high-level
documentation for Linux which would introduce the appropriate man/info pages.
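
For reference, a minimal example of what the command actually does (the output
format and entries will vary by system):

    
    
        $ apropos remove
        rm (1)               - remove files or directories
        rmdir (1)            - remove empty directories
        ...
    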

~~~
alxlaz
> In practice I don't think I've ever used it, and when I was a noob I often
> lamented the lack of a high-level documentation of Linux which would
> introduce the appropriate man/info pages.

It was introduced in 3.0BSD, it's pretty old :). As for usage, while I'm not a
native French speaker, I don't think I've heard it used as a noun _that_
often. I can't speak (heh :P) for Canadian speakers of French -- perhaps there
are different norms for the use of various idioms there, and I learned French
in Europe -- but over here I don't think anyone ever had trouble figuring out
what the apropos command would do...

~~~
jgtrosh
Not saying it didn't exist, just saying it's not an introductory high-level
Linux user-space documentation.

Also not saying apropos is incomprehensible, just that (1) it sounds weird to
a Frenchman, and (2) there could have been a simpler alternative: "about".

~~~
alxlaz
Lots of Unix commands could have had better names :).

FWIW, though, the noun form is even less common in languages that borrowed
the expression from French. In my native language (also a Latin language, so
borrowing it was straightforward), its use as a noun is very limited. Saying that
a book is "full of à-propos" (as in "pleine d’à-propos"), for example,
wouldn't mean it's apt, or very relevant, it would mean that it's full of
subtle, possibly contrarian or lewd motives. That's more or less how it's used
in English, too.

To _non_-French speakers -- and given when apropos showed up, the Unix
audience was very American-centric -- who've only heard it used in the other
sense, it actually sounds very appropriate, possibly even more appropriate
than "about", given how apropos (the program) works.

~~~
jgtrosh
Yeah, I guess it's not that bad. (Also it's not a noun, but an adverb, my
bad.) Then my only gripe is that it doesn't really sound correct in the <verb>
<noun> syntax, but then again many other commands don't either (like cd,
more/less/head, man, du).

------
the-dude
Key quote for me:

> Actually, the answer is easy. Operating systems cost hardware manufacturers,
> and therefore consumers (us), money. Unix is free, and whatever its defects
> at least no one blames the hardware manufacturers

------
jovial_cavalier
>is "rm" (pronounced "remove") really synonymous with "delete". Do I call the
deleters when I want to move house. How would my first cousins once-deleted
feel?

I don't think it's good to make the command name an English word. Perhaps it
would make the command easier to discover, but it would also introduce certain
unpredictable connotations.

For instance, if the command was called 'remove', then someone might presume
that the thing was being 'removed' and put somewhere else. 'Remove' has the
connotation that you are taking something and necessarily putting it somewhere
else, which is not what 'rm' does.

'delete' has less ambiguity, but I'm not confident that similar mixups
wouldn't happen. It's better for the command to be something idiosyncratic so
that the user goes in with no assumptions about its function.

To paraphrase Rob Pike, the command is not remove, but 'rm'. It's called 'rm'
because rm is what it does.

[http://mail.9fans.net/pipermail/9fans/2016-September/035429....](http://mail.9fans.net/pipermail/9fans/2016-September/035429.html)

~~~
santoshalper
But that's really nonsense, right? It's clearly intended to be a shorthand for
"remove" that saves a few keystrokes, just as "cp" is a shorthand for copy.
And remove is ambiguous compared to delete. These are petty complaints, but
Rob Pike's attempt to dodge them seems a bit disingenuous.

------
dsego
> Unix seems to consist largely of arcanae which be learnt only by taking
> instruction direct from a priesthood who seem largely to be stuck in the
> anal-retentive stage.

Tell that to linux nerds who encourage people on forums to install arch from
scratch if they want to learn about how computers work.

~~~
floren
> install arch from scratch if they want to learn about how computers work

15 years ago it was "install gentoo to learn how computers work" but all it
really teaches you is how the gentoo/arch install procedure works, or, more
likely, how to type in commands from a HOWTO one after another.

------
PaulHoule
VMS had the "sound the fire alarm and leave the building" attitude towards
exceptional events that Windows NT has. That's why you used to see the "Blue
Screen of Death" on Windows all the time. They changed it in the early days of
Win 8, when they realized they were violating the human rights of tablet users
(even laptop users), and suddenly the system would just reboot, reopen your
windows, and pretend it didn't happen.

The flip side is UNIX, which tries to pretend things are OK even when they are
strange. For instance, you might fill the disk with a log file, then 'rm' the
log file and note that the disk is still full. The file is still taking up
space on the disk, even though it is invisible. The space isn't released until
the process that has the file open closes it or gets killed.
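
A sketch of how that usually shows up and gets handled on Linux (lsof's `+L1`
flag lists open files whose link count is zero, i.e. deleted but still held
open; the log path here is made up):

    
    
        $ df -h /var                   # disk still full after the rm
        $ lsof +L1                     # shows which process still holds the deleted log open
        $ : > /var/log/huge.log        # the safer habit: truncate in place instead of rm'ing an open log
    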

Contrast that to VMS which, given an impossible situation, will do nothing (as
in "not do anything"). Recovering from a "full disk" on UNIX is usually
routine, but on VMS you will probably be recovering from tape or calling the
factory for support.

This guy

[https://www.youtube.com/watch?v=r1Esq1l0Yoo](https://www.youtube.com/watch?v=r1Esq1l0Yoo)

(the year after that album was released) got mad because a midwestern state
university had a "non-traditional" student spamming hundreds of USENET
newsgroups. The university couldn't throttle the spammer because he'd already
won a "free speech" lawsuit against the university newspaper.

The antagonist saw this as an existential threat to the net one Friday night
and went Ender Wiggin on his ass.

He thought he'd fill the disk quota on the spammer's account on the VAX/VMS
system so the spammer couldn't log in and delete his received email -- an
excellent implementation of quotas meant you could usually get the victim
running to 'mommy' (the sysadmin) for help.

The antagonist used ftp-to-email gateway servers to vastly amplify the attack
and confuse people about the origin. At the last minute he found a way to get
the ftp-to-email gateway servers to send email commands to each other in a
cascading way.

The target campus went down in about 30 minutes, and, logged into the VT-100
terminal in his dorm room, the antagonist realized that he'd miscalculated and
the attack was possibly 20,000 times larger than planned.

The target campus didn't come back until Tuesday evening, probably the disk
was full for the whole VMS Cluster.

The spam stopped. They never proved anything, but a few days later they shoved
an empty SUNtape into the antagonist's hand that allegedly held the files from
his account at the campus central computer center -- just "being a jackass on
social media" was a good enough reason.

About a decade after that incident, I had "the same thing" happen to an email
server running Linux on this box

[https://en.wikipedia.org/wiki/Cobalt_Qube](https://en.wikipedia.org/wiki/Cobalt_Qube)

during the 'Love Letter' virus crisis. I had email accounts getting millions
of emails a day on that badly underpowered machine, a beta-test model with a
2% slow real-time clock.

In 15 minutes I had the email server shut down, the disk-full condition
cleared, and logs working correctly again.

I brought up qmail and did not like what I saw in "top", so I shut it down. I
wrote a uniq|awk|bash script that picked out virus-sending IP addresses from
the logs and piped the generated bash commands into bash to block them at the
firewall.
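
A minimal sketch of that kind of log-to-firewall pipeline; the log path, field
position, threshold, and use of iptables here are all assumptions, since the
original setup isn't described in that detail:

    
    
        # count hits per client IP in the mail log, then emit and run one
        # firewall DROP rule for every address over an arbitrary threshold
        awk '{print $7}' /var/log/maillog \
            | sort | uniq -c | sort -rn \
            | awk '$1 > 1000 {print "iptables -A INPUT -s " $2 " -j DROP"}' \
            | bash
    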

I brought qmail up and it was good, good enough to go to bed. By the next
morning the viral load was serious so I automated the firewall script,
installed an anti-virus scanner, etc.

In wartime, VAX/VMS gives up the fight but UNIX soldiers on.

------
gooseyard
I was getting started in the field right around the time this was written,
attending college (and later being employed) at a university that was an all-
DEC shop. I wound up being introduced to VMS and Unix at the same time. I
greatly preferred Unix at the time, but looking back on it, it was for a
completely different set of reasons than I would apply now.

The biggest was that I wanted a machine of my own to work on, not just a shell
account on a big VAX with a tiny quota. The price of the lowest-end used
machine with a hope of running VMS at a decent clip was astronomical, and
upgrades were proprietary and expensive. I think the machine in my office at
the time came with a perpetual single-user VMS license, but if I wanted to
create shell accounts for my friends, buying license PAKs was also quite
expensive.

If I wanted to, say, run IRC on the VAXstation, instead of the handful of
popular clients that one could build on Ultrix there was a single VMS port,
and building it would require licensing a compiler. Downloading the source
would require buying the TCP/IP module license, as it would be years before
that was bundled in.

So if I ditched the VAXstation and got one of the MIPS-based DEC pizza boxes,
I could put Ultrix on it and at least get the GNU toolchain installed and be
able to build and run a bunch of the software I wanted. My older colleagues
who preferred working with VMS would spend weeks attempting to port some
popular utility to VMS, which seemed goofy to me at the time.

Within a couple of years 386BSD and Linux came along, and my plan to save up
the 5kUSD or so to buy my own DECStation quickly vaporized when I saw that
pretty much everything I wanted to run on Ultrix would build on Linux, so I
certainly had no more interest in VMS after that.

Now a couple of decades later, I sometimes think about the things that I
desperately miss about products like VMS. The expansive, exhaustive
documentation. The robustness of the systems in production, the great
interoperability between the various compilers.

We also got pretty amazing support from DEC, but the amount of money that we
paid for all those things was breathtaking. If you were using custom software
for critical businesses or processes, it was not a terrible arrangement, but
if you wanted to stick an email server in the closet of your small business
and you weren't already a fan of VMS, it would have seemed crazy to choose one
of those boxes. They were very much a premium product in a space where
developers with DIY tendencies were suddenly being presented with heaps of
cheaper choices, even if they weren't as reliable or well-documented.

I'm very curious to see what becomes of the x86_64 port; now that I'm a little
older, more patient, and have a few bits of software that I'd like to stay
running all the time, it might be nice to have the option. I think I even
remember enough DCL to get around :)

------
trasz
And yet just now, several decades down the road, there’s a new, official VMS
port to x86.

------
jheriko
the 'Dying Operating System' not being DOS makes me sad.

------
Upvoter33
These complaints haven't aged very well, and are not as funny as when I first
saw them. He's complaining about case sensitivity?

------
jes
UNIX is a product of evolution and as such is messy, with newer subsystems
trying to reuse and control older, simpler subsystems.

Our bodies are the same way.

------
mwcampbell
This passage at the end was the most thought-provoking for me:

> Unix and C however form a powerful deterrent to the average astronomer to
> write her or his own code (and the average astronomer's C is much, much
> worse than his Fortran used to be). The powers-that-be in the software world
> of course have always felt that "ordinary" users (astronomers in this case)
> should be using software and not writing it. The cynic might feel that since
> those same powers nearly all make their living by writing software, and get
> even more pay when they manage other programmers, then they have a vested
> interest in bringing about a state of affairs where the rest of us are
> reduced to mere supplicants, dependent on them for all our software needs.
> It is clear that Unix does not pose an insuperable barrier --- the ever-
> expanding armies of hackers out there are evidence enough that the barrier
> can be scaled given enough time and enthusiasm for the task. But hacking is
> not astronomy, and hackers are not astronomers, and it is astronomy and
> astronomers I worry about. We shouldn't have to scale the Unix barrier, and
> it is all the sadder because, since the advent of a VMS-based Starlink,
> ordinary astronomers have had something denied to most other scientists in
> this country --- readily accessible, reliable, user-friendly computing power
> that can be easily harnessed to a particular astronomical requirement.

I've spent the past hour or so figuring out what I think about this. It's easy
to dismiss this piece of rhetoric as flawed reasoning or as irrelevant today.
For example, I'm not sure if Unix was ever to blame for the self-serving user-
developer dichotomy discussed in that quote above. After all, the Unix
philosophy is all about small tools tied together with scripting, not the
mega-packages mentioned in the article. And was there really not a good
Fortran option for Unix in 1994? Or was Starlink (or its Unix-based successor)
just too cheap to buy whatever good Unix Fortran implementation might have
existed back then?

But the software development priesthood is still a problem; I'd say it's worse
today. It's what worries me the most about the future of computing for the
next generation, including my nieces and nephew. Will they be able to help
shape the software that runs so much of their lives if they can't land a job
at a handful of tech behemoths? Much of that software is even built on a
foundation of Unix and more generally open source, but the power of that
foundation isn't available to the end-users.

However, the free software movement's answers to these problems are hampered
by their own elitism, which is in fact largely a continuation of the Unix
elitism this article bemoans. It's true that we have much better languages
than C, there are usable GUI desktop environments for free Unixes, and there
are even whole open-source integrated development environments that one can
run on a free Unix. But at some point one still comes in contact with the same
arcane, haphazardly designed command-line environment that this article and
the UNIX-HATERS Handbook rightly ridicule.

So if we actually want a system that's both free and open from top to bottom,
and actually approachable to most people, then maybe we need to finally let go
of Unix, or treat it as only an expedient substrate as Apple and Google do. Or
as another writer put it, we need to free our technical aesthetic from the
1970s [1].

[1]: [https://prog21.dadgum.com/74.html](https://prog21.dadgum.com/74.html)

Disclosure: I happen to currently work for Microsoft, on the Windows
accessibility team. But these opinions are entirely my own, written on my own
initiative. And yes, the fact that I benefit somewhat from the user-developer
dichotomy enforced by the likes of my employer makes me feel uncomfortable.
But I'm not sure quitting would actually help anything.

~~~
gnufx
At least in the previous decade, and even at that time in other parts of UK
science, I don't think there was a "software priesthood". It probably invaded
astronomy and HEP earlier than elsewhere, and to some extent now resides in
Research Software Engineering. The first OS on the nuclear structure
interactive graphics system was written by a Birmingham physicist, for
instance. I wrote analysis software as necessary (and read up on software
engineering as well as the other techniques needed for the research, like
electronics and high vacuum systems). The UK Collaborative Computational
Projects were run by scientists who did the work, and you could get Fellow of
the Royal
Society-level help (though the two mainstays of CCP4 only got the title much
later).

I don't know where this free software elitism is prevalent, as someone
involved with it in research since before the term became necessary. I could
appreciate the UHH, by the way, from a time when Unix was largely not free
software.

------
Commodore_64
This is REQUIEM

------
mrgr1eves
And to think that not long ago around 90% of the world's SMS traffic was
processed by VMS ...

~~~
hestefisk
Source?

------
jtdev
I really thought (hoped really) this was going to be an article about Windows.

------
cheph
> Unix and C however form a powerful deterrent to the average astronomer to
> write her or his own code (and the average astronomer's C is much, much
> worse than his Fortran used to be).

Compared to what? If I want to quickly bang out some code and run it,
installing Fedora (or another distro) is probably the least painful way to
get it done.

I would say that Windows is a powerful deterrent to the average person writing
their own code, much more so than Linux.

~~~
wizzwizz4
This is about Unix vs. VMS, not Windows vs. Linux.

~~~
mwcampbell
That's true. Still, I think cheph is right to point out that the situation has
changed since then. I honestly got so caught up in the rhetoric of the OP that
I had forgotten this for a moment.

~~~
chungus_khan
In particular, we now download whatever software tools we like from this
lovely internet thing, so it's less of a concern what programming environment
the OS bundles by default.

Back then your choices on UNIX were to either accept whatever your vendor
included (which would mostly be C), or buy a disk set of some other tools and
hope they actually worked on your machine.

