Over the years I learned most of what is in the early O'Reilly book below through daily use of the commonly used programs it describes.
http://www.oreilly.com/openbook/utp/
http://www.oreilly.com/openbook/utp/UnixTextProcessing.pdf
However, I still go back to this book for the troff examples. The troff manuals I have seen are fine by me, but I believe this book provides more "working examples".
Whenever I read this book again, I am reminded of how much UNIX know-how is probably being lost to new generations that are steered away from learning the foundations that still make up their operating systems.
(No matter how many programs have been or will be written to "replace" them, it is still not difficult to find these old programs on millions of computers. Avoiding any debate about why they are there, it is incontestable that when one knows how to use them, they work wonderfully for the simple things computer users will always need to do.)
Be warned: if one demonstrates the effective use of such old programs to others who have committed themselves to today's large, complex software, the reactions may not be positive. They might be dismissive or insulting; they might accuse one of being a "Luddite", or perhaps a "neckbeard" practicing some unexplained type of "elitism".
The uninitiated person who sees the results of UNIX text processing as nothing short of "magic" is rare indeed. It is that person who might enjoy this book.
Indeed. With Linux in particular there is a growing subset of the community that sees the unix inheritance as something to be ignored or even vilified.
>the unix inheritance as something to be ignored or even vilified.
And I agree with them. Unix was never a great or even a good system. It was just a way to have a multiuser system on cheap hardware.
Already in the late '70s, UNIX was behind the state of the art compared to the Xerox systems and, even more, the Lisp Machines. Unix was technologically a regression.
See "The UNIX Haters Handbook."
Note: I worked as a Linux sysadmin in the early 2000s. I used to love it.
I'd love to know the story about how # ended up being DEL and DEL ended up being SIGINT. There must be some funky terminal hardware explanation.
My very first Unix experience was bash on a VAX 11/785. Which was great until the day I sat at that VAX's teletype console and typed Ctrl-P to bring up the previous command line. On that console, Ctrl-P is "halt immediately".
Hi Nelson! The # and DEL conventions came from Multics.
Note that from the ASCII perspective, DEL does not mean "delete the previous character." It is meant to be overpunched over a character on paper tape: DEL is 0177, all bits set, so punching out every hole on top of any character turns it into DEL. It effectively deletes the character under the punch, not the one before it.
In spite of this, the first DEC machines to use ASCII immediately seized DEL for command-line editing, because it and ESC were the only non-printing characters to have their own keys.
Meanwhile CTSS had begun operating on a variety of terminals and had to use whatever varying features they had available. It adopted " for character erase, ?, :, ~, or _ for line erase, and BREAK, #, or DEL for interrupt, depending on what was available. The second edition of CTSS standardized on ", ?, and BREAK as terminals became more standardized.
In 1966 Multics had begun to be prototyped on CTSS. The TYPSET editor in particular adopted # for character erase and @ for line erase. (There is no reference to a change in triggering interrupts at this time.) I am guessing this is because " and ? were more useful than # and @ in general text, but there is no documentation for the reason.
Multics adopted the neo-CTSS # and @ conventions, and added an elaborate input canonicalization layer that worked on a stream of characters. I think the canonicalization buffering was what led to treating interrupt in the kernel as a character rather than as an out-of-band signal. Notably canonicalization handled the backspace character, but used it for underlining and overstriking rather than for corrections, contrary to our expectations now.
Anyway, Unix adopted the Multics input conventions, with a much simpler canonicalization layer, and kept them until BSD brought Unix to people who were used to DEC operating systems. They retrofitted the DEC conventions of DEL, ^U, and ^C into Unix, where they have stuck to the present day.
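Those characters are still visible today as ordinary terminal settings. Here is a minimal sketch of mine (not from the thread), assuming a POSIX system with termios, that prints the current erase, kill, and interrupt characters:

    /* Print the terminal's erase, kill, and interrupt characters,
       the direct descendants of the conventions discussed above. */
    #include <stdio.h>
    #include <termios.h>
    #include <unistd.h>

    static void show(const char *name, cc_t c)
    {
        if (c == 0177)
            printf("%-6s DEL\n", name);
        else if (c < 040)
            printf("%-6s ^%c\n", name, c + 0100);  /* control character */
        else
            printf("%-6s %c\n", name, c);
    }

    int main(void)
    {
        struct termios t;

        if (tcgetattr(STDIN_FILENO, &t) == -1) {
            perror("tcgetattr");
            return 1;
        }
        show("erase", t.c_cc[VERASE]);  /* # on V7, DEL since BSD */
        show("kill",  t.c_cc[VKILL]);   /* @ on V7, ^U since BSD */
        show("intr",  t.c_cc[VINTR]);   /* DEL on V7, ^C since BSD */
        return 0;
    }

And stty(1) can put the old defaults back: stty erase '#' kill '@' intr '^?' should make a modern terminal behave like V7, # key and all.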
Our expectations now, unfortunately, should still be that backspace is for underlining and overstriking. The man system still uses the TTY-37 conventions, for starters. Indeed groff, which was originally capable of emitting ECMA-48 control sequences, was forcibly dumbed back down to TTY-37 in the late 1990s and early 2000s. FreeBSD recently switched its manual processor from groff to one that has no ECMA-48 capability.
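Those conventions are easy to see in action. A small sketch of mine, assuming nothing beyond stdio: bold is each character overstruck with itself, underline is each character overstruck with an underscore, which is the stream nroff emits for TTY-37:

    /* Emit "bold" and "underlined" text using the TTY-37 overstrike
       conventions: bold = char BS char, underline = '_' BS char. */
    #include <stdio.h>

    static void bold(const char *s)
    {
        for (; *s; s++)
            printf("%c\b%c", *s, *s);   /* overstrike with itself */
    }

    static void underline(const char *s)
    {
        for (; *s; s++)
            printf("_\b%c", *s);        /* overstrike with '_' */
    }

    int main(void)
    {
        bold("NAME");
        putchar('\n');
        underline("file");
        putchar('\n');
        return 0;
    }

Piping the output through ul(1) renders the overstrikes for your terminal, col -b strips them, and a real printing terminal simply overstrikes.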
And of course, after actually reading the article, that's what he used. Oh well, here's a link for the 99% who don't read past the headline (myself included).
They could likely sign up for an account with the Living Computer Museum in Seattle, and possibly find an appropriate machine running actual Unix V7.
Aside, the Living Computer Museum is amazing, and well worth a trip for anyone. Most of the computers are set up with the expectation that you can play with them.
I started picking apart the kernel in my V7 installation and had a bunch of failed reboots because of it. I'm fairly certain the LCM wouldn't be thrilled by that idea.
That said, this is the first I've heard of the LCM and I'm absolutely in love with it. Unfortunately, it seems script kiddies with botnets are too, ruining availability and ease of access for everyone [1]. I wish I could take a trip to the physical location sometime. The DareNET Archives tried to do the same for IRC a while ago, but in the end security concerns, difficulty selecting relevant historical IRC software, and lack of time made it fairly infeasible.
It doesn't run automatically. I thought it was an interesting experiment as a "tip jar" type thing. No, I really don't expect to make even $0.01, but it's more to gauge whether anyone would actually click the button.
It may load the mining thing, but you have to hit the button, and even then it runs for something like 5-10 seconds.
I hate ads, but I thought stuff like this was an interesting alternative.
ed(1) is actually quite usable. I remember editing lots of documents using ed/troff at university in the early '80s, and I still occasionally drop into ed if I want to make a small change to a simple text file. (I remember that using vi was frowned upon as an egregious waste of memory.)
Funnily enough, a couple of weeks ago I tried Unix V7 and the disk image I had didn't have vi... so ed it was! I learned just enough ed to write and compile Hello World using the system C compiler. Super fun!
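For anyone tempted to repeat the exercise, a session along those lines might look like this. A sketch, not a transcript of the above: the 45 is just ed reporting the byte count after w, and the old-style main() is what V7's pre-ANSI cc expects.

    $ ed hello.c
    ?hello.c
    a
    main()
    {
            printf("hello, world\n");
    }
    .
    w
    45
    q
    $ cc hello.c
    $ a.out
    hello, world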