Hacker News
The Joys of Unix – NSA Cryptolog (1978) (archive.org)
78 points by Mithrandir on March 6, 2015 | 16 comments



Interesting that he praises the RAND editor. It's a branch of editors that has entirely died out as far as I can tell, on Unix at least. The source code for RAND E19 can still be found, but I couldn't get it running.


I also failed to find recent information on RAND's "e" editor, but from reading the article the interface/commands seem extremely similar to Vi.


Cryptolog magazines are always interesting.

One thing that intrigued me was the mention of the POGOL language. There seem to be no references to it on the internet, which makes me suspect it is an arcane language specific to a three-letter agency. That agency could be either IBM or the NSA... both seem equally likely.


There were a lot of ALGOL variants around in the 1960s and '70s (see the Wikipedia page for ALGOL), including one called GOGOL for the PDP-1. My first guess would be that POGOL was an ALGOL-alike for the IBM 360, perhaps developed internally by the NSA?


I would assume an ALGOL variant.

Given that this was in '78 (which is quite late in ALGOL's lifetime), it's possible the "P" referred to the Pascal-P porting kit[1]?

That's speculation of course, but the ALGOL name variations generally did mean something.

[1] http://en.wikipedia.org/wiki/Pascal_(programming_language)#T...


All this on a minicomputer that costs a mere $300,000? No way!

Off the top of my head, I'd guess that would run you $6m today.
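
For anyone who wants to sanity-check that, here's a rough sketch of the calculation (the CPI values are approximate assumptions, not looked up): a plain CPI adjustment from 1978 to 2015 lands closer to $1.1M, so a figure like $6M presumably comes from a different yardstick, such as relative wages or share of GDP.

    #include <stdio.h>

    int main(void) {
        /* Assumed approximate U.S. CPI-U annual averages: 1978 ~ 65, 2015 ~ 237. */
        const double cpi_1978 = 65.0;
        const double cpi_2015 = 237.0;
        const double price_1978 = 300000.0;

        /* Simple CPI adjustment: scale the 1978 price by the CPI ratio. */
        double price_2015 = price_1978 * (cpi_2015 / cpi_1978);
        printf("$%.0f in 1978 is roughly $%.0f in 2015 dollars\n",
               price_1978, price_2015);   /* about $1.09M */
        return 0;
    }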


Does anyone happen to know why the shell got named a "shell"?


Just speculation, but perhaps it's because it "wraps" the kernel, i.e. it's the kernel's shell...


That was always my understanding. In older mainframe operating systems, the command interpreter was more integral to the OS. In Unix, it's just a process. Its function is to run programs for the user, so it is in some sense a shell around the OS seen by the user.
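
To make that concrete, here's a minimal sketch (purely illustrative, not any historical shell) of the idea that a shell is just an ordinary user process whose job is to run other programs: read a command, fork, exec, wait.

    #include <stdio.h>
    #include <string.h>
    #include <sys/wait.h>
    #include <unistd.h>

    /* A toy shell loop: read a line, split it into words, fork, exec, wait.
       No pipes, redirection, or quoting -- just the "run programs for the user" core. */
    int main(void) {
        char line[1024];

        for (;;) {
            fputs("$ ", stdout);
            if (fgets(line, sizeof line, stdin) == NULL)
                break;                         /* EOF: exit the shell */

            /* Tokenize on whitespace. */
            char *argv[64];
            int argc = 0;
            for (char *tok = strtok(line, " \t\n");
                 tok != NULL && argc < 63;
                 tok = strtok(NULL, " \t\n"))
                argv[argc++] = tok;
            argv[argc] = NULL;
            if (argc == 0)
                continue;                      /* empty line */

            pid_t pid = fork();
            if (pid == 0) {                    /* child: become the program */
                execvp(argv[0], argv);
                perror(argv[0]);               /* only reached if exec failed */
                _exit(127);
            } else if (pid > 0) {              /* parent: wait for it to finish */
                int status;
                waitpid(pid, &status, 0);
            } else {
                perror("fork");
            }
        }
        return 0;
    }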


That's the way I see it, and to extend the analogy a bit further, Unix et al. are like a hermit crab that can trade one shell for another if there is a good reason to do so.


Yup, I've always thought of it as a sort of generic "airlock" between userspace and kernel-land, where people can do things like send signals to processes or dump the contents of a file for easy viewing.



"I became, in fact, less and less of a programmer at all, and more and more simply a procedure-writer who tacked together canned routines or previously debugged POGOL steps to do dull things in a dull way". This was when the author had to work with layers of accidental complexity of the IBM 360s, before discovering Unix. It gives hope to know that such complexity existed even in 1978, but that it was simply bad engineering, as the advent of Unix later proved.

I marked two snippets from Coders at Work where Guy Steele and Ken Thompson both lament the increasing number of layers in modern computing. It is perhaps inevitable, but it is worth wondering at.

##

Seibel: What has changed the most in the way you think about programming now, vs. then? Other than learning that bubble sort is not the greatest sorting technique.

Steele: I guess to me the biggest change is that nowadays you can't possibly know everything that's going on in the computer. There are things that are absolutely out of your control because it's impossible to know everything about all the software. Back in the '70s a computer had only 4,000 words of memory. It was possible to do a core dump and inspect every word to see if it was what you expected. It was reasonable to read the source listings of the operating system and see how that worked. And I did that—I studied the disk routines and the card-reader routines and wrote variants of my own. I felt as if I understood how the entire IBM 1130 worked. Or at least as much as I cared to know. You just can't do that anymore.

##

Seibel: Reading the history of Unix, it seems like you guys basically invented an operating system because you wanted a way to play with this computer. So in order to do what today might be a very basic thing, such as write a game or something on a computer, well, you had to write a whole operating system. You needed to write compilers and build a lot of infrastructure to be able to do anything. I'm sure all of that was fun for its own sake. But I wonder if maybe the complexity of modern programming that we talked about before, with all these layers that fit together, is that just the modern equivalent of, “Well, first step is you have to build your own operating system”? At least you don't have to do that anymore.

Thompson: But it's worse than that. The operating system is not only given; it's mandatory. If you interview somebody coming out of computer science right now, they don't understand the underlying computing at all. It's really, really scary how abstract they are from what a computer is or even the theory of computing. They just don't understand it.

Seibel: I was thinking about your advice to your son to go into biology instead of computing. Isn't there something about programming—the intellectual fun of defining a process that can be enacted for you by these magical machines—that's the same whether you're operating very close to the hardware or at an abstract level?

Thompson: It's addictive. But you wouldn't want to tell your kid to go into crack. And I think it's changed. It might just be my aging, but it seems like when you're just building another layer on top of another layer on top of another layer, you don't really get the benefit of writing, say, a DFA. I think by necessity algorithms—new algorithms are just getting more complex over time. A new algorithm to do something is based on 50 other little algorithms. Back when I was a kid you were doing these little algorithms and they were fun. You could understand them without it being an accounting job where you divide it up into cases and this case is solved by this algorithm that you read about but you don't really know and on and on. So it's different. I really believe it's different and most of it is because the whole thing is layered over time and we're dealing with layers. It might be that I'm too much of a curmudgeon to understand layers.


On a similar note, there is also Niklaus Wirth's rant about complexity in his ACM Turing Award speech.

http://dl.acm.org/ft_gateway.cfm?id=1283941&type=pdf

To the point that his Oberon-07 redesign had as its goal a minimalist, garbage-collected systems programming language.[0]

http://www.inf.ethz.ch/personal/wirth/Oberon/Oberon07.pdf

http://www.inf.ethz.ch/personal/wirth/Oberon/Oberon07.Report...

[0] Too minimalist, actually, as many of us might say.


This got me thinking: I wonder if places like St. John's College[1] might eventually take up teaching directly from Turing and von Neumann. Maybe one day those sorts of great-books-focused liberal arts colleges will be the only places left teaching the low-level stuff and the basic principles of computing, outside of a handful of hyper-focused engineering programs designed to meet the industry's limited need for people who understand it well.

[1] https://en.wikipedia.org/wiki/St._John%27s_College_(Annapoli...

(I can't speak to the rigor of that program from direct experience, but look at the reading list: it's not all the fluffy philosophy and literary criticism that many imagine when they think of liberal arts programs. They cover the major Greek mathematicians in the first two years, then it's off to Enlightenment-era mathematics. John von Neumann or Turing, and maybe even something like TAoCP, wouldn't be out of place.)


The one St. John's grad I know well used to have a copy of a book including "On Computable Numbers" on his mantelpiece. He has worked in the tech business for many years.



