
"I became, in fact, less and less of a programmer at all, and more and more simply a procedure-writer who tacked together canned routines or previously debugged POGOL steps to do dull things in a dull way". This was when the author had to work with layers of accidental complexity of the IBM 360s, before discovering Unix. It gives hope to know that such complexity existed even in 1978, but that it was simply bad engineering, as the advent of Unix later proved.

I marked two snippets from Coders at Work where Guy Steele and Ken Thompson both lament the increasing number of layers in modern computing. It is perhaps inevitable, but it is worth wondering at.

##

Seibel: What has changed the most in the way you think about programming now, vs. then? Other than learning that bubble sort is not the greatest sorting technique.

Steele: I guess to me the biggest change is that nowadays you can't possibly know everything that's going on in the computer. There are things that are absolutely out of your control because it's impossible to know everything about all the software. Back in the '70s a computer had only 4,000 words of memory. It was possible to do a core dump and inspect every word to see if it was what you expected. It was reasonable to read the source listings of the operating system and see how that worked. And I did that—I studied the disk routines and the card-reader routines and wrote variants of my own. I felt as if I understood how the entire IBM 1130 worked. Or at least as much as I cared to know. You just can't do that anymore.

##

Seibel: Reading the history of Unix, it seems like you guys basically invented an operating system because you wanted a way to play with this computer. So in order to do what today might be a very basic thing, such as write a game or something on a computer, well, you had to write a whole operating system. You needed to write compilers and build a lot of infrastructure to be able to do anything. I'm sure all of that was fun for its own sake. But I wonder if maybe the complexity of modern programming that we talked about before, with all these layers that fit together, is that just the modern equivalent of, “Well, first step is you have to build your own operating system”? At least you don't have to do that anymore.

Thompson: But it's worse than that. The operating system is not only given; it's mandatory. If you interview somebody coming out of computer science right now, they don't understand the underlying computing at all. It's really, really scary how abstract they are from what a computer is or even the theory of computing. They just don't understand it.

Seibel: I was thinking about your advice to your son to go into biology instead of computing. Isn't there something about programming—the intellectual fun of defining a process that can be enacted for you by these magical machines—that's the same whether you're operating very close to the hardware or at an abstract level?

Thompson: It's addictive. But you wouldn't want to tell your kid to go into crack. And I think it's changed. It might just be my aging, but it seems like when you're just building another layer on top of another layer on top of another layer, you don't really get the benefit of writing, say, a DFA. I think by necessity algorithms—new algorithms are just getting more complex over time. A new algorithm to do something is based on 50 other little algorithms. Back when I was a kid you were doing these little algorithms and they were fun. You could understand them without it being an accounting job where you divide it up into cases and this case is solved by this algorithm that you read about but you don't really know and on and on. So it's different. I really believe it's different and most of it is because the whole thing is layered over time and we're dealing with layers. It might be that I'm too much of a curmudgeon to understand layers.
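
For anyone who hasn't written one, a DFA is exactly the kind of small algorithm Thompson means, something you can hold in your head all at once. A sketch in C (my own illustration, not from the book): a two-state machine that accepts binary strings containing an even number of 1s.

    #include <stdio.h>

    /* Two states: 0 = even number of 1s seen so far, 1 = odd.
       Reading a '1' flips the state; reading a '0' is a self-loop. */
    int accepts(const char *s) {
        int state = 0;                /* start state: "even" */
        for (; *s; s++)
            if (*s == '1')
                state = !state;
        return state == 0;            /* accept iff we end in "even" */
    }

    int main(void) {
        printf("%d %d\n", accepts("1001"), accepts("111")); /* prints: 1 0 */
        return 0;
    }

The whole machine fits in a dozen lines, and there is nothing underneath it that you have to take on faith.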




On a similar note, there is Niklaus Wirth's rant about complexity in his ACM Turing Award lecture as well.

http://dl.acm.org/ft_gateway.cfm?id=1283941&type=pdf

To the point that his Oberon-07 redesign had as its goal a minimalist garbage-collected systems programming language. [0]

http://www.inf.ethz.ch/personal/wirth/Oberon/Oberon07.pdf

http://www.inf.ethz.ch/personal/wirth/Oberon/Oberon07.Report...

[0] Too minimalist, actually, as many of us might say.


This got me thinking: I wonder if places like St. John's College[1] might eventually take up teaching directly from Turing and von Neumann. Maybe one day those sorts of great-books-focused liberal arts colleges will be the only places left teaching the low-level stuff and the basic principles of computing, outside a handful of hyper-focused engineering programs designed to meet the industry's limited need for people who understand it well.

[1] https://en.wikipedia.org/wiki/St._John%27s_College_(Annapoli...

(I can't speak to the rigor of that program from direct experience, but look at the reading list: it's not all the fluffy philosophy and literary criticism that many imagine when they think of liberal arts programs. They cover the major Greek mathematicians in the first two years, then move on to Enlightenment-era mathematics. John von Neumann or Turing, and maybe even something like TAoCP, wouldn't be out of place.)


The one St. John's grad I know well used to have a copy of a book including "On Computable Numbers" on his mantelpiece. He has worked in the tech business for many years.



