HAKMEM instilled in me a lifelong love affair with the architecture of the PDP-10.
This memo was quite influential on 14-year-old me, mainly because I could not understand a word of it. It caused me to look many things up (which was hard in the pre-web 70s) and served well to show me the depths of my lack of understanding of the size and shape of the world. I ran into Gosper on one of his visits east and, though I didn't ask him about munching squares, I did receive an unsolicited and incredibly valuable lesson on using the Lispm debugger and DDT. He's still a friend today. RG was also highly influential, though in the opposite way (he assumed I understood far more than I did, so would pass along crumbs in order not to waste my time or his). Nelson I've never met.
Are there still opportunities for random kids to be exposed to such things? It feels like, despite there being several orders of magnitude more information at your fingertips now, life has become more professionalized and constrained.
> The items and examples are so sketchy that to decipher them may require more sincerity and curiosity than a non-hacker can muster.
They then go on to contemplate the intricacies of peg solitaire, at which point your mind can't help but juxtapose submarine-based global nuclear strategy with goofing off on placemat games at all-night diners over a plate of French toast.
And with that you have to try to gain some perspective. Geeking out about fractals and pentominoes and Mersenne primes is where it all really starts. Deceptively simple fundamentals are the things that ignite our curiosity.
Trying to package it up in a ready-made kit can spoil some of the discovery. It's really about stumbling onto unassuming riddles that don't give away a hint of the rewarding surprise waiting inside; if you sense that it's all going to be spelled out for you, it becomes something of an anticlimax. The part that leaps out at you needs to do so with a degree of authenticity.
When I was around 10 in the mid 90s, I started trying to read my mom's copy of John Fraleigh's "A First Course in Abstract Algebra". It gave me the same feeling of realizing the sheer depth of how little I knew, and was one of the things that piqued my interest in math and physics.
As long as parents keep interesting books on their shelves, I think we'll be fine.
For most disciplines, there are places on the web where esoteric information can be found. Wikipedia is very strong on math and computer science topics, and I recognize the feeling that you are describing of realising how much there is to learn. The hyperlinked nature of Wikipedia amplifies this.
The phenomenon of professionalization has probably made computing less of an intellectual activity, and more of a routine business. Is that what your perception is? It doesn't help that the signal-to-noise ratio is now much lower.
> The phenomenon of professionalization has probably made computing less of an intellectual activity, and more of a routine business. Is that what your perception is?
Not just computing. Compared to my childhood in the 60s/70s (I'm Gen X, though my early childhood was not in the US), life is highly professionalized. Sport used to be played by everyone; now even school kids specialize and have fancy training. Music is less ad hoc, though admittedly it had become a factory product by the 60s.
And it was easier to tinker. Nowadays people talk of "building" a PC when they are just gluing together Lego blocks. That is great in that it opens up the opportunity for many more people to program, but at the cost of hardware tinkering. It used to be obvious by looking at a car motor what was going on. Now they are boring.
When video games were starting they were a gateway to programming for lots of kids. Now they are part of an ecosystem of toys that play with the kids (rather than the kids playing with them) and are generally part of an integrated marketing ecosystem ("franchise") that has characters and storylines that integrate film, video, toys...where is open ended imagination? Fandom is more consumption of someone else's thinking now than it used to be.
I know, sounds like a curmudgeon's lament.
> It doesn't help that the signal-to-noise ratio is now much lower.
I am not sure that's true. There's a survivorship bias in that we still read the work of the greats, while most of the crud is lost. I recently tossed out several shelves' worth of old CS papers from the 60s-80s that I knew I'd never look at again. Dead-end CPU designs, absurd (in retrospect) claims, all sorts of crud.
There is lots of fabulous thinking from the ancient Greeks, but what's survived is an infinitesimal portion of what was produced...I assume none or hardly any of what perished was worth re-reading.
Is there anything about this lesson on using the Lispm debugger that you could convey in a comment? I have a Lisp machine myself, and am painfully aware that there must be a lot of lore that is not adequately documented, and am trying to preserve as much as I can.
I wasn't using the debugger at all, and Gosper showed me what a great tool it was (note: Plauger disagrees). What was great about the Lisp debuggers (MACLISP, MIT Lispm, and Interlisp-D were the only Lisp ones I ever used extensively, though I also found this functionality really handy in PARC Smalltalk) is that they were part of the environment, so from the debugger (or another window, like your editor, in the case of the Lispm and D machines) you could write little routines to explore the problem, update a buggy function, and then continue from wherever you wanted in the stack.
In this they were like the assembly-language debuggers, such as DDT, which was our "shell" on the ITS machines.
Compare this to the Unix paradigm, where you have to decide explicitly to run under the debugger (or at best dredge through a core dump): you often can't really look at what caused the problem, since whatever caused it (open files, network connections, and the like) is gone; your debugger runs in a different environment (process) from the program you're debugging, and doesn't have the same affordances, i.e. no programming language. You can't imagine how many people were blown away in the 90s by the ability in gdb to simply call a function in your target program when stopped at a breakpoint. To me that has always been basic functionality.
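For anyone who hasn't seen it, here is roughly what that looks like; this is just a made-up toy C program of mine, with the (standard) break/run/call/print/continue commands shown in a trailing comment as a sketch of the session:

    /* toy.c -- build with: cc -g -o toy toy.c */
    #include <stdio.h>

    int counter = 0;

    int bump(int n)            /* something we might want to poke at live */
    {
        counter += n;
        return counter;
    }

    int main(void)
    {
        for (int i = 0; i < 10; i++)
            bump(i);
        printf("counter = %d\n", counter);
        return 0;
    }

    /* Stopped at a breakpoint, gdb can call bump() inside the live
     * process and mutate its state before letting it run on:
     *
     *   (gdb) break main
     *   (gdb) run
     *   (gdb) call bump(100)
     *   $1 = 100
     *   (gdb) print counter
     *   $2 = 100
     *   (gdb) continue
     *   counter = 145
     */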
It's rather shocking and frustrating to me that the development environments today are in many ways much more primitive than we enjoyed in the 1970s! I bang my head against the limitations of VScode and Xcode and go back to Emacs which at the end of the day is sadly just as powerful.
I used Smalltalk for a while, so I at least had this rich debugger experience there: being able to change code whilst halted at a breakpoint and resume with the new code taking effect immediately, no world recompilation needed; being able to compose complicated queries in the debugger to do consistency checking of the live data structures I was looking at, stuff like that. The thing is, I did not learn any of that from a book; I had to learn it from someone else who knew it.
Thanks for the details, sounds like I need to look at DDT too.
I can't remember where I read this, but I think it was about Marvin Minsky: a story about programming in the debugger, where the subject started with an empty program and ran it, which immediately put him in the debugger. There he built the program up a statement at a time, handling whatever the program needed to do at the step it was currently at, and it would dump him back at a breakpoint every time it reached a part where he had not yet specified what was to be done. The whole program was eventually composed in one debugging session, starting from an empty program. I suspect this must have been done in DDT, or something much like it. Does that sound plausible to you?
I have a friend who scorns exploratory programming as "programming by successive approximation", a statement which I have to admit has some truth to it. This story sounds like something he might say.
I found the reference [1]. I think this style has its place, like when you don't know exactly what you need in general, but can make concrete decisions as you encounter them. Kind of like the rough sketching of programming. I wouldn't want to fly in a plane whose control software was written this way though.
The HAKMEM ("hacks memo") paper, compiled by Bill Gosper and others, details work done at the MIT AI Lab by the first known code hackers.
Stories about the MIT hackers (Bill Gosper, Richard Greenblatt, and Stewart Nelson) are detailed in Steven Levy's book "Hackers: Heroes of the Computer Revolution", about the beginnings of the hacker culture.
The chess problem described in item 70 always surprised me. It just didn't seem worthy of inclusion. A few years ago I happened across the book by Lasker that it was drawn from and found that they'd made a typo in their description of the problem (the bishop should be at KB8, not KB7), one that still leaves it a mate-in-3 as described, but nowhere near as interesting as what it should have been.
I wrote it up in /r/chess: Problem 66 from Chess for Fun, Chess for Blood by Lasker: white to mate in 3 [1]
I'm trying to make sense of what this document is: it seems to be random trivia that the MIT AI Lab produced in a relatively unstructured format? Am I understanding this right?
> A legendary collection of neat mathematical and programming hacks contributed by many people at MIT and elsewhere. (The title of the memo really is “HAKMEM”, which is a 6-letterism for ‘hacks memo’.) Some of them are very useful techniques, powerful theorems, or interesting unsolved problems, but most fall into the category of mathematical and computer trivia.
"Solving any of the problems was rewarded with prizes ... For problem 153, which was later recognized as being closely related to Stefan Banach's "basis problem", Stanisław Mazur offered the prize of a live goose. This problem was solved only in 1972 by Per Enflo, who was presented with the live goose in a ceremony that was broadcast throughout Poland."
Some of the memos contain original mathematical research of high quality: for example, Gosper's algorithm for continued fraction arithmetic is original [1], even mathematicians like Lagrange did not mention anything about arithmetic operations on continued fractions [2].
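For a flavor of it, here is a toy sketch (my own code, not Gosper's, and in C rather than PDP-10 assembly) of the homographic case: streaming out the partial quotients of (a*x + b)/(c*x + d) while streaming in the partial quotients of x. It glosses over the termination, sign, and two-argument (x op y) cases that Gosper's full treatment covers:

    /* cf.c -- toy sketch of homographic continued fraction arithmetic:
     * emit the partial quotients of (a*x + b)/(c*x + d) while absorbing
     * the partial quotients of x, one at a time.  Termination, signs,
     * and the bihomographic (two-input) case are ignored here.
     */
    #include <stdio.h>

    /* input stream: x = sqrt(2) = [1; 2, 2, 2, ...] */
    static long next_input(void)
    {
        static int first = 1;
        if (first) { first = 0; return 1; }
        return 2;
    }

    /* floor division (C's "/" truncates toward zero instead) */
    static long fdiv(long n, long d)
    {
        long q = n / d;
        if (n % d != 0 && (n < 0) != (d < 0))
            q--;
        return q;
    }

    int main(void)
    {
        /* compute (x + 1)/(x - 1) for x = sqrt(2), i.e. 3 + 2*sqrt(2) */
        long a = 1, b = 1, c = 1, d = -1;
        int emitted = 0;

        while (emitted < 10) {                    /* just print 10 terms */
            if (c != 0 && d != 0 && fdiv(a, c) == fdiv(b, d)) {
                /* the integer part is pinned down: emit it */
                long q = fdiv(a, c);
                printf("%ld ", q);
                emitted++;
                long na = c, nb = d, nc = a - q * c, nd = b - q * d;
                a = na; b = nb; c = nc; d = nd;
            } else {
                /* not enough information yet: absorb a term of x */
                long p = next_input();
                long na = a * p + b, nb = a, nc = c * p + d, nd = c;
                a = na; b = nb; c = nc; d = nd;
            }
        }
        printf("\n");    /* prints: 5 1 4 1 4 1 4 1 4 1 */
        return 0;
    }

Run as written, it produces the start of the continued fraction of 3 + 2*sqrt(2), which is [5; 1, 4, 1, 4, ...].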
I have read HAKMEM (I think I first read it in the twenty-first century), and I like it. I don't know PDP-10 programming, but I hope that I can learn, so that I can understand more of the stuff in HAKMEM, some of which is difficult if you do not know PDP-10 programming. Is there emulation available?