The book that taught me to "really" program is certainly the quirkiest I've ever read: "Learn to Program with Visual Basic 6" [0]
My dad purchased it for me when I was nine. He didn't know how to program, but he took a stab in the dark. I diligently read through the whole book and worked through all of the lessons. It's written in a conversational, narrative style about a college course where the students produce a piece of software for a china shop.
I think you'd be hard-pressed to actually derive much value from this book unless you're keen to learn an antiquated version of Visual Basic and have the patience for a book targeted at absolute beginners, but it's definitely quirky. And for me, holds a lot of sentimental value.
Not a computing book with code, but it taught me a lot about what can be done with computers, and that they are marvelous machines.
- Architecture of Symbolic Computers - Kogge
An ode to what could have been if RISC and x86 hadn't steamrolled the landscape.
- Thinking Forth - Leo Brodie
Forth is wild, this book is wild, it's really a different world. After multiple attempts over 25 years, I'm giving it another go this year and this time it is actually clicking.
- Malicious Cryptography: Exposing Cryptovirology - Adam Young, Moti Yung
This is way out of my league in terms of cryptography, but it fascinated me early in my career and led me to implement an algorithm or two. Very prescient in many ways.
- Patterns of Software - Richard Gabriel
Not directly about computing, but I got a lot of deep insights from this book. It also brought me to the next book:
- Notes on the Synthesis of Form - Christopher Alexander
This is not a computing book per se, but it goes to the core of system architecture. Even in its original field, this book is kind of wild.
- All Mathematica Guidebooks - Michael Trott
Mathematica is wild, these books are wild, they are old but still oh so inspiring.
> Mathematica is wild, these books are wild, they are old but still oh so inspiring.
Great recommendation. Are these really so old? I skimmed through [1] and they seem to be still in print and relevant?
What other inspiring Mathematica literature is there? I have been considering Mathematica to cover some of the gaps Julia (and R or Python) have, particularly in the symbolics camp [2].
It's also nice Mathematica is free on Raspberry Pi, but it might be too slow to be of any practical use.
I should maybe have used another word. Mathematica has changed so much (mostly by accruing functionality) that a lot of these examples seem old in the way they're written. The content is just as good as ever, and nothing beats fundamentals anyway.
Very good list, I look forward to diving into those that I've not had any exposure to. Without a doubt Kogge and Brodie will appear on my list one day.
I see a few mentions of beginner-level books that have struck a chord with people here and I empathise, as by far the computer book with the most formative impact on me was the Macintosh Bible 4th ed.
Certainly quirky, it was also one of the best attempts to capture why the Mac was different and why UX mattered, and it codified for a young me a certain way of thinking about apps, much as the Hacker's Dictionary did for an earlier era.
I think one of the first "quirky" computer books I can remember is a user manual for the Epson MX-80 dot matrix printer from the 1980s. It was written in a much more relaxed, personable style than almost any other computer books of the time. https://www.apple.asimov.net/documentation/hardware/printers...
But if it’s the same manual I had for that same printer, it was hysterical. For example at one point they showed how to generate arbitrary graphics for the 7 (I think) pin head by encoding as binary. Then just before the next section the manual said something like “now before you run off to forge a copy of the Mona Lisa…”
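(Roughly, the trick that section was teaching, as I understand it: each printed column is one byte, one bit per pin. The little glyph and helper below are just my illustration; the actual pin-to-bit mapping and the escape sequence that puts the printer into graphics mode varied by model and firmware, so they're omitted here.)

    #include <stdio.h>

    /* Illustration only: turn a 7-row ASCII-art bitmap into the column
     * bytes a 7-pin Epson-style head would fire, one byte per column.
     * The real pin ordering and graphics-mode escape codes depend on
     * the printer, so treat this as a sketch of the encoding idea. */
    static const char *glyph[7] = {
        ".XXX.",
        "X...X",
        "X...X",
        "XXXXX",
        "X...X",
        "X...X",
        "X...X",
    };

    int main(void) {
        for (int col = 0; col < 5; col++) {
            unsigned char byte = 0;
            for (int row = 0; row < 7; row++)
                if (glyph[row][col] == 'X')
                    byte |= (unsigned char)(1 << row);
            printf("column %d -> 0x%02X\n", col, byte);
        }
        return 0;
    }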
That's the one. It was full of those fun little notes. A nice balance of humor without going over the top. Helped me figure out how to use all of the features of the printer.
It's entirely hand-lettered. The author's afterword remarks that an attentive reader will notice the shakier lettering toward the end of the book.
He wrote a couple other books on programming (I think he did Illustrating Pascal). It's a quirky approach, but very friendly and unintimidating compared to vanilla textbooks.
I came here to ask if anyone could remember this weird hand-lettered book on BASIC which illustrated all kinds of concepts, including linked lists, using arrays of boxes and arrows. I believe this must have been it unless there’s another book that was like this. I vividly remember as a kid reading all these different books about computers and programming. I found that I had to read a bunch of books in order to find the right mix of explanations that gelled in my mind. Instead of reading one or two books on BASIC and getting frustrated, I probably read a dozen. This one really appealed to my visual brain.
I was just going to comment on this book; one of my favorites as well. The follow-on book "Thinking Forth" was similarly mind-opening. http://thinking-forth.sourceforge.net/
I have a book on Forth whose title and author I can't remember at the moment. It had a blue cover a bit like Starting FORTH (I'm pretty sure it wasn't that one) but was typeset on a 7-pin dot-matrix printer. Literally all the body copy is dot-matrix.
Oh, also the Jupiter Ace programming manual, which starts off with a very simple introduction to Forth and by around Chapter 22 has circuit diagrams for homebrew peripherals.
My favorite quirky computing book was something I found in the library by chance at my university when I should've been attending a Principles of Programming Languages course.
We'd been learning Prolog in class for the past two weeks but I'm terrible at learning from lectures, so eventually I decided it'd be a better use of my time to locate a book I could teach myself from rather than doodling in class.
IIRC it hadn't been checked out since the '80s: it was a slim volume on Prolog with an Alice in Wonderland theme. I can't remember the title or anything, but it was an enjoyable read, and effective: I still hadn't written any Prolog at the time of the exam—which I remember was 4 days out when I picked up the book—but I understood it well enough by then to solve all the problems without flaw, including an extra-credit challenge problem :)
That looks like a nice book, but definitely not it (I may read the datalog bits though—thanks!).
The one in the library was ~200 pages, solely on Prolog, with Alice in Wonderland not only in the cover art but used constantly throughout the writing itself.
Following the numbered citation, I think this is the table of contents from a different book, originally published in Finnish by Jaak Henno, which would have an English title something like “It is simple with Prolog!”. He also seems to have published some of the content separately under the title “Prolog and Olympic Gods”.
How long ago did you check it out? If it was recent enough, then the library you checked it out from would have record of it. I'd love to know what the book was!
I've searched and searched and can't find any Prolog book that matches this description. Hopefully you're able to find or remember it.
Unfortunately it’s been about a decade and the library is in another state from where I now live. I did another round of searching myself and couldn’t turn anything up either :/
It's not as high-minded as the examples in the link, but as a kid I really enjoyed Woody Leonhard's "Mother of All..." books about Windows (the 3.1 and 95 ones, specifically).
Woody introduced a series of characters, each with their own personality & level of familiarity with Windows itself, then used them in asides to explain things. The great thing about those books (and something I seldom see anymore) is how well they rode the line between 'the power button is the button you press to turn the computer on' and 'the A20 gate defines when low memory etc. etc. etc.'
Having the characters gave a great way to get super-deep into minutiae but let the reader know they could skip if it didn't interest them, plus their interactions with each other were really fun.
Also, and I'm sure most people know him already, I always really loved how David Pogue would put weird little stories or dialog in the examples he'd give when demonstrating a program. Like I think Macs for Dummies had a bit where his Word examples had a really flowery story about a guy riding a rollercoaster or something. Really influenced me, whenever I create a demo UI or example page I try not to use boring "This is example text" or "Lorem ipsum."
The Microwriter is a one-handed portable digital word processor from 1978. It has an excellent 'New user's guide' that is easy-to-read and full of humour too.
The user guide is written as a conversation between the "author" and a cartoon picture of a key (button) with a face, arms and legs. The key represents the impatient and inquisitive user who can't wait to start using the device. It's worth remembering that most users were unfamiliar with computer jargon:
<author> Don't worry: correcting is easy but first you must learn about COMMANDS.
Aside: Bill Buxton, the computer scientist and designer, had this to say of the guide: "I think that the New User’s Guide is one of the best examples of technical writing that I have ever seen in a user’s manual. I love the parallel use of different representations to get the message across. I really appreciated it when I was learning, and think that the manual is worth studying for its approach."
After going to college for journalism and graphic design in the early 90s, I decided to become a programmer, despite not having done it since I was in grade school (though, I was pretty hot shit on my TRS-80). My first book on the topic was, in retrospect, pretty crazy, but I've been a tech professional for 25+ years now, so it must have been useful. I'm speaking of The Cartoon Guide to Computers [1][2]. If there's a more quirky (but useful!) computing book than that, I'd be pretty surprised.
I remember specifically learning about flip-flops from the book, and wondering what sort of crazy magical physics made it work. It took me 20 years to finally learn it simply has to do with the timing of the electrical signal as it propagates around the gates [3].
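(A toy model of that "magic", mine and not the book's: a NOR-based SR latch is just two gates feeding each other, and the stored bit is whatever the feedback loop settles into.)

    #include <stdio.h>

    /* Toy model of a NOR-based SR latch: two cross-coupled gates.
     * We iterate the feedback loop a few times until the outputs stop
     * changing, standing in for the settling that real gate delays provide. */
    static void sr_latch(int s, int r, int *q, int *qn) {
        for (int i = 0; i < 4; i++) {           /* a few passes is enough to settle */
            int new_q  = !(r || *qn);           /* Q  = NOR(R, notQ) */
            int new_qn = !(s || *q);            /* notQ = NOR(S, Q)  */
            *q = new_q;
            *qn = new_qn;
        }
    }

    int main(void) {
        int q = 0, qn = 1;
        sr_latch(1, 0, &q, &qn);   /* set   */
        printf("after set:   Q=%d\n", q);
        sr_latch(0, 0, &q, &qn);   /* hold: the loop remembers the bit */
        printf("after hold:  Q=%d\n", q);
        sr_latch(0, 1, &q, &qn);   /* reset */
        printf("after reset: Q=%d\n", q);
        return 0;
    }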
Re-reading it again now, it's actually a surprisingly thorough overview of the history and fundamentals of computer science. I had forgotten.
Writing Interactive Compilers and Interpreters by Peter John Brown (1979) is quirky in the sense that he paid unusual attention to the literary quality of his prose.
Among his admonitions for writing user manuals is this:
Read. Books on overall style are a matter of personal taste. Our own favourite is Strunk (1959). As examples of good writing on technical matters, we especially like the works of D. E. Knuth and of M. V. Wilkes, and also the book "Software Tools" (Kernighan and Plauger, 1976). Perhaps the most pleasant of all writing on technical matters is in the field of gardening, not computing; read "The Small Garden" by Lucas Phillips (1952) and not only your turnips but also your writing will improve.
I appreciate the effort, but other than the play on Ed Wood's movie title, I'm not exactly sure what makes this book "quirky". Skimming through the first dozen pages available for free, it struck me as a relatively academic and dry treatise on Scheme.
You will have to dive a little bit deeper. The S9fES described in the book is a tree-walking interpreter with decimal real-number arithmetic and a few other interesting (in an almost brutalist sense) design decisions. "Dry and academic" and "quirky" do not exclude each other, but you have to enjoy the subtle points and not expect to be hit over the head with superficial weirdness.
The other commenter said that it's a joy to read and that's certainly true. If you're interested in software folk tales and such, it's worth getting.
I still see it recommended as a practical book, however, and indeed the book bills itself as "the second book you need on C", the book that will cover topics that other C books don't explain or explain poorly. But it's much too outdated to serve that purpose. In practice that means chapters discussing differences between K&R C and ANSI C and deep dives into details of SunOS and MS-DOS compilers.
Some of the material has become misleading because the C language has evolved. For example, there's a lot of discussion about pointers and arrays, as can be expected. But there's no mention of strict aliasing and pointer provenance (these rules existed in C89, but compilers at the time didn't exploit them yet for optimization). And of course no variable-length arrays (introduced in C99). So you're not going to learn what you need to know in today's world.
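(To make the strict-aliasing point concrete, here's the kind of code an older C book would happily bless but that a modern optimizer is allowed to break; the example is mine, not the book's.)

    #include <stdint.h>
    #include <string.h>

    /* Undefined behavior under strict aliasing: a float object is read
     * through a pointer to an incompatible type. An optimizer may assume
     * the two pointers never alias and reorder or drop accesses. */
    uint32_t bits_bad(float f) {
        return *(uint32_t *)&f;
    }

    /* The well-defined way: copy the object representation. Compilers
     * typically turn this into the same single move instruction. */
    uint32_t bits_ok(float f) {
        uint32_t u;
        memcpy(&u, &f, sizeof u);
        return u;
    }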
The book also has almost no discussion about safety. The Morris worm is mentioned, but just as a piece of historical trivia. Browsing it now, I don't even find any discussion about buffer overflows.
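(For contrast, this is the kind of one-line bug a modern "second book on C" would be expected to dwell on; again my sketch, not the book's.)

    #include <string.h>

    /* Classic stack buffer overflow: an unbounded copy into a fixed-size
     * buffer. If `name` is longer than 15 characters plus the NUL,
     * strcpy writes past the end of buf, corrupting adjacent stack data
     * (and, historically, the return address). */
    void greet(const char *name) {
        char buf[16];
        strcpy(buf, name);          /* unbounded copy: the bug */
        /* safer: snprintf(buf, sizeof buf, "%s", name); */
    }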
I was fairly fresh into comp arch when I read it, and this book really highlighted for me the interaction of HW and SW, and how we can start from a minimal architecture and build and improve it. I'm still sad that modern stack architectures didn't take off as Koopman envisioned.
Elements of Programming by Alexander Stepanov and Paul McJones gives a very engaging introduction to applying basic abstract algebra to algorithm design.
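(A rough taste of that style, paraphrased in plain C rather than the book's C++ with concepts: once you know an operation is associative and has an identity, a single algorithm gives you integer powers, repeated matrix products, multiplication as repeated addition, and so on, in O(log n) steps.)

    #include <stdio.h>

    typedef long T;
    typedef T (*op_fn)(T, T);

    /* Binary ("Russian peasant") power over any associative operation
     * with an identity element. */
    static T power(T x, unsigned long n, op_fn op, T identity) {
        T result = identity;
        while (n > 0) {
            if (n & 1)
                result = op(result, x);
            x = op(x, x);
            n >>= 1;
        }
        return result;
    }

    static T mul(T a, T b) { return a * b; }
    static T add(T a, T b) { return a + b; }

    int main(void) {
        printf("3^10 = %ld\n", power(3, 10, mul, 1));   /* 59049 */
        printf("7*11 = %ld\n", power(7, 11, add, 0));   /* 77: multiplication as repeated addition */
        return 0;
    }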
Somehow the second or so book I read on computers was "Bebop Bytes Back: An Unconventional Guide to Computers," which made an impression. An absolute doorstop of a book (nearly 900 pages), it starts from logic gates and builds from there.
Felt like I was really learning some important arcane knowledge reading this and testing out the commands in grade school.
At the time of purchase, the computer shop owner suggested I was wasting my time because computers would be programming themselves in ten years, or something like that.
Describes the motivations for and design of a massively parallel computer with many (millions? 64k?) of very simple processing nodes that serve as both memory and CPU. Shows how you can have "active data structures" where each item of data is in a different CPU (or clusters of nearby CPUs) and literally migrates between CPUs as computation proceeds. I think it gives an example of the parallel bitonic sort algorithm being executed in this way. A relatively thin book and relatively easy to read for a programmer.
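(A single-threaded sketch of that bitonic sort, just to show why it maps so well onto such a machine: every compare-and-exchange within an inner pass touches a disjoint pair of elements, so on the hardware the book describes each one could run in its own processor at the same time. Ordinary C below, purely to show the structure.)

    #include <stdio.h>

    /* Iterative bitonic sort; n must be a power of two. Each inner pass
     * is a set of compare-and-swaps on disjoint pairs, which is what
     * makes the network trivially parallel: all n/2 exchanges in a pass
     * could happen simultaneously on separate processors. */
    static void bitonic_sort(int *a, int n) {
        for (int k = 2; k <= n; k <<= 1) {            /* size of runs being merged */
            for (int j = k >> 1; j > 0; j >>= 1) {    /* compare distance in this pass */
                for (int i = 0; i < n; i++) {         /* every pair here is independent */
                    int partner = i ^ j;
                    if (partner > i) {
                        int ascending = (i & k) == 0;
                        if ((ascending && a[i] > a[partner]) ||
                            (!ascending && a[i] < a[partner])) {
                            int t = a[i]; a[i] = a[partner]; a[partner] = t;
                        }
                    }
                }
            }
        }
    }

    int main(void) {
        int a[8] = {5, 1, 7, 3, 8, 2, 6, 4};
        bitonic_sort(a, 8);
        for (int i = 0; i < 8; i++) printf("%d ", a[i]);
        printf("\n");
        return 0;
    }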
In the 80s or early 90s I read a surprisingly deep Logo programming book whose title/authors I forget. Not Turtle Geometry, but similar in coming from MIT Lisp/Scheme/AI people. One of the projects near the end was a kind of basic proof assistant. There was some discussion of OOP. It was written to go with some particular microcomputer Logo system.
Certainly a different kind of quirky, but Knuth's The TeXbook is a very fun read. I regret I don't have more experience with manuals, thanks to this one.
While at the library a while ago, I ran into a book on VRML (the Virtual Reality Modeling Language from the ‘90s). That was a doozy.
It looked like they hadn’t bought any new programming books in a while either; there didn’t seem to be any newer than about 2010. They also had a book on programming with Python on Symbian (Nokia’s pre-iPhone/Android smartphone OS)!
Could add The Systems Bible (aka Systemantics) by John Gall. It certainly qualifies as quirky - I think I saw it described somewhere as the book that system designers read under a sheet with a torch at night to avoid being seen with it.
I think (based on the books he lists), he's not talking about books with a quirky presentation (like "Land of Lisp" or "Learn You a Haskell For Great Good") but rather a quirky topic. Expert systems are already a bit of a niche topic these days (not the trendy part of AI) but writing expert systems in FORTH of all things...
Yeah, LoL was an interesting experience. I could write shorter Python versions of most of the lisp code in the book, but it was an enjoyable introduction.
If I recall correctly, the Common Lisp community doesn't like or recommend the book at all.
I remember going through the first six chapters and often going on the Common Lisp channel on freenode to ask questions, and eventually found out the book was either wrong or misleading on many of the fundamentals. I tried to keep going through it but eventually gave up and picked up the much better Practical Common Lisp.
The book is good if you realize what it is -- it's a book by somebody relatively new to LISP who is excited by it and wants to share his enthusiasm with others. People who are more experienced LISP programmers may dislike it because the author has a rather idiosyncratic coding style that doesn't conform to accepted best practices in the LISP community.
It's been a while since I read it, but I seem to recall that Barski had been coding in Lisp for many years before publishing the book. Also keep in mind that many in the community seem to like the book and the exposure it gave to the language. I think the Lisp alien is even the logo of the Lisp subreddit.
I'm interested to know more here. I don't think of the book as a reference, but it was certainly fun to read and helped spark my interest in common lisp.
To be fair, I had similar experiences with 4-5 other lisp books I read. For the kinds of uses I have, I think Python is just more convenient. If I actually needed to write some compiler or grammar processing app or something like that, I'd bet Lisp could win, but I don't have those needs at present. It was a great learning experience though.
While it is written in an adolescent confrontational style that doesn’t help the author’s cause, it is otherwise a very thorough introduction to Lisp macro programming. The last chapter is about writing a Forth compiler in Lisp. Great stuff.
I am a bit late to the game, but let me contribute this:
Programming Pearls - by Jon L. Bentley (1986)
ISBN:9780201103311
It's an amazing collection of... small stories (the chapters were originally published as ACM articles I believe).
Nowadays they read like a crossover between Codegolf and Martin Fowler's collected articles, but I still like to pull my paper copy out every 10 years or so and reread it.
(There was also a second volume, and I believe a "re-edition" published in year 2000, but I'd go for the original for its specific charm).
I had programming in school but kind of lost interest and studied political science instead.
When I was trying to find a master's thesis idea, I found a smallish book about simulating the growth of early nation states using computer simulation of various factors, e.g. internal cohesion as a function of time.
It was totally wild and fascinating at the same time. It did inspire my master's thesis and rekindled my interest in IT.
This is an ongoing project so there are many in my personal library that need coverage. That said, _why’s guide is probably too well known for this list.
I liked the "C for Dummies" book because it was intentionally whimsical to an extent. It was a bit too insubstantial (could have done a lot more about pointers and complex data structures) but I give them credit for starting with a "Goodbye, Cruel World" programme instead of "Hello World".
You can also favorite comments if you click on the timestamp.
I have 1785 favorited submissions over the 5 years since I found the feature; I really need to capture all of them and their links, as it would be a shame to lose all of that information if HN goes away.
[0] https://www.amazon.com/Learn-Program-Visual-Basic-6/dp/19027...