Also, why is it suddenly in vogue to bash TAOCP and CLRS? When I read these sorts of posts, I generally think that the author really means, "I didn't understand these books, so therefore they must be crap." I started reading TAOCP very early in my career, and Knuth's low-level approach to analysis has served me quite well over the years. It's why, for instance, I don't balk when I need to optimize a computationally heavy algorithm on an 8-bit microcontroller, or when I have to hand-tune a cryptographic algorithm to avoid cache side-channel attacks: I can approach the problem logically. Knuth is an excellent teacher for the right sort of student. He's not for everyone, and there's nothing wrong with that. Software ceased being a one-size-fits-all field decades ago. For that reason, among many others, decrees of what is canon or not -- such as the one made by the author of this post -- are nothing but bunk.
He also doesn't seem to make the claim that the first list represents "canon" and that the second list represents "non-canon". The first list is a list of great books that are easily understood by most students. Many of the books on the second list are far more advanced.
I agree that some students can learn using any of these resources. However, most people would struggle with books like TAOCP.
He insinuates pretty strongly that the first set of books is the canonical set with this statement: "If you're interested enough in programming that you're reading this blog, you've probably read most, if not all of the books in this list, so I won't spend time reviewing each one individually."
While not being explicit, he's being pretty implicit regarding his judgement of value for these books.
I don't necessarily agree that TAOCP is especially difficult, but I did say in my original comment above that TAOCP is not for every student. For the right student, working on the right set of problems, I would not hesitate to recommend TAOCP.
He did note that it looks impressive on a shelf, but he also followed it up with "and it is impressive." Furthermore, he goes on to state that it is so comprehensive that if you have a computer science problem that falls in a category covered by one of the published volumes of TAOCP, and you cannot find a solution in TAOCP, that a solution likely does not exist. That's pretty high praise IMO.
>It's very dense and academic.
Is it not a dense text written in an academic style by an academic known for dense, academic writing? It's likely the greatest reference in existence, but it isn't an easy book to read at all. If it doesn't seem like a challenging read to you, then, as the author already stated, you aren't one of the people he was talking about.
>He insinuates pretty strongly that the first set of books is the canonical set with this statement: "If you're interested enough in programming that you're reading this blog, you've probably read most, if not all of the books in this list, so I won't spend time reviewing each one individually."
IMO, his statement doesn't attempt to canonize them at all. If you look at both lists, one list is obviously far more accessible to the average programmer than the other list. It doesn't make them better, just suitable for different people.
>For the right student, working on the right set of problems, I would not hesitate to recommend TAOCP.
I'm not saying that you're wrong. However, the article is about books that are often recommended to beginners. Most people will struggle if they start with TAOCP.
It's not. It's much more mathematical than your typical programming book, and it might not be suitable for beginners (it says so itself), but that's not because it's 'dense and academic'. It comes with exercises, historical asides, nerdy jokes and so on.
You see some of these books suggested to novices. Is TAOCP really the best book for someone just getting started on their first program?
Isn't that just agreeing with your point here:
> Knuth is an excellent teacher for the right sort of student. He's not for everyone, and there's nothing wrong with that.
Donald Trump just thinks a lot of things.
The author never made a valid argument, but a lot of people upvoted it anyway, an indicator of the growing friction between the plutocrats and the plebeians.
This is precisely the claim that the original article is talking about.
The article is not claiming that TAOCP and CLRS are bad books. The article simply argues that more people claim to have read the books than actually have read them.
Most of us have read "some" of TAOCP and CLRS.
These are fantastic but rigorous reference works. Generally you do not read reference books cover to cover.
Personally I DO read reference books cover to cover on a regular basis. I understand not everyone does this, but I don't make judgements or assumptions of others based on this.
I have an almost obsessive need to read these things cover to cover. When I hit something I don't understand, I reread it until I do, or I go look up related materials until I understand it.
TAOCP is not a book; it's a third of a master's degree in paper form. Suggesting someone read it is like suggesting they get an MS: useful, but not quite the same as recommending a normal book.
I guess for the same reasons people want to bash the current interviewing institution in place.
Personally, I go with "it is what it is, deal with it if you want THAT job"
Back to the context: I find CLRS much more approachable than TAOCP. It relies on pseudo-code and fairly simple mathematical notation to formalise the algorithms it treats.
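For what it's worth, that pseudo-code style translates almost line for line into real code. Here is a sketch (not CLRS's own code; note that CLRS indexes arrays from 1, while this version is 0-indexed):

```python
def insertion_sort(a):
    """Insertion sort, transcribed from CLRS-style pseudo-code.

    CLRS writes "for j = 2 to A.length" over 1-indexed arrays; the
    bounds here are shifted for Python's 0-indexed lists.
    """
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:   # shift larger elements right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                 # drop key into the gap
    return a
```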
With TAOCP, I have tried but can't get past the language of the imaginary CPU it uses.
Also, there are other simply excellent texts on algorithms which I personally found much more useful than TAOCP (in decreasing order of "easiness"):
- Python Algorithms
- The Algorithm Design Manual
- Algorithms in Java (somehow I found CLRS easier to follow than this)
- And then I found this recently: "Algorithms on Strings, Trees and Sequences: Computer Science and Computational Biology" (one should have gone through at least the first two recommendations before starting this one, to make it easier to read)
Disclaimer: I have a growing interest in learning algorithms due to the requirements of interviews; however, I have come to believe that the knowledge will go a long way.
I bought Knuth's TAOCP and the Cormen book in high-school to prepare for programming competitions. I never read TAOCP and probably never will, it was simply too complex for what I needed.
CLRS was significantly easier to understand and I was able to read subchapters.
Today I use neither, but rather recommend Skiena, Sedgewick or other specialised books. Not because TAOCP and CLRS are bad, but because these other books are more approachable and also get the job done.
There are so many developers (like your peers at work) who have literally read none of these books. In a Darwinian sense, isn't that evidence that there's nothing so special about these texts for what you do?
Skip the books and just memorize what's on Interview Cake. I love that site. It so brilliantly subverts the academic pretensions of every industrial software development shop.
So many people will have memorized so precisely the solutions to programming test questions that they stop being useful. Engineering managers will be unable to come up with novel questions, because they will have no benchmarks. Then the madness will end: Computer Science will get less popular in university again, and this notion that these Gospels have All the Answers will go away.
Either they've never had to work on a really Hard Problem, or worse they have tried to solve such a problem without the hard-won lessons present in these books.
I've heard senior recruiters express skepticism about candidates who say they're into "Hard Problems." Experienced engineering managers usually hear that phrase when you're looking for a lifestyle and you don't really care what the application is.
For example, "I have no qualms that your business arbitrages nonconverting clicks from bots and sketchy web traffic (i.e., 99% of the Internet) into Google AdWords revenue, using up a Google customer's ad budget until the customer has barely made a profit... You see, I'm into Hard Problems. Besides, bots don't click twice! It must be real traffic!"
It turns out that junior people on the interview team care a lot more about Hard Problems and the stories in these books than senior people do. And you'd never want your hiring pipeline run by junior people.
In my opinion, Google's focus on Hard Problems has robbed academia of all the sincere Hard Problems people. Then, they've ruined the interview process for everyone else, ironically with the book How Google Works, convincing CEOs that they should be asking about C.S. fundamentals and Hard Problems totally unrelated to their Actual Problems.
That said, I happen to like problems in general... not necessarily "hard" problems, just new problems. I also like working in new domains... I mean, what other industry allows you to work in education, aerospace, banking, security and marketing in a decade and a half?
Arguably small operations need to be more stringent if they are deciding who will make up the next 25%-33% of their engineering workforce.
At small companies one bad hire can really poison the well.
I mean, have a trial period of a week or three... That will tell you far more than a gauntlet interview process with a lot of false negatives.
Before starting my own company I was a bar raiser at Amazon. Interviews were not supposed to last weeks (though many did and there were internal metrics aimed at driving interview to offer cycle time down).
I'd much rather spend a few hours per interviewer per candidate getting a solid candidate than hiring, onboarding, training, learning the candidate is a poor fit, and ultimately going through the not-fun-for-anyone firing process.
High turnover isn't worth the lost productivity from other employees or the hit to morale. On top of that, while learning whether or not the candidate will make it through the trial, you're burning precious cash.
Learn how to interview and teach your people how to interview. It will save you heartache and money.
Especially when many of those concepts necessitate code that is less than discoverable, well documented, or easy to follow?
Also, there's the cost of existing staff in interviews... if you only have 2-3 devs, and you're keeping them all in interviews with multiple candidates, you're not getting work done.
Can you explain what this business model is and how the arbitrage works? Is this SEO gaming? Thanks
A large part of the job market has little need for those kinds of books.
If you do so, it will formidably crystallize knowledge you've been acquiring subconsciously. You'll have probably been having gut feelings about why certain things are wrong or clunky and others are elegant, and Design Patterns will give you a vocabulary to describe these things.
If you read it as a beginner, the subtleties of it all will go over your head, and you'll end up trying to cram design patterns everywhere without much sense or reason.
> Trying to use all the patterns is a bad thing, because you will end up with synthetic designs—speculative designs that have flexibility that no one needs. These days software is too complex. We can't afford to speculate what else it should do. We need to really focus on what it needs. That's why I like refactoring to patterns. People should learn that when they have a particular kind of problem or code smell, as people call it these days, they can go to their patterns toolbox to find a solution.
From an interview with Erich Gamma ( http://www.artima.com/lejava/articles/gammadp.html )
Really? The Gang of Four book was my favorite programming book from the 1990s. A lot of people read GoF as a cookbook which I think devalues it and leads to a very dogmatic design approach. The message I got from applying GoF over a period of years is that there are existing, flexible patterns that you can use to organize large object oriented systems, though they don't solve nearly every problem and you'll need to tune them for each design. Viewed that way it's a work that can give you inspiration for decades.
I'd also add Facade and Strategy to your list.
It's indeed not long but it does save some time when talking about design and various potential solutions. That's good enough for me.
P.S: I think head first patterns should be preferred to the GoF book if one insists on learning from a book, but otherwise they can be grasped from a Wikipedia article.
The first is that it doesn't make enough of an effort to present "design patterns" as a conceptual framework. Can you program without using design patterns? No, you cannot. You are always using design patterns whether you know it or not; when you don't think you're using patterns, you're just using ad hoc ones without realizing it. The book's failure to elucidate commonly used ad hoc patterns, as well as common anti-patterns, is one of its principal shortcomings. This is something that the refactoring book did much better, by providing examples of how things can be engineered in different ways and why one way might be desirable over another (and how to transition between them). Instead the GoF book largely presents a canonical list of some design patterns, giving people the false impression that these are special and unique versus merely a sampling of ways one might do things.
Additionally, the GoF book doesn't make it clear that each design pattern exists to work around specific constraints, and thus doesn't educate the reader when and why the design pattern should be used, and when not. The GoF list of design patterns is very much tightly coupled to the peculiarities of Java at the time the book was written. Many of the design patterns aren't necessary outside of that environment. For that reason, a lot of the design patterns in the book are either obsolete or so seamlessly built into popular languages that it's not necessary to implement them as described in the book. Many of the more advanced "behavioral" patterns from the book are essentially ways to work around the fact that Java didn't have first-class functions; when that changes, the design patterns that you end up with become very different.
A lot of the patterns on the GoF's list just don't need to be implemented by hand anymore, even though many of them are as applicable as ever (modern languages bake iterator support into the language, for example).
Additionally, the GoF design patterns break down into two categories: everyday patterns (like iterator, factory, proxy, or adapter) and sometimes-patterns (like flyweight). This means that most of the patterns shouldn't be used in every project. Because the design patterns in the original book are somewhat out of date, and because there's been so much cargo-culting around the GoF patterns, there's grown a substantial backlash against patterns. So people tend not to describe new patterns in such a way, even though they should. The concept itself is quite sound, but it requires diligent effort to apply it well.
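To make the first-class-functions point concrete, here is a hedged sketch (the names are illustrative, not from the book): the same interchangeable behavior written once as a GoF-style Strategy hierarchy, and once as a plain function argument.

```python
# GoF-style Strategy: interchangeable behavior behind a class hierarchy.
class TaxStrategy:
    def apply(self, amount):
        raise NotImplementedError

class FlatTax(TaxStrategy):
    def apply(self, amount):
        return amount * 1.10

def total_with_strategy(amount, strategy):
    return strategy.apply(amount)

# With first-class functions, the "pattern" collapses into an argument.
def total_with_function(amount, tax):
    return tax(amount)

with_class = total_with_strategy(100.0, FlatTax())
with_func = total_with_function(100.0, lambda a: a * 1.10)
```

Neither version is wrong; the point is only that what needs a named pattern in one language is an ordinary parameter in another.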
Quite to the contrary. I don't have my copy handy and haven't cracked it open in years, but I recall clearly that they presented design patterns as reusable solutions to commonly occurring problems. And for each pattern they outlined, they explained what the problem was that it was intended to solve.
Alas, the part about patterns being solutions to specific problems was lost on many developers, who shoveled in applications of patterns where their corresponding problem didn't exist. And rather than a more modular, understandable, maintainable codebase, an inscrutable mess was often the result.
The book is about C++ and Smalltalk. There is not a single mention of Java in their book.
GoF publication date: 1994;
Java launch date: 1995
The patterns are essentially Smalltalk-based, though the authors had been implementing the same ideas in C++ - Gamma in ET++ and Vlissides in InterViews.
I am not sure I can agree with the claim that everything written follows some design pattern. Perhaps it is a known anti-pattern instead, or a completely ad-hoc solution that is neither, but is nonetheless a good design.
Design patterns are architectures that have been distilled from field experience to usefully solve certain problems, resulting in a good design. Therefore it is also quite natural that they get integrated into language designs over time.
> Instead the GoF book largely presents a canonical list of some design patterns, giving people the false impression that these are special and unique versus merely a sampling of ways one might do things.
"Despite the book's size, the design patterns in it capture only a fraction of what an expert might know. It doesn't have any patterns dealing with concurrency or distributed programming or real-time programming. It doesn't have any application domain-specific patterns. It doesn't tell you how to build user interfaces, how to write device drivers, or how to use an object-oriented database. Each of these areas has its own patterns, and it would be worthwhile for someone to catalog those too."
> Additionally, the GoF book doesn't make it clear that each design pattern exists to work around specific constraints, and thus doesn't educate the reader when and why the design pattern should be used, and when not.
Literally every single pattern chapter in the book comes with an explicit section describing when you should and should not use it.
> The GoF list of design patterns is very much tightly coupled to the peculiarities of Java at the time the book was written. Many of the design patterns aren't necessary outside of that environment.

> Many of the more advanced "behavioral" patterns from the book are essentially ways to work around the fact that Java didn't have first-class functions; when that changes, the design patterns that you end up with become very different.
The Command pattern isn't just "how to take first-class functions in a language that doesn't have them". It also talks about commands that are undoable, or that can be logged, or serialized and deserialized. It's as much about the objects that create the command, receive the command, and have the command invoked upon them as it is the command itself.
I do wish it talked more about the relationship between commands and first-class functions (which I do here), but it's not as bad as you make it out to be.
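A minimal sketch of that undo aspect (illustrative names, not the book's code): each command captures enough state to reverse itself, and the invoker keeps a history stack, which is what makes logging and undo fall out naturally.

```python
class AppendCommand:
    """A command that appends to a document and knows how to undo itself."""
    def __init__(self, doc, text):
        self.doc, self.text = doc, text

    def execute(self):
        self.doc.append(self.text)

    def undo(self):
        self.doc.pop()  # reverse exactly what execute() did

class Invoker:
    """Runs commands and records them so any can be undone in LIFO order."""
    def __init__(self):
        self.history = []

    def run(self, cmd):
        cmd.execute()
        self.history.append(cmd)

    def undo_last(self):
        self.history.pop().undo()

doc = []
invoker = Invoker()
invoker.run(AppendCommand(doc, "hello"))
invoker.run(AppendCommand(doc, "world"))
invoker.undo_last()  # doc is back to ["hello"]
```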
If I'd gotten a C++ job instead, I'd probably have torn up the C++ programming language book. My copy is fairly worn, but not like that GoF book.
CLRS probably isn't as helpful as it once was, because the vast majority of the time you just use an ADT from whatever library your language provides. Sure, maybe go review some subtleties from time to time, but if you're writing algorithms for work, you probably do that all the time anyway, and CLRS isn't going to be that insightful.
The dragon book was opaque to me. Just about everything in that book, I had to go find another discussion of. I'm just not smart enough to parse those words and math into code.
I dunno. I'm a pretty mediocre programmer. They've been helpful to me at various points in my career, but I don't think they are much help in modern programming.
* I've only been able to chew through a couple of chapters of TAOCP. I think the only programmers these books help are people like Fabrice Bellard. Find someone who's worked through those books and hire them. Well, if you really need to solve problems and you're not just hooking a submit button to MySQL.
Agreed. Code Complete gave me a huge boost as a liberal arts refugee in computing. Having given away my copy, I bought the second edition, thinking it would be worth a reread. Not so much. Pragmatic Programmer didn't do much for me either, probably because I read it too late.
On the flip side, plodding through TAOCP just because? Better to read the relevant chapters when you've got an actual project to chew on.
One question that keeps popping into my mind is what books and in what order should one feed an inexperienced junior to turn them into a mature software engineer.
For example, my biggest takeaway from Code Complete was the stupid-slow but easy-to-test implementation of the Excel calculation engine. I was working on a print system at the time, and building a stupid system and a fancy system and comparing the results gave me a lot of insight into what was wrong with the fancy system.
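That pairing is essentially differential testing: keep the obviously-correct slow version around as an oracle and check the fancy one against it on the same inputs. A sketch with stand-in implementations (not the book's Excel example):

```python
import random

def mean_naive(xs):
    """Obviously correct, obviously slow: recompute from scratch each time."""
    return sum(xs) / len(xs)

class RunningMean:
    """The 'fancy' incremental version under test."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def add(self, x):
        self.n += 1
        self.mean += (x - self.mean) / self.n

# Differential check: feed both the same data, compare after every step.
rng = random.Random(42)
data, fancy = [], RunningMean()
for _ in range(1000):
    x = rng.uniform(-100.0, 100.0)
    data.append(x)
    fancy.add(x)
    assert abs(fancy.mean - mean_naive(data)) < 1e-9
```

When the fancy system disagrees with the stupid one, the stupid one is almost always right, which is exactly what makes the comparison so informative.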
Personally, I got a ton out of the Lisp books: PAIP, SICP, On Lisp, and even the CL spec. Thinking about something in terms of stream processing, or of unification from the various Prolog implementations, would get me started on answers to problems I was facing (even though they never required unification).
Just blindly handing out reading, I'd probably give people _The Phoenix Project_. Most organizations are a mess, and new programmers don't understand that. The downside is, new programmers can't do much about it. It can help explain why you do some painful stuff, though, like on-call or code reviews.
That is some gross arrogance on full display right there. If you're as good as me, surely you've read all these books. Any other books... flawed in ways you should be ashamed of.
The shared language. The GoF wanted that shared language, but people missed the ideal behind it and thought it was a cookbook instead.
The GoF draws from A Pattern Language ( https://www.amazon.com/dp/0195019199/ ) and I believe does a better job at communicating what patterns should be: a common language and broad brush strokes for solving common problems (though they almost always require some tweaking to fit just right).
Also, he gives "more of a language reference" as a reason for not reading Stroustrup's C++ book. Doesn't the same apply to K&R?
It's probably impossible for any book to fully describe C++, because it's a perpetually moving target on a three-year update cycle.
There are always new idioms and techniques to learn, but they keep being changed/added without any obvious coherent design strategy or logical goal.
Books can barely keep up. Stroustrup's own C++ introduction is for C++11, which is already five years and nearly two releases old.
The end of the book even included a grammar for C++ (with a slight comment about how the grammar wasn't exact because of C++ context sensitivity)
In 2016, however, I think that a lot of people would have read CLRS, even if they haven't read ALL of it. It's used as the reference text by a lot of university courses on algorithms.
I'd add that EE20N used a Berkeley-written text, Structure and Interpretation of Signals and Systems. I was in the minority and liked that book. I still have it.
There's just something special in the way material is presented in SICP, in the order of topics, in the text's tone: it remains enthralling without being difficult, it teaches without being indulgent. It absolutely is meant to be read.
A "stronger" example of this tone would be The Little Schemer series.
Ironically, Chapter 1's reliance on math for examples and exercises turns people off!
I agree that, ultimately, not everything has to be proven correct. Often there isn't even a well defined criterion for software correctness. But proofs absolutely matter in certain domains (algorithms, databases, compilers, etc.), and there's no way you can be ready to prove things about moderately tricky programs if you don't have practice proving things about toy ones.
I think the author's next read should be a good statistics book.
Here's my anecdotal evidence. I have read 8 out of 10 from his first list. I have read everything except TAOCP from the second list. Some of the ones from the second list I have read multiple times, and I still go back to them as references from time to time.
I have started reading TAOCP on several occasions, but I'm not going to lie; that shit is complicated and the dry academic nature makes it even worse to parse. I still plan to get through it in my lifetime, but I keep prioritizing easier reading.
I don't think that's quite true. The only C++ compilers I know of are:
1. cfront C++ (from which many C++ compilers were derived)
2. Digital Mars C++ (the one I wrote)
5. EDG C++ (from which many other C++ compilers are derived)
6. Microsoft C++
7. Taumetric C++
It's a pretty thin list for 30 years. (I apologize if I neglected any.) Contrast that with C - at one time in the 1980s I counted over 30 of them (I'm not sure how many were independent creations).
Edit: removed Borland C++ as that was derived from Taumetric.
Ironically, one of the hardest things about a C compiler is the preprocessor. You'd think a text macro processor would be simple, and it should be simple. There's an amazing amount of subtle complexity in just the handful of pages describing it in the Standard.
I, and many others, implemented a C compiler based on K&R, and I can attest that a lot of rework can be avoided by following a Standard instead (C89 did not exist at the time).
> I'm pretty sure it's the hardest language to implement
Even scalac and ghc are pretty complex beasts in order to make Scala and Haskell reasonably speedy.
Of course, I should have said "front end". Working on the optimizer/codegen is an endless process.
Meta-tracing a la PyPy is such a revolution because it enables low-cost creation of reasonably performant tracing JIT compilers.
The book is for people wanting to know every corner of the language. Unfortunately, I read it a long time ago and don't remember if it's enough for that purpose or if one still needs Meyers and maybe some other resources.
Today, yes. But in the olden days of C++, there was no standard and people used Bjarne's book.
As for the design patterns book, I've glanced through it and didn't find it all that useful; that being said, I was never a fan of that school of thought (OOP, UML, patterns, etc.) either.
Maybe it's the influx of new people to the site who upvote these articles?
> it's intended for the multitudes who are trying to appear smarter by pretending to have read them.
As opposed to trying to appear smarter by issuing decrees to the internet about what everyone who is "interested in programming" has read.
When I started coding professionally you had to buy books because the information wasn't anywhere else, but today even for the more difficult subjects you can find papers and blog posts and wiki articles that cover what you want to know at different depths and from different angles.
Many people learn a lot of their development knowledge from the web (and I certainly do as well), but the majority of the web sources for material beyond basic language reference simply extract the easiest to digest parts of the major texts. This is why so much of what many people know about patterns, algorithms, data structures, and even OOP, TDD, Agile, Scrum, etc. is fragmented or even wrong.
From the second: (1) Intro to Algorithms, absolutely cover to cover once, and selected readings additional times; based the Kazlib "dict" module closely on the Red Black tree algorithm (with a slight improvement); (2) Compilers: Principles, Techniques and Tools: Ditto, and in recent years, implemented a regex compiler based on the graph-based descriptions in that book; (3) TAOCP: selected readings only, too huge; (4) Design Patterns: cover to cover, when OOP was fashionable in the 90's; (5) nope, but I read the Stroustrup and Ellis Annotated C++ Reference Manual cover to cover (weekend before starting first C++ gig). That counts as a 1:1 substitute, more or less.
The books on the second list which I have read (at least partially) are books I have had reason to go back to at some point in my career, and any of them which I no longer have, I will likely purchase again.
Reminds me of the Caltech required 3rd year math course "AMa95 Introductory Methods in Applied Mathematics", that came with the dry comment that "Introductory does not mean elementary." It had a reputation as one of the toughest classes.
That's almost certainly wrong. TAOCP is very outdated in many ways and doesn't really cover modern algorithms.
It's great for classic CS up to the '90s or so, but there has been a lot of progress since then.
For hashing, he does cover things, but at arm's length.
It's also true for other areas. Let's take sorting. The best general-purpose comparison sort is Timsort, which came out in 2002.
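For context, the practical trick behind Timsort is that it exploits order already present in real data: it splits the input into maximal "natural runs" and merges those, rather than assuming random input. A toy sketch of just the run-detection step (nothing like CPython's actual implementation):

```python
def natural_runs(xs):
    """Split xs into maximal non-decreasing runs, the units Timsort merges."""
    if not xs:
        return []
    runs, start = [], 0
    for i in range(1, len(xs)):
        if xs[i] < xs[i - 1]:        # order breaks here: close the run
            runs.append(xs[start:i])
            start = i
    runs.append(xs[start:])
    return runs
```

Already-sorted input yields a single run, which is why Timsort is linear on it.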
Keep in mind TAOCP was originally published in the early '70s (the first volume came out in '68), and even though there have been new editions (up to the late '90s), they didn't really change all that much.
You can just scan TAOCP's bibliography and look at the average year of the citations. It is usually the '60s.
Claiming that CS has not improved since the '60s is fairly absurd.
Still, your point is fair. The section on secondary key retrieval is very high level. Bloom filters get just two paragraphs.
I think the biggest change in CS is the view toward memory. In much of TAOCP, care is given to the memory footprint. Much of day-to-day programming is completely ignorant of memory. Scarily so.
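Since Bloom filters came up: the core structure fits in a handful of lines, which may be why two paragraphs feel thin as coverage. A minimal sketch (the size and hash scheme here are illustrative; real uses derive them from the target false-positive rate):

```python
import hashlib

class BloomFilter:
    """Tiny Bloom filter: no false negatives, occasional false positives."""
    def __init__(self, size=1024, hashes=3):
        self.size, self.hashes = size, hashes
        self.bits = 0  # a Python int doubles as an arbitrary-size bit array

    def _positions(self, item):
        # Derive k positions by salting one cryptographic hash.
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits |= 1 << p

    def might_contain(self, item):
        return all(self.bits & (1 << p) for p in self._positions(item))

bf = BloomFilter()
bf.add("knuth")
```

The memory trade-off is the whole point: a few kilobits stand in for a set that might not fit in RAM at all, at the cost of occasional false positives.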
Never come across Code or Head First Design Patterns - why? The GoF book is much better, well written, and well explained. Read all the rest, though my K&R was the 1st edition. Bought the 2nd when someone 'borrowed' the original from my desk and never returned it.
Of the rest:
Never heard of the first one, but I'm British.
My green dragon book was read cover to cover a couple of times, and I tried to build a compiler off the back of it. I used another compiler book in equal amount, I forget the book though.
I've three volumes of Knuth. I should have bought one at a time - I got about halfway through the first volume.
Design patterns was extremely well used.
C++ was mainly a reference; unlike K&R, it wasn't a good read to learn from. Pretty heavy going, if I remember. The Design and Evolution of C++ was much more interesting. Then the two Scott Meyers books (Effective C++ 1 & 2). Thinking in C++ was the book recommended to learn from at the time, though I've not read it.
I think what's more likely is, someone will say, "YOU should read this book. It's awesome." The implication being, the suggestion comes from first-hand experience.
I spent maybe 3 years as a kid flipping through the C++ Programming Language book, trying to figure out how to do stuff with that + the Allegro tutorial. Totally a programming language reference.
It's one of those weird books that you could spend years just looking at on a desert island and find things to do with it. But you basically need to flip through all of it to get what is happening most of the time (at least I needed to when I was a kid).
Really wish I knew about Python back then....
Knuth's writing is clear and amazing. It really makes the history of computing interesting.
Pretty typical of the author of the post, who did not complete TAOCP, to say it is somehow flawed for being too cryptic.
Why is the author concerned about what others have or haven't read, anyway? It seems he's more concerned with proving he has read more than others.
I've been programming for a few years now and I'm looking to more formally educate myself in general computer science. I've been told having a good grasp on compilers and interpreters can be useful for a wide variety of problems.
So far "Language Implementation Patterns" looks like it might be a promising alternative.
I also rather enjoyed Crafting a Compiler. I found the explanations and exercises to be much more practical and enjoyable than the dragon book's, but the code the authors present is mostly garbage: full of global variables, state mutation, and heavy design-pattern use. Ignore the code and do the exercises yourself.
I always found the Dragon book to be too parsing-heavy which gets boring quickly. The books I listed spend far less time on parsing and much more on code generation and optimization which I find more enjoyable. YMMV.
It pretty much just glosses over parsing etc., and focuses on the semantics of a language, which IMHO is much more interesting.
Sure, if you're building a real language implementation, you will need a parser, but in a lot of ways it's the least interesting part.
Instead I knew about TAOCP, also because of TeX. It was not a required read in my CS course; we had a smaller book, Wirth's Algorithms + Data Structures = Programs. I skimmed the TAOCP volumes available online, but it takes really a lot of dedication to read a substantial part of those books, and I think the impact on my work would be minimal nowadays.
But yes, by the look of them both those books will be relevant for a while.
It is also an easy read - it's not a hard academic book.
Much of Maguire's coverage is VERY C++-centered, so people using anything else are going to skip a bunch. It's a short book, 250 or so pages, and an easy read.
The second list should also include Structure and Interpretation of Computer Programs.
Most of the books on the first list were published after veterans had already learned the relevant material -- the author must be under 30 (so under 25 at the time).
Certainly a weird sample among popular books; the author thinks everyone reads the same arbitrary subset of books he did.
Wow, amazing insight!