
How These Things Work – A book about CS from first principles (2016) - falava
http://reasonablypolymorphic.com/book/preface
======
Toenex
Nice. I did an EE degree many years ago even though I wanted to work in
software. The single best course I took was one where we designed, simulated
and had fabricated a 4-bit microprocessor. It completely solidified my
understanding of how computation takes place. Hopefully, texts like this will
do the same for others.

~~~
JoeSmithson
In a similar vein, I highly recommend the MOOC "nand2tetris" which progresses
smoothly from simple logic circuits right up to a high level Java-like
programming language.

------
ceronman
I love this kind of book! In a much more general vein, there is "The
Knowledge: How To Rebuild Our World After An Apocalypse" by Lewis Dartnell.
This explains the first principles of a lot of things we take for granted in
the modern world, from agriculture to food and clothing, medicine, chemistry
and more.

This book about CS principles is a great complement to that!

~~~
kragen
As I posted the last time this book came up
[https://news.ycombinator.com/item?id=22294655](https://news.ycombinator.com/item?id=22294655)
that book would be more accurately called _The Misinformation_. If you follow
the instructions in it you will die. Better alternatives are listed in my
linked comment.

~~~
jwhitlark
I agree. I was deeply disappointed by "The Knowledge..."

------
matco11
One of my favorite books on subject is The Elements of Computing Systems:
Building a Modern Computer from First Principles by Noam Nisan
[https://www.amazon.com/Elements-Computing-Systems-Building-P...](https://www.amazon.com/Elements-Computing-Systems-Building-Principles/dp/0262640686)

~~~
djangovm
Would you say that this book is still relevant, now that it is 15 years old?

~~~
dang
It's wonderfully relevant because it takes you through the basics and those
are all still the same. Their simulator software was already old when I went
through the book a few years ago, but I'm sure it still works fine.

------
grumple
I recommend Code: The Hidden Language of Computer Hardware and Software by
Charles Petzold [1]. It is far more comprehensive than the OP: it goes from
pre-computer codes to electrical circuits to an overview of assembly. No prior
knowledge needed except how to read.

1\. [https://www.amazon.com/Code-Language-Computer-Hardware-Softw...](https://www.amazon.com/Code-Language-Computer-Hardware-Software/dp/0735611319)

~~~
kragen
_Code_ is good, but it doesn't cover Kleisli categories and Kleisli
composition, Peano arithmetic, parametric polymorphism, sum types, pattern
matching, or any of the numerous other things covered in Maguire's _How These
Things Work_. So it's not accurate to say _Code_ is "far more comprehensive";
_Code_ mentions FORTRAN, ALGOL, COBOL, PL/1, and BASIC, but the example
programs it contains are written in assembly, ALGOL, and BASIC. It doesn't
contain any programs you can actually run except for a three-line BASIC
program and some even simpler assembly programs.

~~~
tenaciousDaniel
It's very depressing that I've been a developer for 8 years and I've never
heard of any of those terms you mentioned. I'm self-taught, but I've always
felt like I should go back and really learn the fundamentals.

~~~
eat
I'm not saying you shouldn't always strive to learn new things (for your own
personal growth and curiosity), but I think it's important to point out that
the link between being a developer and knowing about these things -- esoteric
topics of applied mathematics -- is pretty weak.

Imagine a carpenter spending their time getting a chemistry degree in order to
better understand how wood glue works.

~~~
murgindrag
I don't think so. Understanding what goes on underneath the hood is really
what differentiates decent coders from great engineers. Compare the data
structures of Subversion to those of git. Or look at some of the work John
Carmack did in video games. That requires depth.

If your goal is to be a carpenter who puts together housing frames, you
absolutely don't need depth. You're also interchangeable and get paid union
blue collar wages. On the other hand, if you want to be a craftsman who
invents new wooden things, you need depth in some direction, be that
structural engineering, artistic, or otherwise.

There's a ceiling you hit unless you learn much more of this stuff. The
direction is your choice (but new APIs ain't it -- we're talking depth).

~~~
elbear
Not everyone needs to be great.

What I actually want to say is that OP shouldn't feel guilty about not knowing
those things. It's fine to want to master them, if that's what you want, but
it's pointless to feel bad about not knowing them.

~~~
kragen
Of course there is no necessity for excellence. The only necessary thing about
human life is death; everything else is optional. Before your death, you can
cultivate excellence in yourself, or not — many people instead cultivate
hatred, addiction, or greed. There are many ways to cultivate excellence;
learning is only one of them, and there are many things to learn. Mathematics,
and in particular logic (which is what we are talking about here), is the
foundation of all objective knowledge, but objective knowledge is not the only
kind that has value.

The true philosopher, motivated by love for the truth rather than pride, is so
noble in spirit that when she sees evidence that she may be in error, she
immediately investigates it rather than turning away; and if she discovers
that the evidence is valid, she immediately changes her position. I see such
nobility so routinely among mathematicians and logicians that it is noteworthy
in the rare cases where it is absent. I see it rarely outside of that field;
in some fields, like psychology and theology, I do not see it at all. So I
conclude — tentatively — that excellence in mathematics and logic promotes
humility and nobility of spirit, which is the highest and most praiseworthy
kind of excellence.

So, while I do not think the OP should feel guilty about not knowing those
things, I also do not agree with the implication that there is nothing
praiseworthy about knowing them.

~~~
elbear
Well, I agree with you. I think that pursuing our interests in mathematics,
music, literature or whatever strikes our fancy is admirable. And I think it
makes us happier, wiser and more humble as you say.

At the same time, I maintain that we shouldn't feel guilty if we aren't doing
that, for whatever reason. Sure, sometimes we actually want to pursue some of
these things, but don't. Maybe it's because we have a messy schedule and can't
organize ourselves to prioritize our passions.

Feeling guilty does little to actually make you pursue your passions. You're
better off learning about habits and how to pick ones that serve you.

~~~
kragen
Agreed, except that I don't think pursuing our interests in _whatever strikes
our fancy_ is admirable.

~~~
elbear
As long as it's not hurting someone, anything goes as far as I'm concerned.

------
oblosys
For a do-it-yourself version of chapter 1 (about building complex circuits
using only nands), I can recommend
[http://nandgame.com/](http://nandgame.com/)
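As a taste of what that entails, here's a sketch (mine, not the site's) of the
core trick in Python: NAND is functionally complete, so every other basic gate
can be derived from it alone.

```python
def nand(a: bool, b: bool) -> bool:
    """The one primitive gate: true unless both inputs are true."""
    return not (a and b)

# Everything else is built by wiring NANDs together.
def not_(a):    return nand(a, a)                 # NAND with itself inverts
def and_(a, b): return not_(nand(a, b))           # invert the NAND
def or_(a, b):  return nand(not_(a), not_(b))     # De Morgan's law
def xor_(a, b):                                   # classic 4-NAND XOR
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# Exhaustively check all derived gates against Python's own operators.
for a in (False, True):
    for b in (False, True):
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor_(a, b) == (a != b)
print("all gates check out")
```

The game (and chapter 1 of the book) has you do exactly this, then stack the
gates into adders, latches, and eventually a whole machine.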

------
porknubbins
_A few years ago, I asked my engineer friend about how much of civilization he
could rebuild singlehandedly, should he survive some hypothetical apocalyptic
event. “All of it,” he replied. “Not all at once, but I know enough to be able
to puzzle together the pieces I don’t know right this second.”_

While I admire the Connecticut-Yankee optimism of the engineer, as a
non-engineer I am seriously skeptical that a single engineer could know enough
about the chemistry, materials, physics, CS, etc. I can explain what a battery
or a transistor is supposed to do, but I wouldn't have the foggiest idea how
to actually make one. In this scenario, are we leaving the bunker to break
into Bell Labs (or at least some research university library)?

~~~
latexr
I share your skepticism. Seems to me the engineer is falling prey to the
Dunning-Kruger effect[1]. Rather than knowing enough to be able to puzzle
together all the pieces they don’t know, I’d wager they don’t know enough to
be able to discern what they won’t be able to figure out.

[1]:
[https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect](https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect)

~~~
naasking
We studied a lot of detailed semiconductor physics in my engineering degree.
Certainly loads of details were left out, but knowing that something is
possible, and roughly along which lines it could be achieved, gets you past a
huge hurdle in the innovation process.

------
hcarvalhoalves
Halfway in, it's introducing monads and Maybe. It feels like teaching a stack
machine after talking about the visitor pattern. There's good information here,
but I'm not sure it covers the important fundamentals (such that I could give
it to a beginner).

------
Anon84
This looks really cool, thank you.

Another book I particularly like in the same style is Feynman's (lesser-known)
lectures on computation: [https://amzn.to/2SSoJaR](https://amzn.to/2SSoJaR),
where he takes you from single instructions all the way to quantum computing.

------
kragen
Neat, and it looks like it's under a 3-clause BSD license, too:
[https://github.com/isovector/reasonablypolymorphic.com/blob/...](https://github.com/isovector/reasonablypolymorphic.com/blob/master/LICENSE)

And it's tackling pretty advanced material — a bunch of category-theory stuff
that I have no idea about. This is exciting!

It looks like maybe it's unfinished:
[https://reasonablypolymorphic.com/book/tying-it-all-together...](https://reasonablypolymorphic.com/book/tying-it-all-together.html)
ends, "Really, we’re just getting started," and then (the current version of)
the book ends. What a cliffhanger ending!

It doesn't seem to yet cover circuitry; the hardware it discusses seems to be
a two-tape Turing machine, much like BF. The author seems to have been
simulating the machine by hand to generate the included execution traces.
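To make the "two tapes, much like BF" comparison concrete, here's a minimal BF
interpreter (my own illustration, not the book's machine): one read-only
instruction tape, one mutable data tape, and a pointer into each.

```python
def run_bf(program: str, inp: str = "") -> str:
    """Interpret a BF program: `program` is the instruction tape,
    `data` is the data tape; ip and dp are the two tape heads."""
    data = [0] * 30000
    ip = dp = 0                      # instruction pointer, data pointer
    inp_iter = iter(inp)
    out = []
    # Precompute matching-bracket jumps for [ and ].
    stack, jumps = [], {}
    for i, c in enumerate(program):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    while ip < len(program):
        c = program[ip]
        if c == ">":   dp += 1
        elif c == "<": dp -= 1
        elif c == "+": data[dp] = (data[dp] + 1) % 256
        elif c == "-": data[dp] = (data[dp] - 1) % 256
        elif c == ".": out.append(chr(data[dp]))
        elif c == ",": data[dp] = ord(next(inp_iter, "\0"))
        elif c == "[" and data[dp] == 0: ip = jumps[ip]  # skip loop body
        elif c == "]" and data[dp] != 0: ip = jumps[ip]  # repeat loop body
        ip += 1
    return "".join(out)

# 8 * 8 + 1 = 65 = ASCII 'A'
print(run_bf("++++++++[>++++++++<-]>+."))  # prints "A"
```

The book's machine differs in its instruction set, but the shape -- a program
tape driving reads and writes on a separate data tape -- is the same.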

I had a hard time finding the source at first:
[https://github.com/isovector/reasonablypolymorphic.com/blob/...](https://github.com/isovector/reasonablypolymorphic.com/blob/master/site/static/book/machine-diagrams.html)
has a bunch of attribute-embedded &-escaped SVG (including XML PIs!) that he
almost certainly didn't type like that. That file is duplicated at
[https://github.com/isovector/reasonablypolymorphic.com/blob/...](https://github.com/isovector/reasonablypolymorphic.com/blob/master/docs/book/machine-diagrams.html)
in the same format.

As it turns out, the source for that post is in
[https://github.com/isovector/reasonablypolymorphic.com/blob/...](https://github.com/isovector/reasonablypolymorphic.com/blob/master/site/httw/2016-11-01-machine-diagrams.markdown),
with embedded Haskell to produce the SVGs. The build scripts look like they
might be in
[https://github.com/isovector/reasonablypolymorphic.com/tree/...](https://github.com/isovector/reasonablypolymorphic.com/tree/master/src)
and
[https://github.com/isovector/reasonablypolymorphic.com/tree/...](https://github.com/isovector/reasonablypolymorphic.com/tree/master/scripts)
but I can't tell where the code for generating SVG comes from. ("stack
install" maybe? But then is it datetime, sitepipe, or strptime?) So I can't
figure out how to fix the text in the SVGs to not crash into the diagram
lines.

Careful about cloning the repo. It's a quarter gig!

~~~
javajosh
The author states on the first page that computer science has little to do
with computers, so I would not expect circuitry.

~~~
kragen
I don't disagree with him on that, but there's really quite a bit of stuff in
there about quasi-circuitry-like things: "machines" and "latches" and things
like

> _In the next chapter, we'll investigate how to make machines that change
> over time, which will be the basis for us to store information. From there
> it's just a short hop to actual computers!_

so I think that grounding abstract computation in something that can clearly
be constructed in real life is actually very much a concern of the book, even
if he's not planning to cover IEEE-754 denormals, the cost of TLB flushes, or
strategies for reducing the bit error rate due to metastability when crossing
clock domains.

------
koonsolo
I never heard of "first principles" until Elon Musk used it. (Not a native
English speaker) Now I see it everywhere.

Must be the new overhyped term. "We start from first principles, just like
Elon Musk".

After looking at Google trends, I might be wrong, so nevermind ;)
[https://trends.google.com/trends/explore?date=all&q=First%20...](https://trends.google.com/trends/explore?date=all&q=First%20principles)

~~~
KineticLensman
UK here - I remember it back in college thirty-plus years ago, as in "build X
from first principles", which could mean (e.g.) implementing an algorithm
without using library calls.

