
Basics of Compiler Design (2000) [pdf] - mpiedrav
http://hjemmesider.diku.dk/~torbenm/Basics/basics_lulu2.pdf
======
ernst_klim
_sigh_

Yet another book that spends more than a third of its pages on parsing.

Syntax doesn't matter (much); semantics matters. You need to know how to
efficiently implement various PL features within your compiler: exceptions and
algebraic effects, modules and parametric modules, parametric polymorphism and
its optimizations in the presence of modularity, fibers, method dispatch in
object-oriented languages, type inference, etc.

These books are about parsing, not about compilers. They spend a lot of time
explaining how to parse a simple, featureless language instead of just using a
parser generator and focusing on actual programming language design and
features.

Appel's _Modern Compiler Implementation in ML/Java/C_ is way better. There is
also a great course from the creator of Chez Scheme (I'm sure I've seen the
whole course available on the internet, but I can't find it anymore).

[https://www.cs.princeton.edu/~appel/modern/](https://www.cs.princeton.edu/~appel/modern/)

[http://composition.al/blog/2017/07/31/my-first-fifteen-compilers/](http://composition.al/blog/2017/07/31/my-first-fifteen-compilers/)

~~~
Athas
I don't think it matters that a third of the pages are spent on parsing - that
only covers 20% of the chapters. Parsing simply takes somewhat more space to
explain, because its figures, tables, and examples are bulkier than those for
explaining e.g. type checking. I think it is a superficial knee-jerk reaction
to classify this book as mostly being about parsing.

Also, while it is true that production compilers (especially optimising ones)
have relatively little effort spent on their parsers compared to everything
else, if you consider a _bare minimum_ compiler, parsing is going to be a
proportionally much larger part of it - especially if you do not use a
prewritten parser generator (which this book assumes you do not, since it's
partly about how you would _create_ such a generator).

I definitely agree that too many compiler resources place too much emphasis on
parsing, but I don't think this specific book is a particularly egregious
case. I have also read Appel's books, and while they are good, I think you
need to have read something else first to cover the basics.

~~~
ernst_klim
> bare minimum compiler

But who needs books on that? Especially considering that if I wanted to write
a simple toy compiler, I would rather use a parser generator (handcrafting a
parser makes sense only in the case of a serious, industrial-grade compiler).

These are books on parsers, not books on compilers. IMHO, for learning the
compiler craft it's much better to use a parser generator (or even Lisp) to
skip the syntax part and move on to the things that constitute an actual
compiler: runtime interaction, exception mechanisms, code generation, type
inference, intermediate languages, effect tracking, etc.

~~~
Athas
This is a textbook for a university course. The intended audience for this
book needs a bare minimum compiler, because that is all they will have time to
write.

You can't just say "use a parser generator". Who is supposed to write the
parser generator, then? This book is for people who want to learn how to do
all that foundational stuff, not an industrial programmer who needs to build a
good product quickly.

~~~
ernst_klim
>You can't just say "use a parser generator"

You can just give students a grammar for a parser generator, along with a
parse tree/AST. Or use a Lisp (Common Lisp or Scheme) and avoid syntax
altogether.
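To illustrate the "avoid syntax altogether" point: with Lisp-style
s-expressions, the whole reading step fits in a few lines, and what you get
back is already the AST. A toy sketch (mine, not from the book or the thread),
in Python for neutrality:

```python
# Toy sketch: "parsing" s-expressions is a few lines of tokenizing,
# and the nested-list result is directly usable as an AST.

def parse_sexpr(src):
    # Pad parentheses so a plain split() tokenizes the input.
    tokens = src.replace("(", " ( ").replace(")", " ) ").split()

    def read(pos):
        if tokens[pos] == "(":
            node, pos = [], pos + 1
            while tokens[pos] != ")":
                child, pos = read(pos)
                node.append(child)
            return node, pos + 1  # skip the closing ")"
        return tokens[pos], pos + 1

    tree, _ = read(0)
    return tree

print(parse_sexpr("(+ 1 (* 2 3))"))
# ['+', '1', ['*', '2', '3']]
```

With the syntax step this trivial, a course can spend its time on the parts
ernst_klim lists: codegen, type inference, intermediate languages, and so on.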

------
dragontamer
Most code converts an array of items into another array of items (Ex: Sorting
converts an arbitrary array into a sorted array).

If you're a more complicated fellow, you'll convert a tree of items into
another tree of items. For example, collision detection in video games is
often done with spatial partitioning trees, and raytracing with BVHs that
represent all the geometry in the scene.

The final step... the end-all, be-all of complexity... is the arbitrary graph.
That is: code that walks arbitrary graphs and converts them into other
arbitrary graphs. After all, most code has cycles, so you can't even make a
DAG assumption like in maximum flow.

That's about it. We may call it "dead code elimination" or "common
subexpression elimination", but at the end of the day, all you're doing is
running graph analysis and then converting that graph into a "more efficient"
form.
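As a concrete instance of "graph analysis, then a more efficient graph": one
simple form of dead code elimination is just reachability over the
control-flow graph. A minimal sketch (my own, with a made-up CFG, not from any
particular compiler):

```python
# Dead code elimination as graph reachability: basic blocks form a
# control-flow graph (which may contain cycles); any block unreachable
# from the entry is dead and can be dropped.

def eliminate_dead_blocks(cfg, entry):
    """cfg maps block name -> list of successor block names."""
    reachable = set()
    worklist = [entry]
    while worklist:
        block = worklist.pop()
        if block in reachable:
            continue
        reachable.add(block)
        worklist.extend(cfg.get(block, []))
    # The "more efficient form": the same graph minus dead blocks.
    return {b: [s for s in succs if s in reachable]
            for b, succs in cfg.items() if b in reachable}

cfg = {
    "entry": ["loop"],
    "loop": ["loop", "exit"],   # a cycle, so no DAG assumption holds
    "exit": [],
    "orphan": ["exit"],         # unreachable from entry: dead code
}
print(eliminate_dead_blocks(cfg, "entry"))
# {'entry': ['loop'], 'loop': ['loop', 'exit'], 'exit': []}
```

The worklist handles cycles for free, which is exactly why this kind of pass
works on arbitrary graphs rather than just DAGs.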

~~~
haecceity
Interesting perspective! Even graph analysis could be reduced to manipulating
infinite tape in the end.

------
Athas
I was taught from, and have since taught with, this book. I like that it's so
concise, where I think e.g. the Dragon Book contains way too much irrelevant
information (and also dubious and obsolete implementation advice, like global
symbol tables). Of the books I have read, I think this one strikes the best
balance between breadth and depth when it comes to the theory of compiler
design. It gives you sufficient information about how to handle every part of
a compiler, but doesn't necessarily show you a lot of options in each area.

The only weakness is that it takes a mature programmer to go from the high-
level descriptions and pseudocode in the book, to a concrete implementation.
It helps if you write in a functional language, since the pseudocode is
essentially Standard ML. I think this book might be well served by a small
companion guide on how to practically implement the designs and algorithms it
covers.

~~~
bmn__
The book's chapter 3 does not describe the current state of the art. I find it
useful only as historical background up to a certain point in time (1969?).

From a practical standpoint, however, if one wants to implement a parser (for
the purposes of a compiler), or wants to use the book to save time deciding
which parser is free of algorithmic shortcomings, one is advised to look at
more modern algorithms. Notably, anything with LR in its name, and anything
called PEG or packrat, can be outright avoided if one values one's time.

~~~
mamcx
???

Then which one? Pratt and top down by hand?

------
chrisaycock
As someone currently building a language, books like this have been critical.
_Basics of Compiler Design_ covers a lot of the common ground of compiler
construction from a more theoretical standpoint.

Other recent books I've found really helpful include:

 _Crafting Interpreters_ by Bob Nystrom (@munificent on HN)

[https://www.craftinginterpreters.com](https://www.craftinginterpreters.com)

 _Language Implementation Patterns_ by Terence Parr (creator of ANTLR)

[https://pragprog.com/book/tpdsl/language-implementation-patterns](https://pragprog.com/book/tpdsl/language-implementation-patterns)

------
dang
A thread from 2011:
[https://news.ycombinator.com/item?id=2474175](https://news.ycombinator.com/item?id=2474175)

2009:
[https://news.ycombinator.com/item?id=602188](https://news.ycombinator.com/item?id=602188)

------
lewisjoe
Folks interested in compiler design: here's a list of resources I put together
for building compilers -
[http://hexopress.com/@joe/blog/2019/04/14/awesome-compiler-resources/](http://hexopress.com/@joe/blog/2019/04/14/awesome-compiler-resources/)

------
hvidgaard
Shouldn't the title be 2010 since the edition linked was published in 2010?

------
hhjinks
What's the go-to book on compilers in this day and age?

------
LessDmesg
Saved! Nice book.

