
Algorithms, by Jeff Erickson - bdr
http://jeffe.cs.illinois.edu/teaching/algorithms/
======
primitivesuave
Jeff Erickson was my algorithms professor in 2012. He exemplifies the
articulate, passionate educator I wish I'd had for my other CS subjects. I
recognize many of these notes, having read them many times in preparation for
some quite difficult exams. A fun anecdote shared among people who've taken the
class is the 25% credit given on any exam question just for writing "I don't
know": effectively a reward for acknowledging your own shortcomings and for
saving the TA the time to decipher a bullshit answer.

Professor Erickson, if you're reading this, thank you for being the best
educator of my college days and for making your beautifully-written notes
available to everyone.

~~~
gricardo99
>25% credit given on any exam question just for writing "I don't know",
effectively a reward for acknowledging your own shortcoming and for saving the
TA the time to decipher a bullshit answer.

That’s brilliant, yet I’ve never heard of it. Should be standard scoring for
written exams.

~~~
Vaslo
Random other point of brilliance I've seen: Our Organic Chem teacher (who was
loved universally in the Program) had a rule about test corrections. If you
wanted a correction to something you believed you should get credit on, he
would only offer to regrade your WHOLE test, which meant you could actually
get less points on the regrade because it was he and not a TA regrading (could
have worked both ways). It really scared off all those one-off "Can I get an
extra point here" requests in a 300 person class.

~~~
SilasX
I’ve never understood why that’s a good policy. You’re (vindictively)
punishing the student for pointing out — albeit selectively — where the
teacher, supposedly a bedrock of truth on the course material, has falsely
labeled your “training data”.

~~~
nichochar
I think you're running under the assumption that all students making the
request are doing it for the right reasons.

I could be wrong, but my intuition goes the other way and I'm assuming most
students do it to get a better grade, not learn more.

~~~
solveit
Yeah no, you don't get to say students should care more about learning than
grades when jobs, opportunities, and _scholarships given by the very
university that claims learning is more important_ are riding on grades.

It is the professor's responsibility to grade accurately, and honestly if
there is so much inconsistency in grading that there's a significant
probability that the student will lose marks despite having pointed out a
reason they should gain marks, the professor is screwing up very badly[1].
Don't pin that on the student not having noble intentions or whatever.

[1]: Or at least that's the case in STEM. My humanities friends tell me that
grading is much more subjective there, which is a bit disturbing, but I'm hardly
qualified to make such judgments.

~~~
andrewflnr
As long as we're getting all realpolitik about it, the reason is that lots of
students are assholes who will run roughshod over any professor who shows
weakness. Source: was a student, watched it happen routinely.

Meanwhile, in a less politically charged situation, it's pretty normal, or at
least smart, to go back over any area where you messed up with a fine-toothed
comb.

If (and this is a rather pessimistic assumption) the student is likely to
lower their grade on a re-test, in my experience that's more likely to be
because the original grader was being generous in the face of ambiguity,
rather than systemic random errors. That's because the graders are mostly nice
people. If that's not good enough for someone, they damn well better have a
good reason for it. I have no problem with policy that enforces this. The
politics are a distraction from the ethical question.

~~~
SilasX
That policy would make sense, but it involves major caveats that aren't
clear from the original description. It works because it's specifically
built around a "generosity buffer" that, by design, only benefits your grade,
and thus doesn't indicate the teacher was falsely labeling answers as
correct when they weren't.

That's not the same thing as the original, which implied "my grading is so
random, hope you're lucky on the rescore".

Edit: I had a high school teacher with a policy where she’d add X points to
every score from the get-go, and you’d lose those if you objected to the
grading, so you’d only object to misgrades of more than X. But that’s not the same as
the original “okay, let’s roll my truth-recognizer dice again!”

------
_underflow_
> Please do not ask me for solutions to the exercises. Even if you are [an]
> instructor, I will say no.

That's kind of a bummer. I like to be able to check my answers when teaching
myself things.

Am I somehow alone in that?

~~~
jefferickson
Hi, I'm the author.

I'm honestly seriously torn about this. There is a real tension between the
pedagogical needs of students in formal classrooms and the pedagogical needs
of self-learners. I've chosen to aim for the former. Yes, I know it's a
bummer.

(From experience) providing solutions interferes with the learning process of
my own students at Illinois. I have to change up homeworks and exam questions
every semester, because otherwise students will look up and copy/memorize the
answers instead of trying to figure them out, which means they do worse on
the exams where they HAVE to figure things out.

Also, a significant majority of the requests I get for solutions are from
university students who want to cheat.

I _do_ provide solutions to the homework and exam and lab problems I assign in
any particular class, after the fact. (And I release those solutions publicly
once the semester ends, despite the protests of colleagues at other
universities.) And I have started providing model solutions in the homeworks,
so that the students know what _kind_ of answers I'm looking for. (See the
course materials page, under CS 374.)

In practice, my stance is becoming increasingly moot, as more of my official
solutions get uploaded to places like CourseHero and Koofers and their many
foreign-language equivalents.

I am very likely to include solutions for a subset of problems (maybe three or
four per chapter) in a future edition. But that will take a significant amount
of time, and I wanted to get something out the door.

~~~
aklemm
You could consider providing answers and feedback in a premium-priced forum or
email subscription.

~~~
jefferickson
Ew. Ew ew ew. No.

Wait, did I say that aloud? Sorry, I meant "I'm already busy enough, thanks."

------
alsadi
The logo on the right is the Arabic letters of the word Algorithmi (al-
Khwarizmi), the Persian Muslim scholar after whom algorithms and logarithms
were named.

~~~
dhimes
Logo on the right of what? All I see is a geometric design in the middle of
the cover. It doesn't look Arabic and it seems like it would be very
cumbersome for someone to sign as a name.

~~~
alsadi
It's a well-known calligraphy style called Kufic; just google it.

[https://www.google.com/search?q=arabic+calligraphy+kufi&clie...](https://www.google.com/search?q=arabic+calligraphy+kufi&client=firefox-b-ab&source=lnms&tbm=isch&sa=X&ved=0ahUKEwiHr_j9q8_fAhWBGewKHXa8BTcQ_AUIDigB&biw=1366&bih=667)

~~~
dhimes
Thank you.

------
Twisol
I am, in all likelihood, _not_ in the target audience (having already spent
much time with the material being presented), but this textbook has been a
_joy_ to read so far. I am only up to page 14, and I already need two hands
(in base 1...) to count the number of times I've laughed aloud or cheered a
particular point being raised.

Most algorithms books are _dry_, or they're obsessed with particular formal
details, or (worse) they _implicitly_ include optimizations in the algorithm
without explaining what is needed for _correctness_ and what is needed for
_efficiency_. Having tutored from one of the books mentioned in the prologue, I
know it can be a real struggle to gain a true intuition for algorithms when you
can't yet tell the difference. But the sheer personality contained within this
book is infectious, and it really is something of a page-turner. (Lest you
think I haven't gotten to the stuff that "matters", I _have_ read through the
chapter on depth-first search -- again, an enjoyable read!)

What a great book. I'm definitely recommending this to my friends.

~~~
tptacek
Same. I fell in love with it instantly and permanently when I got to the "NO,
STOP, YOU'RE DONE" part in the Towers of Hanoi. I skipped around the book and
_the whole thing_ is like that. It's simultaneously concise and witty _in
service of the material_.

The chapter problems look fantastic, too, and are just as well-written (and
they cover a lot of ground).

It's really something!

------
otachack
I took Jeff's Algorithms class in college and he's one of the best professors
I've had. I still struggled through the class but definitely made it through
with a great experience.

~~~
kyteland
I took CS 173 and 373 from Jeff nearly 20 years ago, with 273 from another
great professor. That course sequence along with Combinatorial Game Theory
(which I took on a lark) has had a greater impact on me than the rest of my
university experience combined. A lot of that is due to the quality of the
professors I was lucky enough to have.

~~~
jimhefferon
Perhaps you might, when you get a chance, email him a brief note to tell him
that. That'll make a person's day.

~~~
mesaframe
You know what, your book on linear algebra is really great too. I tried to
understand LA from so many books, but yours was the one that made sense
to me.

~~~
jimhefferon
:-) Thank you.

------
yoler
Isn’t this the guy that is famous for being admitted to a PhD program with an
exceptionally low GPA?

If so, why is he the exception and why aren’t more PhD programs looking for
non-traditional talent?

Edit: I read his blog post. It gave me more insight. It looks possible for
people with grades like those to be admitted even today, but they seem to
need a cheerleader on the inside who will help them.

~~~
davidgrenier
I graduated with a Comp. Sci. bachelor's with a similar GPA. Spent around 10
years as a programmer in industry and came back to get into the master's
program. I was all but laughed off (a good thing) and told I would have to do
another bachelor's.

Sold my house, got rid of all my stuff, and enrolled in a pure math
bachelor's. Best decision of my life.

I thought that in my mid-30s, with a lot more discipline, being able to work
longer and harder would've made up for the shit grades. I'm extremely thankful
to the department head for waving me off like that, because it would've been a
disaster.

Clearly Jeff could handle himself, however I think one must really know what
they're doing. I thought I'd end up doing the 3-year math program in 2...
it'll turn out to be 3.5 with good grades this time around. After that I know
I'll be very solid for master/phd programs.

~~~
b3b0p
> I graduated with a Comp. Sci. bachelor's with a similar GPA. Spent around 10
> years as a programmer in industry and came back to get into the master's
> program. I was all but laughed off (a good thing) and told I would have to
> do another bachelor's.

Can you expand on this? I'm in a similar position right now. I want to go back
and get my master's. Graduated in CS about 12 years ago, mid-30s. What do you
mean about having to do another bachelor's? I haven't heard this before.

> Sold my house, got rid of all my stuff and enrolled in pure math bachelor's.
> Best decision of my life.

I recently bought a condo downtown: one, because I never plan to move
again and rent goes up at least 3-5%/year, and two, because I can easily rent
it out if I do end up leaving. The University of Minnesota is only a mile away
from me at the moment. The mortgage is probably a little high for a grad
student/TA salary, considering I'm making well into the six digits right now.
This is my only debt, and all my stuff isn't much: mostly things I can't
really sell for much now or that are required to work and live (computer,
desk, monitor, clothes, eating, sleeping). I have a couch and a TV to relax
outside of work.

~~~
davidgrenier
My GPA was 2.37, and I would've had to take a huge number of classes to raise
it to the cutoff point of 2.7 (B-). Starting from 90 credits' worth of
classes, you'd need another 90 credits at 3.0 (a B average) to make up the
difference, or fewer if you get better grades, and even then you'd be at the
low end of what they accept.
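A back-of-envelope check of that arithmetic (my own sketch; the numbers come from the comment above):

```python
# Credit-weighted GPA average: existing transcript plus hypothetical new
# coursework at a B average. Numbers are from the comment, not official data.
old_credits, old_gpa = 90, 2.37   # existing transcript
new_credits, new_gpa = 90, 3.00   # extra coursework at a B average

combined = (old_credits * old_gpa + new_credits * new_gpa) / (old_credits + new_credits)
print(round(combined, 3))  # 2.685: even 90 more credits at 3.0 lands just shy of 2.7
```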

Now, I presume if you did 2-3 semesters with a B+/A- GPA, they would take that
as evidence you can handle the master's and let you in. However, I'm enjoying
math way too much right now, and there are several classes (Algebra 1-3,
Topology, Differential Equations, and several classes in statistics I intend
to pick in the option block) that will come in handy if I head into an ML
master's. I'm also interested in Type Theory and all of Discrete Mathematics.
Also, I feel math is a great program for improving at general problem-solving.

The first semester was really challenging; some classes I had in fact already
done (Calculus 1 and Linear Algebra) turned out to be surprisingly difficult.
I'd say around halfway through the second semester I felt I had gotten my
younger brain back.

I don't have expensive habits: I have two restaurant meals a week, own a car,
and live in the university residences (rent is $390/month, parking $850/year).
My cost of living seems to be around 12k CAD/year, and I'm Canadian (tuition
is $1600/semester), so I really have no idea how much you'd have to have saved
up if you are in the USA.

~~~
b3b0p
My GPA 12 years ago when I finished was 3.5, I think. Thereabouts, anyway. I
figured you were referring more to the length of time you were out of school,
not your GPA.

My mortgage is far north of $390, haha. I don't have or need a car, though. I
lease out my parking spot for $150/month or more. If I go and get accepted,
I'll either have work pay for it or see about a scholarship or something. My
brother and my best friend got free rides through their master's and PhD.

------
brianzelip
> ♠ Black spades indicate problems that require a significant amount of
> gruntwork and/or coding. These are rare.

> ★ Orange stars indicate that you are eating Lucky Charms that were
> manufactured before 1998. Ew.

(p. vi)

:)

Really like the book already. The preface falls under the category of advice
on learning to learn. Thanks for all your work. #stealthisbook

~~~
ZeroCool2u
> Caveat Lector! Of course, none of those people should be blamed for any
> flaws in the resulting book. Despite many rounds of revision and editing,
> this book contains many mistakes, bugs, gaffes, omissions, snafus, kludges,
> typos, mathos, grammaros, thinkos, brain farts, poor design decisions,
> historical inaccuracies, anachronisms, inconsistencies, exaggerations,
> dithering, blather, distortions, oversimplification, nonsense, garbage,
> cruft, junk, and outright lies, all of which are entirely Steve Skiena’s
> fault.

Coming from a class where we used a decidedly poor textbook for algorithms,
this looks like a joy to read.

------
konart
Maybe it's because English is not my native language, but I honestly find this
way of presenting the material rather confusing.

PS: Oh, and of course all those 'It’s quite easy to show that the...' and
similar 'by this point it should be obvious' remarks. No, it is not... I guess
I will never understand or use these algorithms, even though I'd like to.

~~~
jefferickson
Hi, I'm the author. If you find any particular claim of "obviousness" unclear,
submit an issue request!

But please don't confuse "straightforward" with "obvious". Sometimes I
deliberately gloss over mechanical details because I think they're a
distraction from the main point. I'm NOT claiming that you should immediately
know how to fill in the details; I'm claiming that filling in the details is
boring.

~~~
fwip
I would agree that words like "obvious" or "trivial" can be disheartening to a
learner.

If it's obvious to the reader, they know it whether or not you tell them it's
obvious. If it's not obvious to them, then they will try to figure it out.
Telling this person "it's obvious" only serves to make them feel bad about not
getting it right away.

~~~
mden
I felt that way when studying math in college, but after some time I actually
came to like the use of "clearly" and the like as it can be used as a check on
whether you've spent enough time internalizing the previous information. It's
one thing to have a text or a person hold your hand through algorithms or
theorems, it's another to be able to do it yourself. So getting hit with a
"clearly" that feels unjustified is often a signal to let go of the guiding
hand and go back and review until the statement in question does become clear.

~~~
konart
Sometimes this is the case, yes, sometimes it is not.

An example:
[http://jeffe.cs.illinois.edu/teaching/algorithms/book/Algori...](http://jeffe.cs.illinois.edu/teaching/algorithms/book/Algorithms-JeffE-2up.pdf), page 15:

> It’s quite easy to show that the singing time is Θ(n²); in particular, the
> singer mentions the name of a gift ∑_{i=1}^{n} i = n(n+1)/2 times (counting
> the partridge in the pear tree).

I'm pretty sure that I still remember how to read the formula, and I even know
what Θ(n²) means, but it's still unclear to me how we get n² and this formula
from the "NDaysOfChristmas(gifts[2..n]):" example.
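For what it's worth, here is one way (my own sketch, not the book's code) to see where the count comes from: in verse i of the song the singer names gifts 1 through i, so the total number of gift mentions is 1 + 2 + ... + n, and that sum equals n(n+1)/2, which grows like n².

```python
# Count gift mentions in "n Days of Christmas": verse i names i gifts
# (one per gift, counting the partridge every time), so the total is
# the sum 1 + 2 + ... + n, whose closed form is n(n+1)/2.

def gifts_sung(n):
    return sum(i for i in range(1, n + 1))

for n in (1, 5, 12):
    assert gifts_sung(n) == n * (n + 1) // 2

print(gifts_sung(12))  # 78 gift mentions in the full 12-day song
```

Since n(n+1)/2 = (n² + n)/2, the leading term is quadratic, which is exactly the Θ(n²) claim.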

------
Zanta
I used these lectures as a method for learning algorithms as a phys/mech eng
with no CS background. I found them incredibly challenging but the lectures
were written exceptionally well. On an internet with thousands of resources on
this material, this was the very best I found.

------
azangru
From the second page (verso page?):

> Download this book at
> [http://jeffe.cs.illinois.edu/teaching/algorithms/](http://jeffe.cs.illinois.edu/teaching/algorithms/)
> or [http://algorithms.wtf](http://algorithms.wtf)

algorithms.wtf is a beautiful url!

------
raylangivens
I have been reading through these and find them really unique, in a good way.
I got a new perspective on a lot of stuff I'd already studied; I have a
Bachelor's in CS and graduated 1.5 years ago.

It doesn't feel formal like a textbook, and yet it doesn't sacrifice
mathematical rigor. I'll be trying out the exercise problems, which seem
equally daunting but fun.

I also submitted an issue request on GitHub. And Jeff, I have a few questions
for you that I put on Quora; I've A2A'd you.

Lastly, thanks for taking the time out and putting content like this out for
free. It helps millions of autodidacts like me.

------
nameless912
Jeff's CS 374 was one of my favorite classes at U of I. It really opened up my
mind to thinking about Computer Science as a branch of mathematics, and
completely changed how I approach my work today.

The book form of these lectures was in an EARLY rough draft when I took the
class, though: we were proofing chapters for him!
------
kizer
I had him as a professor as well. He’s a brilliant guy and educator, but could
be a little rough with questions; i.e. making you feel a little dumb. His
notes though were always excellent.

------
mcguire
" _Sedgwick’s reformulation [of AA trees] requires that no right child is red.
Whatever. Andersson and Sedgwick are strangely silent on which end of the egg
to eat first._ "

------
azhenley
Jeff is also a frequent poster on Academia.StackExchange. He helped me quite a
bit by answering my questions about grad school life.

[https://academia.stackexchange.com/users/65/jeffe](https://academia.stackexchange.com/users/65/jeffe)

------
neurotrace
> Chapter 0 uses induction, and whenever Chapter n−1 uses induction, so does
> Chapter n

I love this.

------
mpurham
Wow students are so lucky for the sheer number of resources available today.
Great post and thanks for sharing!

------
webmaven
Great resource, though I wish it was available as an EPUB (or MOBI, or AZW3).

~~~
jefferickson
You can get EPUB and MOBI versions from the Internet Archive (auto-converted
from my uploaded PDF), but I can't vouch for their quality.

All the fonts are baked into the PDF, so it should be readable anywhere; if it
isn't, please submit a bug report!

But if you're looking for a format that lets you reflow the text, by changing
the margins or font or text size, you're out of luck. The only way to write
something like that is to bake it in from the beginning. That's easy for pure
text, but hard to impossible for technical documents with lots of displayed
equations, big hard-formatted boxes of text (i.e., algorithms), and the like.

(Boaz Barak managed it by writing his Modern Complexity Theory book entirely
in Markdown. The mind boggles.)

~~~
atombender
I don't know what you use to typeset the book, as I can't find any source
files, but isn't LaTeX really good at this?

I'm not sure what makes it hard to reflow. The text seems to be mostly
paragraphs with figures, inline formulas and footnotes; pretty basic stuff,
no?

(O'Reilly's books were famously typeset with Troff/Groff for many years,
though I don't know how painful that was or how advanced their typesetting
needs were.)

All I know is that I'd love a version that worked on smaller devices. I tried
the Kindle version reformatted by archive.org, but it has lost most of the
formatting, making it almost completely unreadable.

I'd especially like an HTML version of the book that was fully hyperlinked,
with zoomable vector figures, footnotes hidden away in popups, etc.

~~~
webmaven
_> (O'Reilly's books were famously typeset with Troff/Groff for many years,
though I don't know how painful that was or how advanced their typesetting
needs were.)_

I believe that O'Reilly Media has used AsciiDoc[0] as their preferred internal
format for quite a few years. Formulas are supported through ASCIIMathML[1].

More recently, they seem to have transitioned to HTMLBook[2] as a preferred
source format for systems like Atlas[3].

[0] ASCIIDoc is basically a Markdown-like (or ReStructuredText-like) format
that is feature-equivalent to DocBook XML:
[http://asciidoc.org/](http://asciidoc.org/)

[1]
[http://asciidoc.org/asciimathml.html](http://asciidoc.org/asciimathml.html)

[2]
[http://oreillymedia.github.io/HTMLBook/](http://oreillymedia.github.io/HTMLBook/)

[3] [https://atlas.oreilly.com/](https://atlas.oreilly.com/)

------
Topolomancer
I love Jeff Erickson's work and would also like to recommend his notes on
Computational Topology [1] for further consumption.

[1]:
[http://jeffe.cs.illinois.edu/teaching/comptop/2009/schedule....](http://jeffe.cs.illinois.edu/teaching/comptop/2009/schedule.html)

------
skizm
What the heck is the trivial one-liner to check who will win a chess game,
given both players play perfectly?

~~~
plin25
I believe this is a trick question. The "standard" chess board is 8x8, with 32
possible pieces. This gives you a constant (albeit absurdly large) number of
configurations, which means that the one-line solution "brute force" is an
O(1) algorithm.
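To make the "constant state space means O(1)" point concrete, here is a toy stand-in (my sketch; tic-tac-toe instead of chess, since its state space is small enough to actually enumerate). Exhaustive minimax over a fixed, finite game runs in constant time in the formal sense, however large that constant is.

```python
from functools import lru_cache

# Tic-tac-toe, like chess, has a fixed, finite set of positions, so
# exhaustively solving it by minimax is formally an O(1) computation.

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in WIN_LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def value(board, player):
    # +1 if X wins with best play from here, -1 if O wins, 0 for a draw.
    w = winner(board)
    if w == 'X':
        return 1
    if w == 'O':
        return -1
    if ' ' not in board:
        return 0
    nxt = 'O' if player == 'X' else 'X'
    vals = [value(board[:i] + player + board[i + 1:], nxt)
            for i, sq in enumerate(board) if sq == ' ']
    return max(vals) if player == 'X' else min(vals)

print(value(' ' * 9, 'X'))  # 0: with perfect play, tic-tac-toe is a draw
```

The same one-liner spirit applies to chess: since the answer is a fixed constant, `return PRECOMPUTED_ANSWER` is a valid O(1) "algorithm"; the catch is nobody can compute that constant.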

------
BillyBreen
Love his writing style. Accessible yet detailed with fun historical tangents
to set context.

------
pseudonom-
Does anyone have any impressions on how this compares to CLRS?
([https://en.wikipedia.org/wiki/Introduction_to_Algorithms](https://en.wikipedia.org/wiki/Introduction_to_Algorithms))

~~~
rbkillea
By Ctrl+F'ing, I find 5 mentions of the word "master", none of which are the
master theorem. I prefer this to CLRS as, while it's a neat trick, it tends to
result in a bunch of people memorising the cases (and taking a "because the
book told me to" level of understanding away from that part of the course).

~~~
jefferickson
Exactly. The students should be the masters, not the theorem.

------
rbkillea
This has always bugged me: what reasons outside of convention do we have for
preferring O(V + E) to O(E) on algorithms that only make sense in the context
of a single connected component? I know this is slightly OT, but tangentially
related.

------
lbj
Absolute treasure trove, thanks for sharing!

------
ausjke
After a quick browse, it seems a little too academic for daily programmers
like myself.

I would love to learn more about practical dynamic programming these days; I
hope there's a book that covers it extensively.

~~~
ibash
Go back and read it - it’s not!

------
sidravi1
The historic preludes to the chapters are awesome. The dynamic programming
chapter starts with an intro to the study of meters in Sanskrit poetry.

------
dksf
I love how he offers the mnemonic
[http://algorithms.wtf](http://algorithms.wtf)

Bravo, Jeff!

------
dotancohen
This is a great resource. Though it is dated this month, there was a mention
of it two years ago, and a short HN discussion ensued:
[https://news.ycombinator.com/item?id=12873426](https://news.ycombinator.com/item?id=12873426)

------
mnadel
I had the privilege of being an undergrad in Jeff’s class in the late 90s.
IIRC, it was his first year teaching. On the last day of class he received a
standing ovation. The average grade on the final exam was ~50%. And yet he
received a standing ovation. He’s _that_ good.

------
curiousDog
The graphs chapter is my favorite. It's not as rudimentary as in other
textbooks.

------
nythrowaway
I managed to get an A- in Erickson's 373 as an undergrad (grads took the same
class but had their own curve) in 2001. Great class, great experience, great
teacher. It pushed me to do algorithms at a difficulty I didn't think possible.

------
yangdehang
I don't know why you guys think this book is so good; I don't see anything new
in it, nor can I appreciate how he composed the content. It is just one of
those random textbooks on this topic.

------
briefcomment
Are algorithms useful to learn for a non-programmer? Is there a benefit to
thinking through what is presented in a book like this over solving general
problems in a day-to-day context?

~~~
clishem
Please find something you love doing and learn more about that instead of
picking up random things.

~~~
briefcomment
Sound advice. Thinking through the scope of a programming problem to come up
with an algorithm sounds like a productive way of using your mind, but may be
a waste of time if you don't intend on programming.

------
wwarner
The section on dynamic programming is exceptionally clear.

------
setheron
I would appreciate a version of the book with the extended material as well!
Is there a way to pre-pay for an advance copy of the published book?

------
sqlacid
Copyright question: the copyright page says the draft was published
12/29/2018, but the copyright is 2019. Doesn't that mean somebody could have
taken the work and published it as their own, with copyright 2018, on
12/30/2018, which would be earlier than the author's 2019 claim? I'm not a
copyright expert by any means, but I am always troubled when I see "Copyright
$CURRENT_DATE" in electronic works; I thought it was supposed to reflect the
earliest date of claim.

~~~
jefferickson
Oops. I'd say submit a bug report, but as of yesterday it became moot.

As others have said, the copyright notice is only a courtesy/reminder. I've
held the copyright on all this stuff from the moment I started writing it (and
distributing it) 20 years ago.

------
kyriakos
great job and thanks for keeping this free

------
sidcool
How does this compare to CLRS? I am neck deep in reading CLRS..

------
vowelless
The lecture on Matroids is pretty good.

------
mlevental
gonna take this opportunity to ask for advice: i have an MS in CS and i've
gone through all of CLRS twice (yes, really all of it, and really twice: once
for my grad algos class and once in prep for interviews) and i still don't
have whatever intuition i need to be able to effortlessly do DP. it's honestly
kind of maddening: mincut/maxflow, RSA, knuth-morris-pratt, etc. are all
completely obvious to me and i can whip them out pretty much effortlessly,
but with DP i struggle to find the optimal substructure and formulate things
bottom-up (yes, i can memoize, but that's not clever enough for you know who).
what's the magic combinatorial perspective/intuition that enables people to
construct dp solutions so quickly??? yes, i've read vazirani and gone through
the clemson examples and skiena and sedgwick and whatever, but the problem is
they're mostly all rehashings of the same solutions/perspectives. looking
forward to this book's perspective.

~~~
jefferickson
Forget about efficiency for the moment and focus on discovering the underlying
recursive problem.

LOTS of people struggle with dynamic programming. But in my experience, 90% of
the difficulty with dynamic programming is actually discomfort with recursion,
which is why I talk about recursive backtracking first.

Try reading Chapter 2. My goal in that chapter is to show the process of
deriving recursive solutions---how to think about the problem, and what
questions to ask---rather than just presenting the solution as a fait
accompli.

I do have to assume that you believe in the Recursion Fairy, though. That's
probably the hardest step. Computer scientists are TERRIBLE at delegating.
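For readers following along, the two-step workflow described above can be sketched like this (a toy example of mine using Fibonacci, not code from the book): first write the plainly correct recurrence and trust the recursion; only then add memoization, without changing the logic at all.

```python
from functools import lru_cache

def fib(n):
    # Step 1: correctness only. The recurrence is obviously right;
    # ignore that this takes exponential time.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    # Step 2: the exact same recurrence, memoized so each distinct
    # subproblem is solved only once (a linear number of calls).
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib(10), fib_memo(50))  # 55 12586269025
```

The point of the exercise is that step 2 is mechanical; all the thinking, and all the difficulty, lives in step 1.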

~~~
DonaldPShimoda
> Recursion Fairy

I mentioned this elsewhere but figured I'd bring it up here since you're the
author.

I've tried explaining recursion in a lot of ways like this, such as "just
assume it will work", "trust yourself", and "turn off your brain". (The first
two are paraphrases from Matthew Flatt, the latter is from Will Byrd.)

I think "Recursion Fairy" is my favorite way to phrase the same idea. I think
there's something about the nature of invoking a sense of _magic_ in the
phrasing that might help people really believe that it's okay to just let the
recursion do its thing and not think about it too much. I'll definitely be
using "Recursion Fairy" when (if) I end up explaining recursion again.

Thanks for making your material available for free! Cheers!

