
How recursion got into Algol 60: a comedy of errors - sytelus
https://vanemden.wordpress.com/2014/06/18/how-recursion-got-into-programming-a-comedy-of-errors-3/
======
kabdib
I've encountered knee-jerk negative reactions to recursion from most firmware
engineers.

Then one day I got one to laugh at this byte-saving routine to do some
multiplications:

    
    
        X32:    BSR X16
        X16:    BSR X8
        X8:     BSR X4
        X4:     BSR X2
        X2:     ADD A
                RET
    

Recursion. In assembly language. We were six bytes shy of filling our code
space, but we had gobs of stack, and I got about 60 bytes back from our stupid
compiler with this. When my cow-orker saw it, he thought for a moment, smiled,
said "You're one sick dude, you know that?" and we checked it in.

~~~
hellbanner
Can you explain this a bit? I read
[http://docs.oracle.com/cd/E19455-01/806-3773/instructionset-...](http://docs.oracle.com/cd/E19455-01/806-3773/instructionset-90/index.html)
but I'm not sure I follow what X32 is.. times 32?

~~~
shiro
"X32:" etc. are labels, e.g. just names of functions or destinations of jump.
Suppose you want to multiply A-reg by 4. Your code calls subroutine X4. That's
immediately calls subroutine X2 (BSR X2), which doubles A (ADD A), and return.
The return location is the next instruction of BSR X2--which is actually
overlapping the subroutine X2. It doubles A again (ADD A), and returns to your
code.
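
A minimal Python sketch of the arithmetic shiro describes (function names mirror the labels; this models only the doubling chain, not the stack mechanics):

```python
def x2(a):
    # X2: ADD A; RET -- adding the accumulator to itself doubles it
    return a + a

def x4(a):
    # X4: BSR X2, then fall through into X2's own body:
    # the doubling runs once as a subroutine call and once on the way out
    return x2(x2(a))
```

So x4(5) yields 20, with X2's two bytes of code doing double duty.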

------
dragonwriter
Shouldn't this be "how recursion got into Algol 60"? After all, part of the
background is that McCarthy wanted it in Algol, _coming off his success with
Lisp_, which had, in its earliest form, _only_ recursion and no iteration. So
recursion had already "got into programming" when this story of "How recursion
got into programming" kicks off.

~~~
sytelus
One can argue that many features of programming languages have existed in
theory since the days of Euclid and Archimedes. This article argues that ALGOL
was the first attempt to "standardize" the features of a programming language
that was expected to come into widespread use. It was designed by a committee
of big-name experts, and the "comedy" part is that a lot of these people
considered recursion unnecessary or even harmful. No one knows what would have
happened if ALGOL had indeed not supported recursion, become widespread, and
been considered the gold standard. How many years before someone would have
broken the barrier and said the experts were wrong, and that recursion is one
of the most important parts of a language?

~~~
dragonwriter
> How many years before someone would have broken the barrier and said the
> experts were wrong, and that recursion is one of the most important parts of
> a language?

Lisp was used quite a lot even after Algol 60 was standardized; presumably
that would have occurred at least as much if Algol 60 had been standardized
without support for recursion -- inevitably, actually, _more so_ if Algol 60
didn't support recursion.

Maybe including recursion in Algol 60 didn't make recursion mainstream; it
just stopped Lisp from being more mainstream and made Algol 60 and its
descendants successful.

Really, it's hard to know.

------
dvt
Ah recursion, my old nemesis. We meet again. Very interesting and informative
article, but I'm with the SUBSET people on this one:

> Do not write recursive procedures. Do not use procedures recursively.

I've probably failed 90% of recursion "gotcha" questions I've had during
interviews. It's my uninformed opinion that recursion is uninteresting and
unintuitive, which is why whenever I encounter it, I either ignore it or
refactor it. I'm exaggerating, of course, but I really do hate it. There are
some very interesting theoretical applications, however. The article briefly
mentions the Lambda Calculus (which I love), where I've always enjoyed playing
with the divergent Omega combinator[1]:

Ω = (λx. x x) (λx. x x)

I may be entirely too stupid to appreciate the beautiful intricacies of
recursive programming, but I do think that recursive programming can be
susceptible to some unique bugs, such as exceeding the stack/heap, that are
very difficult to notice prima facie (more difficult than when the same is
done via procedural code, at least).

[1]
[http://www.seas.upenn.edu/~cis500/cis500-f06/lectures/0925-2...](http://www.seas.upenn.edu/~cis500/cis500-f06/lectures/0925-2x3.pdf)

~~~
baddox
I find recursion extremely intuitive, if not necessary, for pretty much any
data structure with a bunch of things ("nodes") pointing to other things. That
includes, most notably, linked lists and trees. I can probably work up
recursive pseudocode for tree traversals pretty easily, despite not having
studied computer science or doing it at work for several years, yet I find an
iterative solution somewhat elusive and unnatural.
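
For illustration, a recursive tree traversal really is just a few lines. A Python sketch (the `Node` class here is a hypothetical minimal binary tree, not from any particular library):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def inorder(node):
    # Visit the left subtree, then this node, then the right subtree
    if node is None:
        return []
    return inorder(node.left) + [node.value] + inorder(node.right)
```

The iterative version needs an explicit stack to remember where it came from; the recursive one gets that bookkeeping for free from the call stack.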

~~~
Veedrac
Funnily, I personally think linked lists are unintuitive.

"We have a sequential block of memory, and we want to store a sequence in
it... hrm... I know, why don't we put the first element in this struct, and...
I'll deal with the rest of the items elsewhere."

~~~
Jtsummers
Linked lists would only be unintuitive if you wanted to apply them to a
sequential block of memory like that. Linked lists are used in non-sequential
blocks of memory, though you can use them in sequential blocks. It sometimes
makes sense to, but really only in an embedded systems context where you're
doing some other fun things like writing your own memory management system
using preallocated arrays as heaps so that your memory usage doesn't grow
unbounded.

~~~
Veedrac
I'm aware there are circumstances where linked lists are useful, but IMHO
they're rarely introduced alongside a context that justifies them. Unintuitive
doesn't mean bad, FWIW.

~~~
Jtsummers
I wrote that after a very, very long day and long after I should've gone to
bed (stupid needing to wear clothes, and not wanting to wear dirty clothes).

The main point I wanted to make was that you described linked lists as
unintuitive specifically in the context of having access to sequential memory
space. Linked lists are not, typically, used in that circumstance. They make
much more sense, and I think become more intuitive, when you consider them in
the context of non-sequential memory space.

------
spdionis
I don't understand what's weird and special about recursion other than the
ways it can be used to easily solve some problems.

I would find it _very_ unintuitive if a language didn't let me write a
function that calls itself. That feels like an obvious thing (if we ignore all
the implementation details).

When I first started programming I used recursion without knowing that's what
it was called, and I didn't expect it to be called anything special at
all.

~~~
munificent
> I don't understand what's weird and special about recursion

Here's one way to see why recursion is special. Look at it from the
perspective of implementing the language.

If your language _doesn't_ have recursion, you can treat every local variable
in your program as if it were a static global variable. It's local in terms of
_name scope_, but you can give it a fixed blob of memory that you pick _at
compile time_.

This is exactly what early Fortran compilers did. foo()'s local variables go
_here_, bar()'s go _here_, etc. You can do away with the stack entirely, I
believe.

So, without recursion, you can make your entire program only use statically
allocated memory. You can be sure it will never, ever _ever_ run out of
memory.
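
A toy Python sketch of the idea, and of why recursion breaks it: give a function one fixed "activation record", assigned up front, and watch a recursive call clobber it. The `frame` dict stands in for the statically allocated memory blob (all names here are invented for illustration):

```python
# One static activation record for fact(), fixed "at compile time"
frame = {"n": 0, "save": 0}

def fact():
    if frame["n"] <= 1:
        return 1
    frame["save"] = frame["n"]   # the recursive call will overwrite this...
    frame["n"] -= 1
    sub = fact()
    return frame["save"] * sub   # ...so we multiply by the wrong value

frame["n"] = 3
# fact() now returns 4 instead of 3! = 6: the single static record
# was trashed by the nested call
```

With no recursion allowed, one record per function is perfectly safe, and total memory use is known before the program runs.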

~~~
sn41
Thank you for that interesting view. Do you have a reference for the early
Fortran compiler design? I've always wondered why people say Fortran is
blazing fast, even compared to C.

~~~
munificent
I don't offhand, but that's what the article is referring to when it says:

> And this is what they wanted to remove because they wanted static allocation
> of procedure activation records.

From what I can tell with a bit of digging, machines/compilers back then
didn't even use return stacks for _subroutines_. Instead, they used self-
modifying code (!).

To return from a subroutine call, you'd do something like this:

1. When compiling a subroutine, add a JUMP followed by an address to the end.

2. When you call a subroutine, _modify the code_ to change that address to
point back to the caller.

3. Jump to the subroutine.

4. When it's done running, it hits the last JUMP and jumps back to the
caller.

With this, you don't need any concept of a call stack at all. But it obviously
prohibits recursion since the later call would trash the earlier one's return
address.
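
A toy Python model of that convention (all names hypothetical): the subroutine has a single "return address" slot that each call patches, so a nested call to the same subroutine trashes the outer call's return address:

```python
return_slot = {"sub": None}   # stands in for the address baked into the code
nested = {"armed": False}     # toggle to make run_sub call itself once

def call_sub(return_addr):
    return_slot["sub"] = return_addr   # step 2: "modify the code"
    run_sub()                          # step 3: jump to the subroutine
    return return_slot["sub"]          # step 4: the final JUMP goes here

def run_sub():
    if nested["armed"]:
        nested["armed"] = False
        call_sub("inner")   # recursion: overwrites the outer return address
```

Normally call_sub("outer") comes back to "outer"; arm the nested call and it comes back to "inner" instead -- the outer caller's return address is simply gone.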

~~~
mhink
So instead of a call _stack_ , you have a call _linked list_? That's a pretty
wild world to imagine.

~~~
FullyFunctional
Nothing wild about that. Implementations using frame pointers effectively have
exactly that, and on some architectures, like SPARC, this is mandatory.
Furthermore, some functional languages (SML and some Scheme implementations)
do this explicitly. For one thing, it makes call-with-current-continuation
much simpler to implement.

------
FullyFunctional
This is a well-known story. I do take issue with the arrogance of the author:
"Of course we cannot blame the Bauer faction for not having the experience of
later workers in the field."

Oh _please_! The reality was that the common machines of the time didn't have
today's fancy addressing modes. All memory addresses were global addresses.
Thus, the _cost_ of making all memory access relative to a stack pointer was
far from trivial. Of course, there was a work-around if you could forgo re-
entrancy (~ "thread-safety"): use static activation records, but save them to
the stack before the recursive call and restore them after. Still not exactly
free.

~~~
FullyFunctional
Oops, static activation records would probably not work with the Algol 60
parameter semantics, so the original point stands: supporting recursion is
really expensive on that breed of hardware.

------
sytelus
A funny side story is that Hoare couldn't publish the algorithm for QuickSort
until he learned ALGOL was ready to do recursion. QuickSort is one of the
algorithms that is very difficult to do without the concept of recursion.
Others are the various tree traversals. What other algorithms would be
impractical without recursion?

PS: Using a stack to simulate recursion doesn't count :).

~~~
copperx
The implementations of non-recursive Quicksort and tree traversal are not
impractical, just a little bit longer.

By the way, are recursive versions of sorting algorithms even shipped with
standard libraries of popular languages?

~~~
sytelus
Yes, the Java as well as the .NET implementations used recursive QuickSort (I
think Java recently switched to Timsort?). The implementation, however, used
only one recursive call instead of two as in textbooks, because the second
call is tail recursive and can be made iterative. This is not done just to
avoid recursion: with QuickSort you run the risk of using up O(n) stack space
in the worst case. Making the second call iterative lets you make sure
recursion always occurs on the smaller partition, thus guaranteeing O(lg n)
space.
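
That trick can be sketched in Python (a sketch using the Lomuto partition scheme, not the actual Java/.NET code): recurse into the smaller partition, then loop on the larger one, so the recursion depth stays O(lg n) even in the worst case:

```python
def partition(a, lo, hi):
    # Lomuto partition: move a[hi] (the pivot) into its final position
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def quicksort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        p = partition(a, lo, hi)
        if p - lo < hi - p:
            quicksort(a, lo, p - 1)   # recurse on the smaller side...
            lo = p + 1                # ...then loop on the larger side
        else:
            quicksort(a, p + 1, hi)
            hi = p - 1
```

Each recursive call handles at most half the current range, so no more than lg n frames can be live at once; the larger half is handled by the while loop in the same frame.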

------
kazinator
> _In fact, the first Lisp had no iteration, so that the only way to add all
> elements of a linear list was to write a recursively defined function._

So that would be how recursion "got into programming".

The comedy of errors is how recursion got into Algol, not into programming.

------
jessaustin
_Committee member F.L. Bauer registered his protest by characterizing the
addition of recursion to the language as an "Amsterdam plot"._

Is "Amsterdam plot" an idiom, or simply a colorful expression that occurred to
Bauer? I haven't seen this expression before, but TFA repeats it several times
in stylized fashion so I thought maybe I was missing the intended meaning?

~~~
bunderbunder
I'm guessing it's a reference to two of the conspirators (Dijkstra and van
Wijngaarden) being Dutch.

~~~
c3534l
That doesn't answer whether or not "Amsterdam plot" was an existing idiom.
I've never heard of it personally.

~~~
bunderbunder
Ah, gotcha. No, the use of a direct quotation implies that it's not a common
idiom - if it were then that wouldn't be necessary.

------
mjevans
I am quite thankful to those who did the right thing.

However, at the same time, this same process is used for evil quite often, as
the US Congress clearly provides many examples.

------
jrapdx3
The article was interesting and entertaining too. Good (and bad) committee
outcomes happen as described, when one or a few individuals seize the
initiative, setting out their ideas in a more or less finished form. Other
committee members may feel blindsided but in the crunch there's insufficient
time to produce alternatives. It's a lesson worth our attention.

The story evoked a personal element, going back to when I was first learning
programming in the 80's. Early on one of the serious hurdles I encountered was
recursion. (Another was pointers and indirection.) After struggling to wrap my
brain around the thing that kept calling itself, I ran across a book,
"Thinking Recursively" by Eric S. Roberts, 1986, which was enormously helpful.

I consider the book a classic. And it must be since it's still in print! I'd
_highly_ recommend it for anyone who finds recursion unintuitive. Way back
then it wasn't intuitive to me either, but with the book as a guide in time it
became as natural as breathing air.

------
ilaksh
But Lisp already had recursion, right? It was just Algol and the international
standard that was held up.

Similar to the situation with things like CoffeeScript vs. ES6 etc.

What I think we need is a standard so flexible and meta that systems can
evolve without waiting for society or standards to catch up. Which we have in
Turing-complete languages.

But we should standardize somehow on a very meta level.

~~~
Gibbon1
Winging off a comment above, I find recursion generally intuitive,
occasionally useful, and really really boring.

Offhand, years ago I used a dialect of C that had no recursion, so I can see
both sides of this argument. On the one hand, there is a pain point when you
don't have access to recursion. On the other, boy can a compiler that eschews
recursion produce small and fast code. (Because without recursion the call
graph is acyclic.)

~~~
anindyabd
I wouldn't call recursion boring. Quicksort in Haskell is probably my favorite
two lines of code:

    
    
      quicksort [] = []
      quicksort (x:rest) = quicksort [y | y <- rest, y <= x] ++ [x] ++ quicksort [y | y <- rest, y > x]

~~~
nemesisrobot
It's been discussed before, but that's not really quicksort though (since it's
not in-place)[0]

0:
[http://stackoverflow.com/q/7717691/1235548](http://stackoverflow.com/q/7717691/1235548)

~~~
emmelaich
Do we know that for sure? I mean - could a sufficiently smart compiler
actually create machine code which _did_ do it in-place?

------
rootedbox
[https://news.ycombinator.com/item?id=10131664](https://news.ycombinator.com/item?id=10131664)

------
joeblau
> In the late 1950s a committee was formed to design a universal, machine-
> independent programming language.

I swear we still have not learned from our mistakes; although now it's
_platform-independent_, with the platforms being mobile, desktop, and web.

~~~
bunderbunder
I'm not sure the dream was such a mistake. We now have that language, though
it may not be quite what Algol's designers were hoping for. It's C.

------
hello0
great article

