
It Is What It Is (And Nothing Else) – Why Recursion Still Matters - ingve
https://existentialtype.wordpress.com/2016/02/22/it-is-what-it-is-and-nothing-else/
======
pjc50
This is a prime example of the conflict between the "machine model" of
programming (a computer is a large and elaborate state machine that advances
instruction by instruction) and the "maths model" (by reasoning from a set of
axioms and rules of production we can transform information or prove facts).

The former maps onto imperative programming, and the latter onto functional
programming. It seems that different people find these models harder or easier
to understand - and then get confused when not everyone sees it that way.

~~~
astrobe_
I think that the confusion comes from the fact that one wants to mix both views;
after all, I don't remember seeing anyone being confused by the classic recursive
definitions (Fibonacci, factorial) in my school days. IIRC the factorial
function is introduced in high school. It can literally be understood by a
_kid_.
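And indeed the schoolbook definition translates directly; a minimal sketch in Python, just for illustration:

```python
def factorial(n):
    # Base case: 0! = 1 terminates the recursion.
    if n == 0:
        return 1
    # Recursive case: n! = n * (n-1)!
    return n * factorial(n - 1)

print(factorial(5))  # 120
```

The code is a word-for-word transcription of the high-school definition, which is the point: no machine model is needed to read it.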

I believe the reason why people want to expose both views at the same time is
that the math view doesn't really explain "how it works" as far as CS is
concerned.

As far as I'm concerned, I can't imagine programming without knowing how
things work at the assembly level. I think I would feel so crippled in my
understanding that I would want to learn assembly... again. I probably
could not stand working with a machine that runs on magic.

~~~
throwawayukcyb
_> I can't imagine programming without the knowledge of how things work at
assembly level._

The author's point is that recursion is a pervasive _concept_ that is
essential to understand even how modern computer hardware works:

 _> the very concept of a digital computer is grounded in recursive self-
reference (the cross-connection of gates to form a latch), which, needless to
say, does not involve a stack. Not only do real programmers use recursion,
there could not even be programmers were it not for that._

The author is not suggesting that we forget about "what's really going on".
He's rightly pointing out that you actually cannot understand what's really
going on at all without first understanding recursive self-reference.

~~~
michaelt
Personally, I think that analogy conceals more than it reveals. Recursion in
computer software requires nested definitions that eventually reach a simple
base case resulting in termination. An SRAM cell or J-K flip-flop, on the
other hand, doesn't involve nesting or reach a simple base case.

In my mind recursion is a snake that has eaten a smaller snake, whereas an
sram cell is a snake eating its own tail.

~~~
chriswarbo
> Recursion in computer software requires nested definitions that eventually
> reach a simple base case resulting in termination.

I think you've just illustrated the article's point.

You're describing well-founded recursion, which is very useful, but doesn't
include e.g. continuation-passing
([https://en.wikipedia.org/wiki/Continuation-passing_style](https://en.wikipedia.org/wiki/Continuation-passing_style))
or co-recursion
([https://en.wikipedia.org/wiki/Corecursion](https://en.wikipedia.org/wiki/Corecursion)),
hence perpetuating the myths the article is complaining about.

The really unfortunate thing is that sub-sets of these ideas keep getting re-
invented (AKA "reinventing the square wheel"), for example exception handlers
in place of continuations, iterable objects in place of co-recursive data,
etc.
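Generators are a handy way to see the "iterable objects in place of co-recursive data" point: a Python generator (used here purely as an illustration) is an iterable stand-in for a co-recursively defined infinite stream.

```python
from itertools import islice

def fibs():
    # A productive, co-recursive-style definition: there is no base case
    # and no termination; consumers demand only as many elements as they need.
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

print(list(islice(fibs(), 8)))  # [0, 1, 1, 2, 3, 5, 8, 13]
```

The stream is infinite, yet every finite observation of it is well-defined, which is exactly the property that distinguishes co-recursion from well-founded recursion.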

As for J/K flip-flops, they're recursive because their output is their own
input. For example, given co-inductive streams of `j` and `k` values (the flip-
flop inputs), we can generate a co-inductive stream of outputs `q`:

    
    
        function flipflop(init_js, init_ks) {
          function ff(js, ks, q_old) {
            var j     = car(js);
            var k     = car(ks);
            var q_new = j * not(q_old) + not(k) * q_old;
            return cons(q_new, ff(cdr(js), cdr(ks), q_new));
          }
          return ff(init_js, init_ks, 0);  // Initiate the co-recursion with 0, arbitrarily
        }
    

Whilst I've used co-recursive data for convenience, the fact that the result
`q_new` becomes the argument `q_old` for the next call is unavoidably
recursive.

------
davexunit
It's interesting to use languages that do not harshly penalize recursion by
imposing stack size limits. The upcoming GNU Guile 2.2 release, a Scheme
implementation, has a dynamically expandable stack so algorithms that are
naturally recursive can be implemented that way rather than contorted into
iterative form to please the stack limit god.

For example, in Guile 2.0, this natural definition of 'map' will cause the
stack to overflow on large lists:

    
    
        (define (map proc lst)
          (if (null? lst)
              '()
              (cons (proc (car lst))
                    (map proc (cdr lst)))))
    

The iterative solution is to construct the list backwards and reverse it as a
final step:

    
    
        (define (map proc lst)
          (let loop ((lst lst)
                     (result '()))
            (if (null? lst)
                (reverse result)
                (loop (cdr lst) (cons (proc (car lst)) result)))))
    

There's no real gain here in terms of performance or readability. You trade a
large stack for a large reverse operation at the end. Guile 2.2 allows a whole
class of recursive algorithms that were not practical before because of the
fixed-size stack (which most languages I know of have), and they perform well, too!

Needless to say I agree with this article. Recursion is an integral part of
computing, and more programming languages should provide better support for
it. Embrace it, don't avoid it!

~~~
AnimalMuppet
But can you do that with something like fibonacci? You can't just tail-call
optimize that away, can you, since there are two recursive calls?

~~~
chriswarbo
I was under the impression that Schemes generally compile down to
continuation-passing-style, and hence in the case of something like Fibonacci
one of the recursive calls would be performed as a tail call, whilst the other
would become part of the continuation, e.g. something like:

    
    
        function fib(n, k) {
          if (n == 0) return k(1);
          if (n == 1) return k(1);
          return fib(n-1, function(f_1) {
                            fib(n-2, function(f_2) {
                                       k(f_1 + f_2);
                                     });
                          });
        }

------
nbevans
I was thinking about this the other day too. Recursion in CS101 is taught in
such a bizarre way. At least it was back when I was in CS101. It was almost as
though CS tries to discourage its usage simply because "most minds cannot
understand it as easily as iterative loops".

~~~
rer0tsaz
Iterative loops are much easier to analyze and reason about, by humans and by
computers. For example, a simple counted loop always terminates. To paraphrase
Dijkstra, "I regard general recursion as an order of magnitude more complicated
than just repetition, and I don't like to crack an egg with a sledgehammer."

[https://tinyletter.com/programmingphilosophy/letters/i-don-t...](https://tinyletter.com/programmingphilosophy/letters/i-don-t-like-to-crack-an-egg-with-a-sledgehammer)

~~~
mafribe

        Iterative loops are much easier to analyze and reason about,
    

This is not the case. Loops and recursion can be translated into each other,
hence are of equal complexity in terms of reasoning. And you see that when you
write down the Hoare-logic axioms for either.

What Dijkstra had in mind was probably simple, syntactically constrained forms
of loops, like for-loops where the loop variable is read-only in the loop
body. They are simpler than general recursion, sure. But there are
corresponding simpler forms of recursion, e.g. primitive recursion or tail
recursion that are much simpler than general recursion.
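The correspondence between a constrained loop and its tail-recursive counterpart is mechanical; a sketch in Python (names are illustrative):

```python
def sum_loop(xs):
    # Syntactically constrained loop: total is the only mutable state,
    # and the loop variable x is read-only in the body.
    total = 0
    for x in xs:
        total += x
    return total

def sum_tail(xs, total=0):
    # The corresponding tail-recursive form: the accumulator plays the
    # role of the loop variable's running state; nothing else changes.
    if not xs:
        return total
    return sum_tail(xs[1:], total + xs[0])

print(sum_loop(list(range(10))), sum_tail(list(range(10))))  # 45 45
```

The loop invariant of the first version is, line for line, the precondition on the accumulator in the second, which is why the Hoare-logic reasoning comes out the same.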

~~~
AnimalMuppet
> Loops and recursion can be translated into each other...

True.

> ... hence are of equal complexity in terms of reasoning.

False. That's like saying that Stokes' Theorem says that the integral of a
differential form over the boundary of a manifold is equal to the integral of
its derivative over the whole manifold, and therefore the two integrals are
equally easy to do. In fact, the two integrals are often not equally easy to
do, _which is the primary practical reason that we care about Stokes'
Theorem_ \- it gives us a way to convert hard integrals into easier ones.

You cannot use formal equivalence to prove practical ease of doing the
problem.

But the situation isn't even that simple. A blanket statement that "loops are
much easier to analyze and reason about" is _also_ false. It depends on the
situation, and often on who's doing the reasoning. Different people think in
different ways.

~~~
Jtsummers
> You cannot use formal equivalence to prove practical ease of doing the
> problem.

Agree completely, but want to bring it back to another computing topic:
Technically, anything that can be computed in one Turing-complete language
can be computed in any other. Yet every language is _clearly_ not equally
easy to understand and reason about.

Trivially, compare computations in brainfuck to computations in C. Compare the
recursive definition of tree traversal available in lisps to languages without
recursion (some BASIC variants and others) where you have to jump through
hoops and self-manage a stack.
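The hoop-jumping is easy to see side by side; a Python sketch (a hypothetical `(value, children)` tree representation) of the natural recursive traversal next to the self-managed-stack version one is forced into without recursion:

```python
def preorder(tree):
    # Natural recursive traversal: a tree is a (value, children) pair.
    value, children = tree
    out = [value]
    for child in children:
        out.extend(preorder(child))
    return out

def preorder_manual(tree):
    # The same traversal with a self-managed explicit stack, as required
    # in languages without recursion.
    out, stack = [], [tree]
    while stack:
        value, children = stack.pop()
        out.append(value)
        stack.extend(reversed(children))  # push children so leftmost pops first
    return out

tree = (1, [(2, [(4, [])]), (3, [])])
print(preorder(tree), preorder_manual(tree))  # [1, 2, 4, 3] [1, 2, 4, 3]
```

Both compute the same order; the second just makes you maintain by hand the bookkeeping the first gets for free.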

~~~
mafribe

        compare computations in brainfuck to computations in C. 
    

I'd suggest that this is an orthogonal issue. It's better to compare two
otherwise identical languages, one where Turing-completeness is achieved by
loops, another where the same is done by recursion. (Or a language that offers
both). Then _formal_ reasoning is of the same complexity.

------
munin
only bob harper would say that the stack is difficult for students to
understand but propose structural operational semantics as a solution.

------
vdnkh
I suppose I had trouble with recursion until I understood that each call
should strive to be stateless - meaning, you don't try and remember previous
results. Each new call should be just the same as the initial. You only act on
the current set and try to reduce the problem further. Trying to remember
previous iterations results in a lot of headaches.
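That discipline can be shown in a few lines; a Python sketch (the function name is illustrative) where each call acts only on its own arguments and reduces the problem:

```python
def largest(xs):
    # Each call is "stateless": it looks only at its current arguments,
    # never at results remembered from previous calls.
    if len(xs) == 1:
        return xs[0]          # smallest possible problem: answer directly
    rest = largest(xs[1:])    # reduce the problem and trust the recursion
    return xs[0] if xs[0] > rest else rest

print(largest([3, 1, 4, 1, 5, 9, 2, 6]))  # 9
```

Every call is shaped exactly like the initial one, which is what makes the definition easy to check by looking at a single call in isolation.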

------
scotty79
> Implementing recursion does not require a stack, nor does it require any
> special provisions for managing the stack.

When I wanted recursion in Atari BASIC it involved implementing stacks for all
my local variables (because there's no such thing as a local variable in Atari
BASIC).

Thinking about the task in terms of bits of work that should be put on a stack
or in the queue felt way more controllable than recursion.

~~~
LoSboccacc
Once I needed to reimplement recursion in MATLAB because the stack was too
small. It was an interesting experience: I had a matrix repurposed as a stack
and did all the push/pop manually while the recursion ran as a loop. Goes to
show that recursion isn't that essential if one knows how to split state, data
and code properly.

------
venomsnake
Recursion is just induction backwards.

But you need to know what a stack is - because it gives you the practical
depth to which you can recurse. Ditto with tail recursion optimization.
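The practical depth limit is easy to observe; a CPython sketch (CPython in particular does not perform tail-call optimization, so even a tail-recursive form hits the same wall):

```python
import sys

def depth(n):
    # Each call consumes a stack frame; the interpreter's recursion limit
    # is the practical bound on how deep we can go.
    return 1 if n == 0 else 1 + depth(n - 1)

limit = sys.getrecursionlimit()
try:
    depth(limit * 2)   # well past the limit
except RecursionError:
    print("hit the stack limit at depth <=", limit * 2)
```

In a language with guaranteed tail calls (Scheme, for instance) a tail-recursive version of this would run in constant stack space, which is exactly why knowing what the stack is matters.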

And that is why C should be mandatory. You must be able to translate the
mathematical concepts into something real and tangible.

