Then the author introduced the iterative Fibonacci algorithm and formally proved that the iterative and recursive algorithms were equivalent. But the author never explained where that iterative algorithm came from: there was no general-purpose tool, like memoization, that could be used to derive the iterative form from the recursive form. I still remember the intellectual disappointment.
Now, with co-recursion and a couple of other techniques, that gap is finally bridged. Using co-recursion, I can derive the stream algorithm directly from the recursive definition. And then I can apply deforestation techniques and tail-call optimization to derive the iterative algorithm from the stream algorithm. That's a pretty powerful intellectual toolset, don't you think?
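For concreteness, here is the standard corecursive Fibonacci stream in Haskell, a sketch of the kind of definition I mean (the names fibs and fib are mine, not from any book):

```haskell
-- Corecursive Fibonacci: the stream is defined in terms of itself,
-- and laziness lets each element be produced on demand.
fibs :: [Integer]
fibs = 0 : 1 : zipWith (+) fibs (tail fibs)

-- Indexing into the stream recovers an ordinary function.
fib :: Int -> Integer
fib n = fibs !! n
```

Each element past the second is the sum of the stream with its own tail, which is exactly the recursive definition read as a stream equation.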
I think I've read that book too.
For the benefit of anyone following along at home: the unproven-but-accepted Church-Turing thesis states that any algorithm that can be carried out on a Turing machine (or, equivalently, in any Turing-complete language) can be expressed as a partial recursive function and as a term in the lambda calculus, and vice versa.
And then I can apply deforestation techniques and tail-call optimization to derive the iterative algorithm from the stream algorithm.
How do you use deforestation and tail-call optimization to derive an iterative function from a stream in the general case?
You're also pulling a false comparison: you've jumped from a 'general' algorithm to "what a corecursive function is evaluated as under Haskell's semantics".
Corecursion builds a data structure. A TCO function, for example, won't produce that kind of output. The corecursive function could only be directly equivalent to the linear-time, constant-memory TCO function in a lazily-evaluated runtime (and even that is shaky: tail-recursive functions in Haskell can still blow up the stack, because lazy evaluation accumulates thunks).
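To illustrate that last point, here's a minimal Haskell sketch (the names sumLazy and sumStrict are mine): a tail call alone doesn't buy constant space; you also have to force the accumulator, or the "loop" just builds a chain of deferred additions.

```haskell
-- Tail recursive, but the accumulator is lazy: each step wraps
-- acc in another (+) thunk, which can exhaust the stack when the
-- whole chain is finally forced.
sumLazy :: Int -> Int -> Int
sumLazy acc 0 = acc
sumLazy acc n = sumLazy (acc + n) (n - 1)

-- Forcing the accumulator at each step (via seq) gives the
-- constant-space behavior an imperative loop would have.
sumStrict :: Int -> Int -> Int
sumStrict acc 0 = acc
sumStrict acc n = let acc' = acc + n
                  in acc' `seq` sumStrict acc' (n - 1)
```

Both compute the same result; they differ only in when the additions happen. (GHC's strictness analysis often rescues sumLazy under -O, which is part of why this trap is easy to miss.)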
How do you cut an iron bar in half with a pair of scissors?
Deforestation is a tool, not a universal truth.
You're also pulling a false comparison ... The corecursive function could only be directly equivalent to the linear time, constant memory TCO function in a lazily-evaluated runtime
But I'm using a lazily evaluated runtime. My code is Haskell. And in any case, I used deforestation in the last step to remove the lazily evaluated data structure.
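For the record, here is roughly what falls out once the stream is fused away: no intermediate list, just two accumulators threaded through a tail-recursive loop (a sketch; fibIter is my own name for it):

```haskell
-- The deforested form of the fibs stream: the list is gone, and
-- the two "most recent" elements are carried as accumulators.
fibIter :: Int -> Integer
fibIter n = go n 0 1
  where
    go 0 a _ = a
    go k a b = go (k - 1) b (a + b)
```

This is exactly the iterative algorithm from the book, but now it's been derived rather than pulled out of thin air.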
If I understand you correctly, you're telling me that high-level mathematical formalisms work better in Haskell. Yes, they do.