Deforestation is basically eliminating intermediate data structures, which is similar to what the "int(s.split("-", 1)[1])" versus "atoi(strchr(s, '-') + 1)" slides are about. If you consider strings as just lists of characters, then it's basically a deforestation problem: the goal is to eliminate all the intermediate lists of lists that get constructed. (It's something of a peculiar case, though, because in order to transform into the C code you need to not only observe that indexing an rvalue with [1] and throwing the rest away means the list doesn't have to be constructed at all, but you also need to allow strings to share underlying buffer space; the latter optimization isn't deforestation per se.)
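To make the allocation difference concrete, here's a minimal sketch (the string and values are hypothetical) of what each version actually materializes:

```python
s = "label-42"

# List-building version: split() allocates a two-element list plus
# two new string objects, then we index and throw most of it away.
n1 = int(s.split("-", 1)[1])

# Index-based version: find the delimiter and slice once. In CPython
# the slice still copies (strings don't share buffers), so this
# eliminates the intermediate list but not the copy; true zero-copy,
# like the atoi(strchr(...)) version, needs buffer sharing.
n2 = int(s[s.index("-") + 1:])

assert n1 == n2 == 42
```

The second version is roughly what a deforesting compiler would produce automatically; the buffer-sharing step is the part that needs a separate optimization.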
I don't know how much effort has gone into deforestation optimizations for dynamic languages, but perhaps this is an area that compiler writers and researchers should be focusing on more.
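Dynamic languages do let you deforest by hand, at least. A sketch in Python: generator expressions fuse a pipeline into one pass, where list comprehensions would materialize every intermediate stage.

```python
xs = range(1, 1001)

# List version: builds two full intermediate lists before summing.
total_lists = sum([y * y for y in [x + 1 for x in xs]])

# Generator version: same computation, streamed element by element,
# with no intermediate list ever allocated.
total_gen = sum(y * y for y in (x + 1 for x in xs))

assert total_lists == total_gen
```

The point is that the programmer performs the fusion manually; the open question is whether a compiler for a dynamic language could do it reliably on its own.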
On another minor note, I do think that the deck is a little too quick to dismiss garbage collection as an irrelevant problem. For most server apps I'm totally willing to believe that GC doesn't matter, but for interactive apps on the client (think touch-sensitive mobile apps and games) where you have to render each frame in under 16 ms, unpredictable latency starts to matter a lot.
As for GC, it would be nice to have good real time GCs in runtimes.
After decades of GC research, I think the conclusion is, "Yeah, that would be nice." The current state of the art gives us some very nice GCs, but each one sacrifices either throughput or predictability. One of my favorite stories about GC is here:
Deforestation is /more useful/ in strict languages, because allocation of temporary structures costs more. So fusion on strict arrays is better than on lazy streams.
You just can't do it unless you can freely reorder multiple loops, and to do that you need a proof there are no side effects. Haskell just makes that trivial.
You can also do it in stream-based or data-flow-based languages. Or in pretty much any DSL you decide to implement, if the semantics of the language itself is reasonable.
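As a toy illustration of the DSL point: if your combinators are defined over streams rather than materialized collections, the whole pipeline runs as one fused loop for free. A minimal sketch (the names smap/sfilter/ssum are made up for this example):

```python
# Each combinator consumes and yields lazily, so composing them
# never builds an intermediate list -- fusion falls out of the
# semantics of the DSL rather than a compiler optimization.
def smap(f, stream):
    for x in stream:
        yield f(x)

def sfilter(p, stream):
    for x in stream:
        if p(x):
            yield x

def ssum(stream):
    total = 0
    for x in stream:
        total += x
    return total

# map/filter/sum over 0..9, executed as a single pass.
result = ssum(smap(lambda x: x * x, sfilter(lambda x: x % 2 == 0, range(10))))
assert result == 120  # 0 + 4 + 16 + 36 + 64
```

This works because the combinators have no side effects to reorder, which is the same precondition the Haskell case relies on.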
When incompetent programmers go away perhaps. C in and of itself is not the issue. And neither is scaring people away from it with horror stories.
I've run into young engineers recently who think that pretty much any 'C style' system call is necessarily dangerous, because OMG the developer has to remember to pass in the length of the buffer as well as the buffer itself. No, you just have to not be a frickin' idiot.
We need languages that are not designed to allow security exploits by accident, like Ada, Modula-2 or any other in the same school of thought.