Here is a picture of the author of that paper working on the CADR machine (the successor to the CONS machine): http://www.computerhistory.org/chess/stl-431614f64ea3e/
This tree can be encoded directly as a binary string, which serves as the DNA of the genetic algorithm.
So evolution is easy to simulate by randomly mutating nodes in the tree. Other operations like crossover (exchanging branches) and sexual reproduction are also straightforward.
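As a minimal sketch of the idea (all names here are illustrative, not from any particular library): represent programs as nested tuples, mutate by replacing random nodes with fresh random subtrees, and cross over by grafting a subtree from one parent into the other.

```python
import random

OPS = ['+', '*']  # toy operator set for the example

def random_tree(depth):
    """Grow a random expression tree: (op, left, right) nodes, int leaves."""
    if depth == 0:
        return random.randint(0, 9)
    return (random.choice(OPS),
            random_tree(depth - 1),
            random_tree(depth - 1))

def mutate(tree, p=0.2):
    """With probability p, replace a node with a fresh random subtree."""
    if random.random() < p:
        return random_tree(1)
    if isinstance(tree, tuple):
        op, left, right = tree
        return (op, mutate(left, p), mutate(right, p))
    return tree

def random_subtree(tree):
    """Pick a random subtree (possibly the whole tree)."""
    if isinstance(tree, tuple) and random.random() < 0.5:
        return random_subtree(random.choice(tree[1:]))
    return tree

def crossover(a, b):
    """Exchange branches: graft a random subtree of b somewhere into a."""
    if isinstance(a, tuple) and random.random() < 0.5:
        op, left, right = a
        if random.random() < 0.5:
            return (op, crossover(left, b), right)
        return (op, left, crossover(right, b))
    return random_subtree(b)

def evaluate(tree):
    """Interpret the tree; a fitness function would compare this to a target."""
    if not isinstance(tree, tuple):
        return tree
    op, left, right = tree
    lv, rv = evaluate(left), evaluate(right)
    return lv + rv if op == '+' else lv * rv
```

A real run would score each tree with a fitness function, keep the best, and repeat mutation/crossover over many generations; the point is just that every genetic operator is a few lines of tree surgery.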
Personally, I think that genetic algorithms are more approachable than neural nets and would like to see them emphasized more in machine learning (especially with modern video cards). When you think about it, both approaches search for local maxima/minima in an effectively infinite search space. They are basically solving a giant matrix to find the function that maps inputs to outputs. Sure, genetic algorithms generally take more computing power, but that becomes less of a problem with each passing year. Given enough time, evolution always wins!
Google "microcode" and you'll get a start. Hennessy and Patterson will carry you further.
It wasn't just Lisp or Smalltalk machines -- by the late 80s and early 90s, anything that wasn't an x86 machine was pretty much wiped out, apart from the small percentage of the market that Apple held. Things that had been quite superior, like the Amiga's custom graphics chipset, the Motorola 68k CPU lineup, and the DEC Alpha processor, couldn't compete on cost at all -- let alone bespoke, effectively custom, architectures with exotic microcoded support for Lisp or Smalltalk, etc.
Basically, Lisp machines were expensive workstation computers, and PCs killed all the workstation manufacturers. Sun SPARCstations and Silicon Graphics machines bit the dust as well. Nobody could justify the cost anymore.
In most cases, yes, but the language also found its way into some very unexpected places. For example, a subset of Common Lisp, "L", was at the core of Prof. Rod Brooks' Subsumption Architecture, which enabled the amazing mobile robots built in his lab. He ran Lisp on 16 MHz 68332 embedded processors. https://www.researchgate.net/publication/2949173_L_--_A_Comm...
I do wish someone would revisit these machines in the modern era, but it looks like the more conventional RISC-V will be the Linux of chips. It seems like you could build a Connection Machine on a chip these days.