This article does a good job of laying the foundation for why I think homoiconic languages are so important, and why doing AI in languages that aren't is doomed to stagnation in the long term.
The acrobatics that Wolfram can do with the code and his analysis are awesome, and doing the same without homoiconicity and metaprogramming makes my poor brain shudder.
Do note, Wolfram Language is homoiconic, and I think I remember reading that it supports fexprs. It has some really neat properties, and it's a real shame that it's not open source and more widely used.
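As a rough illustration of what I mean (my own toy example, not anything from the article): every Wolfram Language expression is just a tree you can inspect, and a Hold attribute lets a function receive its arguments as unevaluated code, which is the fexpr-ish behavior I'm thinking of:

    (* every expression is data: a tree of heads and arguments *)
    FullForm[a + b^2]          (* Plus[a, Power[b, 2]] *)

    (* HoldAll hands the function its argument as unevaluated code *)
    SetAttributes[myTrace, HoldAll];
    myTrace[expr_] := (Print[HoldForm[expr]]; expr)

    myTrace[1 + 2*3]           (* prints 1 + 2*3, then returns 7 *)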
I don't know how to express my thoughts coherently in such a small space and time, but I will try. There isn't "one" example.
----------
Almost all of the code and its display in the article is some form of metaprogramming. Stephen Wolfram is literally brute-forcing/fuzzing all combinations of "code":
- permuting all the different rules/functions in a given scope
- evolutionarily adapting/modifying them
- graphing and analyzing those structures
- producing the HTML for display
I get that "normal machine learning" is also permuting different programs. But it's more special when you are using the same language for the whole stack. There is a canyon that you have to cross without homoiconicity, (granted I don't know exactly how Wolfram generated and analyzed everything here, but I have used his language before, and I see the hallmarks of it).
I can't really copy and paste an example for you, because plain text struggles. Here is an excerpt with some of that fanciness in it:
> And as an example, here are the results of the forward and backward methods for the problem of learning the function f[x] = <graph of the function>, for the “breakthrough” configurations that we showed above:
You might see a "just" a small .png interspersed in plain text. The language and runtime itself has deep support for interacting with graphics like this.
The only other systems that I see that can juggle the same computations/patterns around like this are pure object-oriented systems like Smalltalk/Pharo. You necessarily need first-class functions to come even close to this capability, but as soon as you want to start messing with the rules themselves, you need some sort of term rewriting, Lisp macros, or fexprs (or something similar?).
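A toy example of what "messing with the rules themselves" can look like when the rules are ordinary data (again, my own sketch, not from the article):

    (* hold a definition as unevaluated code, rewrite it structurally, then install it *)
    prog = Hold[f[x_] := x^2 + 3 x];

    (* mutate the rule itself: bump the exponent and change the coefficient *)
    mutated = prog /. {Power[v_, 2] :> Power[v, 3], 3 v_ :> 2 v};

    ReleaseHold[mutated];            (* installs the modified definition *)
    f[4]                             (* 4^3 + 2*4 = 72 *)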
Don't get me wrong, you can do it all "by hand" (with compiler or interpreter help): generate the strings or opcodes for a processor, or use reflection libraries, generate the graphs, and use some HTML-generator library to stitch it all together. But in the case of this article, you can clearly see that he has direct command over the contents of these computations in his Wolfram Language compared to other systems, because it's injected right into his prose. The outcome here can look like JupyterLab or other notebooks, but in homoiconic languages there is a lot more "first-class citizenry" than you get with notebooks. The notebook format is just something that can "pop out" of certain workflows.
If you try to do this with C++ templates, Python attribute hacking, Java bytecode magic... like... you can, but it's too hard and confusing, so most people don't do it. People just end up creating specific DSLs or libraries for different forms of media/computation, with templating smeared on top. Export to a renderer and call it a day -> remember to have fun designing a tight feedback loop here. /s
Nothing is composable, and it makes for very brittle systems as soon as you want to inject some part of a computation into another area of the system. It's way, way overspecified.
Taking the importance of homoiconicity further: when I read this article I just start extrapolating, moving past XOR or "rule 12" and applying these techniques to symbolic logic, like the Tsetlin machine referenced in another part of this thread: https://en.wikipedia.org/wiki/Tsetlin_machine
It seems to me that training AI on these kinds of systems will give it far more capability in producing useful code that is compatible with our systems, because, for starters, it has to dedicate fewer neuronal connections to parsing a grammar that is fundamentally broken and ad hoc. But I think there are far deeper reasons than just this.
----------
I think it's so hard to express this idea because it's like trying to explain why having arms and legs is better than not. The advantage applies to every part of the process of getting from point A to point B.
Also, as an addendum, I'm not 100% sure homoiconicity is "required" per se. I suppose any structured and reversible form of "upleveling" or "downleveling" logic that remains accessible from all layers of the system would work. Even good ol' Lisp macros have hygiene problems that can be solved, e.g. by Racket's syntax-parse.
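In Wolfram Language terms, Inactivate/Activate is one concrete form of that up/down-leveling (just my own illustration of the idea):

    (* "uplevel" live code into inert data, rewrite it, then "downlevel" it back *)
    expr = Inactivate[Integrate[x^2, x], Integrate];

    (* the inert form is ordinary data, so ordinary rules apply to it *)
    tweaked = expr /. Inactive[Integrate][f_, v_] :> Inactive[Integrate][f + v, v];

    Activate[tweaked]    (* now evaluates Integrate[x^2 + x, x] *)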