Once a year or so I find myself on those forums, and I'm always astounded by how many people dedicate massive amounts of time and brainpower to this.
I think it scratches the same itch that languages like Brainfuck do.
There's something exceedingly interesting about how you can model complexity with something extremely simple. Brainfuck is fun because it forces you to think extremely low level, because ultimately it is basically just a raw implementation of a Turing machine. I wouldn't want to write a big program in it, but it is fun to think about how you might express a complicated algorithm with it.
Similarly with CGOL: it's fascinating to see how far you can stretch a handful of very simple rules into genuinely complex behavior.
I've written CGOL dozens of times; it's my usual project for "breaking in" a language I've just learned, since it's not completely trivial but it's simple enough not to be frustrating. I completely understand why the math/computability-theory folks find it worth dedicating brainpower to.
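For anyone curious, the whole thing is basically one step function. A rough Python sketch (the set-of-live-cells representation is just one choice among several):

    from collections import Counter

    def life_step(live):
        """One Game of Life generation over a set of live (x, y) cells."""
        # Count live neighbors for every cell adjacent to a live cell.
        counts = Counter(
            (x + dx, y + dy)
            for (x, y) in live
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        # A cell is alive next generation with exactly 3 neighbors,
        # or with 2 neighbors if it was already alive.
        return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

    # A glider; after 4 generations it has shifted one cell diagonally.
    cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    for _ in range(4):
        cells = life_step(cells)
    print(sorted(cells))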
I have a weird love for Brainfuck. It's a tiny, incredibly simple language that you can write an interpreter for in an hour, and it's effectively a super-simple bytecode that can easily be extended and used as a compilation target for simple languages.
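To give a sense of how small "an interpreter in an hour" really is, here's a rough Python sketch. The 30,000-cell tape, 8-bit wrapping cells, and EOF-reads-as-zero are just conventions I picked; real implementations vary:

    import sys

    def brainfuck(code, tape_len=30000):
        tape = [0] * tape_len
        ptr = pc = 0
        # Pre-match brackets so loop jumps are O(1).
        jump, stack = {}, []
        for i, c in enumerate(code):
            if c == '[':
                stack.append(i)
            elif c == ']':
                j = stack.pop()
                jump[i], jump[j] = j, i
        while pc < len(code):
            c = code[pc]
            if c == '>':
                ptr += 1
            elif c == '<':
                ptr -= 1
            elif c == '+':
                tape[ptr] = (tape[ptr] + 1) % 256
            elif c == '-':
                tape[ptr] = (tape[ptr] - 1) % 256
            elif c == '.':
                sys.stdout.write(chr(tape[ptr]))
            elif c == ',':
                ch = sys.stdin.read(1)
                tape[ptr] = ord(ch) if ch else 0
            elif c == '[' and tape[ptr] == 0:
                pc = jump[pc]  # skip the loop body
            elif c == ']' and tape[ptr] != 0:
                pc = jump[pc]  # jump back to the matching '['
            pc += 1

    # A classic "Hello World!" program.
    brainfuck("++++++++++[>+++++++>++++++++++>+++>+<<<<-]"
              ">++.>+.+++++++..+++.>++.<<+++++++++++++++.>.+++.------.--------.>+.>.")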
Honestly, as an educational tool, the only thing wrong with it is the name!
I have no idea how I'd be able to pitch this to a university (or even who I could pitch it to), but I would absolutely love to teach a computability course using Brainfuck as the language, just to really show students how low-level logic can be.
I would probably need to find a similar language with a different name though.
Assembly is higher-level than Brainfuck, especially on modern chips. You have built-in instructions for arithmetic and conditionals/branches, and you can allocate memory and point to it.
You don't really get any of that with Brainfuck. You have a conceptual tape of cells you can increment, decrement, and step along, and that's basically it.
If you find that fascinating, then you'll be blown away by something called the Wolfram Physics Project. It's basically trying to recreate all of physics from baseline "graph update" rules, much like the Game of Life. So far no testable predictions, but very interesting.
Wolfram is kind of obsessed with cellular automata; he even went and wrote a whole book about them titled "A New Kind of Science". The reception was a bit mixed. Some CA (Rule 110, the Game of Life) are Turing-complete, so yeah, you can compute anything with them; I'm just not sure that in itself leads to any greater Revealed Truths. Does make for some fun visualizations though.
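If you've never played with them, the elementary automata from the book are genuinely fun to render. A quick Python sketch of Rule 110 (the elementary rule later proven Turing-complete), printed as text; the row width and single-cell start are arbitrary choices:

    def eca_step(cells, rule=110):
        # One generation of an elementary cellular automaton: each cell's
        # new value is the bit of `rule` selected by its (left, self, right)
        # neighborhood, read as a 3-bit number. Edges wrap around.
        n = len(cells)
        return [
            (rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    # Start from a single live cell and print a small triangle of history.
    row = [0] * 79
    row[-1] = 1
    for _ in range(30):
        print(''.join('#' if c else ' ' for c in row))
        row = eca_step(row)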
A New Kind of Science is one of my favorite books. I read the entire thing on an iPod Touch during a dreadful vacation when I was 19 or 20.
It goes well beyond just cellular automata; the thousand or so pages all seem to drive home the same few points:
- "I, Stephen Wolfram, am an unprecedented genius" (not my favorite part of the book)
- Simple rules lead to complexity when iterated upon
- The field of computation is as big and important an invention as the field of mathematics
The last one is less explicit, but it's what I took away from it. Computation is of course part of mathematics, but it is a kind of "live" mathematics. Executable mathematics.
Super cool book and absolutely worth reading if you're into this kind of thing.
I would give the same review, without seeing any of it as a positive. NKS was bloviating, grandiose, repetitive, and shallow. The fact that Wolfram himself didn't show that CA were Turing-complete, when most theoretical computer scientists would say "it's obvious, and not that interesting", kinda disproves his whole point about being an underappreciated genius. Shrug.
The question ultimately comes down to whether the universe is quantized at all levels or whether it is analog. If it is quantized, I demand my 5 minutes with god, because I would see that as proof that all of this is a simulation. My lack of belief in such a being makes me hope that it is analog.
Computation does not need to be quantized and discrete; there are fully continuous models of computation, like ODEs or continuous cellular automata.
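As a toy example of what "continuous cellular automaton" can mean, here's a sketch where cells hold floats in [0, 1] and the update is a smooth map of the neighborhood average, loosely in the spirit of coupled map lattices; the specific rule is made up purely for illustration:

    def smooth_step(cells, r=3.9):
        # Continuous-valued CA: each cell is a float in [0, 1), replaced by
        # a logistic map of its neighborhood average. Nothing discrete in
        # the model itself; only the float representation underneath is.
        n = len(cells)
        out = []
        for i in range(n):
            avg = (cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n]) / 3.0
            out.append(r * avg * (1.0 - avg))
        return out

    row = [0.0] * 60
    row[30] = 0.5
    for _ in range(20):
        # Crude text rendering: denser characters for higher values.
        print(''.join(' .:*#'[int(c * 4.999)] for c in row))
        row = smooth_step(row)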
That's true, but we already know that a lot of things about the universe are quantized; the question is whether that holds for everything. And all "fully continuous models of computation" in the end rely on a representation that is a quantized approximation of an ideal. In other words: any practical implementation of such a model that can be used for reliable computation, rather than ending up as a noise generator or an oscillator, is (as far as I know) based on some quantized model. And then there are still the cells themselves (arguably quanta) and their locations (usually on a grid, though you could use a continuous representation for that as well).

Now, 23 or 52 bits of mantissa (depending on the size of the float representation you use for the "continuous" values) is a lot, but it is not actually continuous (quick check below). Continuity is an analog concept, and you can't implement it with high enough fidelity on a digital computer.
You could do it on an analog computer but then you'd be into the noise very quickly.
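To make that concrete, the gap between adjacent representable "continuous" values is easy to measure; a quick Python check of the 52- and 23-bit mantissas mentioned above (numpy is only used here to get at the 32-bit case):

    import math
    import numpy as np

    # Distance from 1.0 to the next representable value:
    print(math.ulp(1.0))                 # 2**-52 for a 64-bit double
    print(np.spacing(np.float32(1.0)))   # 2**-23 for a 32-bit float

    # Between two adjacent doubles there is literally nothing:
    print(1.0 + math.ulp(1.0) / 2 == 1.0)  # True: rounds straight back to 1.0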
In theory you can, but in practice this is super hard to do.