Wolfram is kind of obsessed with cellular automata; he even went and wrote a whole book about them titled "A New Kind of Science". The reception to it was a bit mixed. Some CA (Rule 110, famously) are Turing-complete, so yeah, you can compute anything with them, but I'm not sure that in itself leads to any greater Revealed Truths. It does make for some fun visualizations though.
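For the curious, here's a minimal sketch of an elementary CA in Python (the grid width, step count, and wraparound edges are arbitrary choices on my part). Rule 110 is the one that's actually been proven Turing-complete; swap in rule=30 for the famous "randomness from simple rules" picture:

```python
def ca_step(row, rule=110):
    """One step of an elementary cellular automaton. `rule` is the
    Wolfram rule number (0-255); edges wrap around."""
    n = len(row)
    def new_cell(i):
        # The 3-cell neighborhood selects one bit of the rule number.
        idx = 4 * row[(i - 1) % n] + 2 * row[i] + row[(i + 1) % n]
        return (rule >> idx) & 1
    return [new_cell(i) for i in range(n)]

row = [0] * 63 + [1]  # start from a single live cell
for _ in range(30):
    print(''.join('#' if c else '.' for c in row))
    row = ca_step(row)  # try rule=30 to watch the complexity emerge
```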
A New Kind of Science is one of my favorite books. I read the whole thing on an iPod Touch during a dreadful vacation when I was 19 or 20.
It goes well beyond just cellular automata; the thousand or so pages all seem to drive home the same few points:
- "I, Stephen Wolfram, am an unprecedented genius" (not my favorite part of the book)
- Simple rules lead to complexity when iterated upon
- The invention of the field of computation is as big and important as the invention of the field of mathematics
The last one is less explicit, but it's what I took away from the book. Computation is of course part of mathematics, but it is a kind of "live" mathematics. Executable mathematics.
Super cool book and absolutely worth reading if you're into this kind of thing.
I would give the same review, without seeing any of this as a positive. NKS was bloviating, grandiose, repetitive, and shallow. The fact that Wolfram himself didn't show that CA were Turing-complete (the Rule 110 universality proof was Matthew Cook's work), when most theoretical computer scientists would say "it's obvious, and not that interesting", kinda disproves his whole point about being an underappreciated genius. Shrug.
The question ultimately comes down to whether the universe is quantized at all levels or whether it is analog. If it is quantized, I demand my 5 minutes with god, because I would see that as proof of all of this being a simulation. My lack of belief in such a being makes me hope that it is analog.
Computation does not have to be quantized and discrete; there are fully continuous models of computation, like ODEs or continuous cellular automata.
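For a sense of what that looks like, here's a minimal sketch of one continuous CA, assuming the frac(neighborhood average + constant) style of update that NKS uses for its continuous-CA examples; the constant, grid size, and wraparound boundary are my choices:

```python
import numpy as np

def step(cells, c=0.25):
    """One update of a continuous CA: each cell takes the fractional
    part of its 3-cell neighborhood average plus a constant c.
    (The rule shape and c=0.25 are illustrative choices.)"""
    avg = (np.roll(cells, 1) + cells + np.roll(cells, -1)) / 3.0
    return (avg + c) % 1.0  # states stay real-valued in [0, 1)

rng = np.random.default_rng(0)
cells = rng.random(80)  # real-valued cells, not bits
for _ in range(40):
    cells = step(cells)
print(np.round(cells[:8], 3))
```

(Simulating it in floating point quantizes the state, of course, but the model itself is defined over the reals.)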
That's true, but we already know that a bunch of stuff about the universe is quantized. The question is whether that holds for everything. And all 'fully continuous models of computation' in the end rely on a representation that is a quantized approximation of an ideal. In other words: any practical implementation of such a model that does not end up being a noise generator or an oscillator, and that can be used for reliable computation, is - as far as I know - based on some quantized model. And then there are still the cells themselves (arguably quanta) and their location (usually on a grid, though you could use a continuous representation for that as well).

Now, 23 or 52 mantissa bits (depending on the size of the float representation you use for the 'continuous' values) is a lot, but it is not actually continuous. Continuity is an analog concept, and you can't implement it with high enough fidelity on a digital computer.
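A quick illustration of that quantization with Python's math module (the probe values here are arbitrary):

```python
import math

# IEEE-754 doubles carry 52 explicit mantissa bits, so the
# representable "continuous" values form a discrete grid.
print(math.ulp(1.0))     # ~2.22e-16: gap to the next float above 1.0
print(math.ulp(1e16))    # 2.0: near 1e16, whole numbers fall in the gaps
print(0.1 + 0.2 == 0.3)  # False: none of these decimals is exactly representable
```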
You could do it on an analog computer, but then you'd run into the noise floor very quickly.
In theory you can, but in practice this is super hard to do.