Adding on to this with an analogy from simulating dynamics in cellular automata: most of the interesting physics-like models we see exhibit high redundancy in both spatial and temporal locality.
Conway's Game of Life is a simple rule that isn't very physically realistic, but it's instructive because it has been analyzed so intimately while still exhibiting relatively high complexity, and it shows remarkable compressibility under memoization techniques such as HashLife [0].
Even more striking is the potential for superspeed caching, where different nodes of the pattern are evolved at different rates, often allowing _exponential_ speedups: patterns can be advanced for more generations than the speculated lifetime of the real universe.
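To make the memoization concrete, here's a minimal Python sketch of the base case (the helper `step_center` is my own naming, not from any HashLife implementation): the one-generation fate of a 4x4 block's 2x2 center is fully determined by the block's contents, so identical blocks anywhere on the grid, at any time, share a single cached computation.

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def step_center(block):
        """Advance a 4x4 Life block (tuple of 16 cells, row-major) one
        generation and return its 2x2 center, which depends only on the
        block itself."""
        g = [block[4 * i:4 * i + 4] for i in range(4)]
        out = []
        for r in (1, 2):
            for c in (1, 2):
                n = sum(g[r + dr][c + dc]
                        for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                        if (dr, dc) != (0, 0))
                out.append(1 if n == 3 or (g[r][c] == 1 and n == 2) else 0)
        return tuple(out)

    # A 2x2 block of live cells is a still life, so its center survives:
    assert step_center((0,0,0,0, 0,1,1,0, 0,1,1,0, 0,0,0,0)) == (1, 1, 1, 1)

The real algorithm applies this idea recursively to hash-consed quadtree nodes of every size: a node of width 2^k yields its center advanced 2^(k-2) generations in one cached call (the 4x4 base case above jumps exactly 1), which is where the exponential growth in reachable generations comes from.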
No free lunch: HashLife takes more memory and only works well in low-entropy environments.
But consider: if you want to run a simulation 100 times on the same data, you can speed that up by just copying the output of the first run 100 times. That's not simulating the same mind 100 times, though; it's simulating the mind only once. HashLife and similar approaches don't increase your ability to compute unique mind states.
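A tiny sketch of that distinction, with a hypothetical `simulate` standing in for an expensive deterministic run: the cache hands back 100 outputs, but the state is only ever computed once.

    from functools import lru_cache

    calls = 0

    @lru_cache(maxsize=None)
    def simulate(initial_state):
        # Stand-in for an expensive, deterministic simulation.
        global calls
        calls += 1                     # count real computations
        return hash(initial_state)     # placeholder for the final state

    results = [simulate("same initial data") for _ in range(100)]
    assert len(results) == 100 and calls == 1  # 100 copies, 1 computation

Identical inputs mean the cache only saves you the cost of redundant copies; it never produces a second distinct trajectory.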
[0] https://en.wikipedia.org/wiki/Hashlife