
The First Law of Complexodynamics (2011) - albertzeyer
http://www.scottaaronson.com/blog/?p=762
======
albertzeyer
I have often wondered whether entropy is really the measure you want, and
whether something like what he describes as complexity is what you are
actually after. It's the first time I have heard of this concept, and that
it is formally defined as sophistication or logical depth. The only
downside is that it is not computable.

This led me to the question of whether there is a simpler variant of
complexity (with similar properties) that is computable:
https://math.stackexchange.com/questions/2387690/simple-computable-approximate-complexity-sophistication-for-a-string
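
For what it's worth, one crude but computable stand-in is compression
length (the post itself floats compression-based approximations of its
measure). A minimal sketch in Python; note that this tracks entropy
rather than sophistication, so it does not by itself answer the
question above:

    import os
    import zlib

    def compression_ratio(data: bytes) -> float:
        """Compressed length / original length: a crude, computable
        proxy for Kolmogorov complexity (really, for entropy)."""
        if not data:
            return 0.0
        return len(zlib.compress(data, 9)) / len(data)

    # Ordered data compresses well; random data does not. The
    # limitation is visible here: a purely random string scores
    # highest even though it is not "interesting", which is exactly
    # the gap that sophistication / logical depth are meant to close.
    print(compression_ratio(b"A" * 1000))       # close to 0
    print(compression_ratio(os.urandom(1000)))  # close to 1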

~~~
jpfed
The approach I've long thought would bear fruit turns out to be a dumbed-down
version of Aaronson's.

Imagine a "meta-sequence" whose elements are features of the original sequence
in question, then take the entropy of the meta-sequence. Different features
yield different kinds of "interestingness".

For example, you could form your meta-sequence with a histogram of the
original sequence.

Original: A B A A B B A B

Meta: 4 4 (4 As, 4 Bs)

Entropy of the meta-sequence is low.
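
A minimal sketch of that example (the helper function and the choice of
Shannon entropy over the empirical distribution of the meta-sequence's
elements are my own assumptions, not jpfed's):

    from collections import Counter
    from math import log2

    def empirical_entropy(seq):
        """Shannon entropy (bits) of the empirical distribution of
        the elements of seq."""
        counts = Counter(seq)
        n = len(seq)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    original = "ABAABBAB"
    meta = list(Counter(original).values())  # histogram meta-sequence: [4, 4]
    print(empirical_entropy(meta))      # 0.0 -- all meta-elements equal: low
    print(empirical_entropy(original))  # 1.0 -- the raw sequence, for contrast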

But instead of just looking at the values of the PMF or PDF, you might want to
look at things like transition probabilities between different elements in the
original sequence to capture the fact that

A A A A B B B B and A B A B A B A B

have more boring transition probabilities than

A A B A B A B B

And so on; one can imagine having an ever-more-sophisticated array of features
to look for when forming the meta-sequence. But Kolmogorov complexity seems to
cut more directly to the heart of the matter.

If I'm specifically looking for something computationally feasible, maybe I
would do a weighted sum of the entropies of k-order transition matrices, from
k=1 to some limit.
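
A rough sketch of that idea (the geometric decay weights and the cap on
k are arbitrary choices of mine), run on the example strings from this
comment:

    from collections import Counter
    from math import log2

    def korder_transition_entropy(seq, k):
        """Empirical conditional entropy H(next symbol | previous k
        symbols), in bits, from overlapping windows of seq."""
        context_counts = Counter()
        pair_counts = Counter()
        for i in range(len(seq) - k):
            ctx, nxt = seq[i:i + k], seq[i + k]
            context_counts[ctx] += 1
            pair_counts[(ctx, nxt)] += 1
        n = sum(pair_counts.values())
        h = 0.0
        for (ctx, _), c in pair_counts.items():
            h -= (c / n) * log2(c / context_counts[ctx])
        return h

    def transition_score(seq, max_k=3, decay=0.5):
        """Weighted sum of k-order transition entropies, k=1..max_k."""
        return sum(decay ** (k - 1) * korder_transition_entropy(seq, k)
                   for k in range(1, max_k + 1))

    # ABABABAB is deterministic at every order and scores 0;
    # AABABABB scores highest of the three, matching the intuition.
    for s in ["AAAABBBB", "ABABABAB", "AABABABB"]:
        print(s, round(transition_score(s), 3))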

------
mannykannot
I think it looks a lot less paradoxical if you ask "why do interesting things
(seem to) have 'intermediate' levels of entropy?"

It is not exactly the same issue: the original formulation suggests that
everything with 'intermediate' entropy is interesting, which appears to
be a tougher case to make. Still, I imagine there is some definition of
'intermediate' and 'interesting' from which such an argument could be made.

------
briantakita
> Entropy increases monotonically from left to right, but intuitively, the
> “complexity” seems highest in the middle picture: the one with all the
> tendrils of milk.

Complexity only increases from the perspective of the observer. If you shift
the frame of reference to individual particles, there is more local
complexity throughout the system when the cream is completely mixed into the
coffee (the third stage), even if the overall system has fewer interesting
aspects from the perspective of an outside observer.

This suggests that complexity is not the term we are looking for to describe
the law/phenomenon, since complexity depends on the frame of reference.

> But today, in between, the universe contains interesting structures such as
> galaxies and brains and hot-dog-shaped novelty vehicles.

I find sophistication, "interestingness", occurrence of novelty, or distinct
structures to be a more accurate way to describe the three glasses of coffee
& cream. Novelty is also a factor in intelligence, as is coherence.

------
schiffern
I thought that sounded familiar. Geoffrey West gave a talk there, delivering
a much better explanation of his research on socio-biological scaling,
growth, and sustainability than the pop-sci treatments occasionally posted
here.

https://www.youtube.com/watch?v=DFFVSvAr7Wc

