
Entropy can be used to understand systems - acgan
https://acgan.sh/posts/2019-01-01-maximum-entropy.html
======
westurner
Maximum entropy:
[https://en.wikipedia.org/wiki/Maximum_entropy](https://en.wikipedia.org/wiki/Maximum_entropy)

Here's a quote from a tweet of mine about a comment on a schema:BlogPost:
[https://twitter.com/westurner/status/1048125281146421249](https://twitter.com/westurner/status/1048125281146421249):

> _“When Bayes, Ockham, and Shannon come together to define machine learning” [https://towardsdatascience.com/when-bayes-ockham-and-shannon...](https://towardsdatascience.com/when-bayes-ockham-and-shannon-come-together-to-define-machine-learning-96422729a1ad)_

> _Comment: "How does this relate to the Principle of Maximum Entropy? How
> does Minimum Description Length relate to Kolmogorov Complexity?"_

~~~
acgan
Thanks for sharing! Even though Shannon's "A Mathematical Theory of
Communication" is so accessible, I find that most in our field (stats/ML)
don't often think through information-theoretic tools in a first-principles
way.

Yes, KL divergences show up everywhere, but they're not derived from scratch
often enough. Maybe I'm just stifled by my campus bubble, though :)
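For what it's worth, here is a minimal sketch of the "from scratch" view: computing a KL divergence directly from its definition and checking it against the cross-entropy decomposition. The two discrete distributions `p` and `q` are purely illustrative values, not anything from the linked post.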
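```python
import numpy as np

# Two discrete distributions over the same support (illustrative values only).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# KL divergence from the definition: D_KL(p || q) = sum_i p_i * log(p_i / q_i).
# Interpreted through Shannon's lens, it is the expected extra code length
# (in nats, since we use natural logs) paid for encoding samples from p
# with a code that was optimized for q.
kl_pq = np.sum(p * np.log(p / q))

# Cross-check against the decomposition D_KL(p || q) = H(p, q) - H(p),
# where H(p) = -sum_i p_i log p_i and H(p, q) = -sum_i p_i log q_i.
entropy_p = -np.sum(p * np.log(p))
cross_entropy_pq = -np.sum(p * np.log(q))

assert np.isclose(kl_pq, cross_entropy_pq - entropy_p)
print(kl_pq)  # > 0, and it would be 0 iff p == q
```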

