
Multivariate Dependence Beyond Shannon Information [pdf] - cs702
http://csc.ucdavis.edu/~cmg/papers/mdbsi.pdf
======
cs702
OP here. If you haven't seen this paper before, take a look at Tables I and
II, which show two simple discrete probability distributions over three
variables, each of which consists of two bits (i.e., each variable can take
four possible values: 00, 01, 10, and 11). In each distribution, the three
variables depend on one another.

The distributions are different, but if you take a look at figures 1, 2, and
3, you will see that their Shannon entropy diagrams are identical, and that a
directed acyclic graph (i.e., a Bayesian network) _cannot_ represent either
distribution.

Let me repeat that: today (1) no standard Shannon-style information measure
can distinguish the dependency structure of these two toy distributions, and
(2) no Bayesian network can represent their dependencies. In short, the
standard tools of information theory cannot cope with these toy
distributions.

Wow.

