
I think there was some inevitable loss of nuance as that got translated into layman's terms.

For all practical purposes, you can write causality like

p(x|y) = 1 and time(y) < time(x)

I.e. causality is just when one event always happens after another event. Any additional requirements for causality are basically philosophy.

But typical ML systems don't construct networks of causal relations; that's basically what he's getting at, from my reading of it.




> For all practical purposes, you can write causality like

> p(x|y) = 1 and time(y) < time(x)

This isn't true at all. For a counterexample, x and y may both have a common cause.

Pearl's work addresses exactly this: it extends the language to p(y | do(x)), i.e. what happens to y when you take some hypothetical intervention that sets x. Causation framed in terms of intervention asks "what if it had been this instead of that?" and is probably the most common model of causation.

For more info, look up the Rubin causal model, the potential outcomes framework, and Pearl's "do" notation.
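If it helps, here's a quick Python sketch of the common-cause point (variable names and probabilities are mine, purely illustrative): z drives both x and y, so observing x looks like a perfect dependence on y, but forcing x via do(x) leaves y untouched.

```python
# Toy simulation: z is a common cause of x and y. Observing x "predicts"
# y perfectly, yet intervening on x doesn't move y at all.
import random

random.seed(0)

def sample(do_x=None):
    z = random.random() < 0.5           # common cause
    x = z if do_x is None else do_x     # x copies z unless we intervene
    y = z                               # y also copies z
    return x, y

N = 100_000

# Observational: p(y=1 | x=1) looks like a perfect dependence.
obs = [sample() for _ in range(N)]
p_y_given_x = sum(y for x, y in obs if x) / sum(x for x, _ in obs)

# Interventional: p(y=1 | do(x=1)) -- forcing x leaves y at its base rate.
do1 = [sample(do_x=True) for _ in range(N)]
p_y_do_x = sum(y for _, y in do1) / N

print(p_y_given_x)          # 1.0
print(round(p_y_do_x, 1))   # ~0.5
```

So p(y|x) and p(y|do(x)) come apart exactly when there's a confounder, which is the hole in the "p(x|y) = 1 plus time ordering" definition.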


What if you replace it with

p(x|y) = 1, p(x|!y) = 0, and time(y) < time(x)

That rules out the rooster counter-example. If y is a boolean, I guess the only thing you can "do" to it is negate it.


This is called Granger causality (and work on it led to a Nobel prize, so it's important and useful). It's stronger than just correlation and way easier to determine than true causation, but it's possible that z causes both x and y, and that z's effect on x is just more delayed than its effect on y.

But it at least rules out x causing y, which is something.
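A minimal sketch of that confounded-with-lags situation (made-up binary z, lags chosen by me for illustration): y reaches x's "prediction window" first, so y perfectly predicts x one step ahead without causing it.

```python
# Toy time series: z drives both y (lag 1) and x (lag 2). y then
# "Granger-causes" x -- perfect one-step-ahead prediction -- even
# though there's no causal arrow from y to x.
import random

random.seed(1)
T = 1000
z = [random.random() < 0.5 for _ in range(T)]
y = [z[t - 1] if t >= 1 else False for t in range(T)]  # z delayed by 1
x = [z[t - 2] if t >= 2 else False for t in range(T)]  # z delayed by 2

# y at time t perfectly predicts x at time t+1 ...
matches = sum(x[t + 1] == y[t] for t in range(2, T - 1))
print(matches == T - 3)  # True: perfect "prediction", zero causation
```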


> but it's possible that z causes both x and y, and z's effect on x is just more delayed than its effect on y.

This is in fact the case with the barometer falling before a storm. Both the falling barometer and the subsequent rain and wind of a storm are consequences of an uneven distribution of heat and moisture in the atmosphere approaching equilibrium under the constraints of Earth's gravity and Coriolis force.


Still doesn't work. Suppose I flip a coin and write the result in two places. I write it on sheet y then sheet x. We have that X == Y, so p(x|y) = 1, p(x|!y) = 0, and time(y) < time(x), but neither causes the other. I can write more later if you have interest, but I gotta run.
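A toy version of the two-sheets example (my sketch, numbers are arbitrary): one coin flip, copied first onto sheet y, then onto sheet x, satisfies both conditional-probability conditions exactly.

```python
# One coin flip per trial, recorded on sheet y first, then sheet x.
# p(x|y) = 1 and p(x|!y) = 0 hold exactly, yet neither sheet causes
# the other -- the flip causes both.
import random

random.seed(2)
flips = [random.random() < 0.5 for _ in range(10_000)]
ys = flips[:]   # written first:  time(y) < time(x)
xs = flips[:]   # written second, from the same flip

p_x_given_y = sum(x for x, y in zip(xs, ys) if y) / sum(ys)
p_x_given_not_y = sum(x for x, y in zip(xs, ys) if not y) / (len(ys) - sum(ys))
print(p_x_given_y, p_x_given_not_y)  # 1.0 0.0
```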


p(sun rises|rooster crows)=1

Is the rooster causing the dawn?


I think you'd have a fair argument if you also required the complement, i.e. P(sun rises|no rooster crowing) = 0.


Except if the rooster is dead or you put the rooster in a dark room.


p(sun rises|rooster crows)=1 remains valid when "rooster crows" is false; the conditional only constrains the cases where the rooster does crow.



