
I'm no mathematician or physicist, but I have often wondered about Goodhart's Law[0] during testing of complex distributed systems. If we consider that quantum theory also says a qubit is in a superposition of both states until measured, one might conclude that the result of any data-science experiment becomes outdated as soon as it is performed.

Aren't all such (big-)data models (e.g. predicting natural disasters, stock markets, global warming, the effects of GMOs, ...) an attempt at measuring "real life"? And so they cannot measure accurately, because as soon as you take your snapshot, the data points and events inside the modeled system will have moved on to a new state. Not that we shouldn't try to build these models, but we should accept them as theories or "soft science" rather than facts.

As soon as we have simplified a complex, hard-to-understand system into a chart, we tend to forget that these "facts" represent, at best, a snapshot of reality (one version of it) from a specific point in time, and certainly not the future.
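To make the "snapshot goes stale" point concrete, here's a minimal sketch (my own toy example, not from the comment): a "world" whose underlying signal drifts over time, and a model fitted once to a snapshot at t=0. The drift rate, noise level, and sample counts are all made up for illustration.

```python
import random

random.seed(0)

def world_value(t, drift=0.1):
    """Observe the world at time t: the true signal drifts
    linearly, and each observation carries some noise."""
    return drift * t + random.gauss(0, 0.5)

# "Snapshot" model: just the average of observations taken at t=0.
snapshot = sum(world_value(0) for _ in range(100)) / 100

def snapshot_error(t, samples=100):
    """How far the frozen snapshot model is from the world's
    current average at time t."""
    current = sum(world_value(t) for _ in range(samples)) / samples
    return abs(current - snapshot)

early = snapshot_error(1)   # shortly after the snapshot
late = snapshot_error(50)   # long after; the world has moved on
print(early, late)
```

The frozen model's error grows with time because the chart describes the system as it was, not as it is — the same staleness the comment describes for disaster, market, or climate models.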

[0] https://en.wikipedia.org/wiki/Goodhart%27s_law



