
“The first important principle of science is to not fool yourself, and you are the easiest person to fool” — Richard Feynman

The challenge is that rationality is often a useless approach in complex systems. If I may be a little provocative, even “truth” and “facts” might be ill-defined concepts in complex systems (how will you establish controls? How big a sample must your randomized controlled trials use? What if that’s far bigger than is feasible because of exponential scaling? Then it isn’t possible even in principle!)

A fundamental problem is the human tendency to try to optimize outcomes (using whatever rationality is available): we invariably overfit to temporary/local optima at the cost of the long term. Countering this failure mode requires sacrificing optimality (always defined by a proxy metric that holds only temporarily) for plurality/diversity. This deep principle plausibly has numerous manifestations in biology and culture (e.g. sexual reproduction, enforced pseudo-randomness, etc.).

There is another “meta-rational” idea: to understand things and do science, you first have to survive. Science only gives a reasonable guarantee of eventual correctness; nobody guarantees it is the best guide for living your (finite) life. In Newton’s time, theology and alchemy were considered as promising as (if not more promising than) physics (natural philosophy). George Washington’s doctors recommended blood-letting, the state-of-the-art medical technology of the day. What will humans two hundred years from now say about our current views?

The most important principle is to iteratively take insured risks and keep learning. If you hang around long enough doing that, knowledge will accumulate and compound.

E.g., see:

1. https://slatestarcodex.com/2019/06/04/book-review-the-secret...

2. https://www.edge.org/conversation/nassim_nicholas_taleb-unde...
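The “insured risks that compound” point can be sketched in code. This is my own illustration, not something from the links above: a Kelly-style betting toy model (the function, parameters, and the 60/40 coin are all hypothetical choices) showing that betting everything, which maximizes expected value each round, almost surely ends in ruin, while a smaller “insured” fraction sacrifices per-round optimality and compounds instead.

```python
import random

def simulate(fraction, rounds=1000, p_win=0.6, seed=0):
    """Repeatedly bet `fraction` of current wealth on a 60/40 even-money
    coin flip, starting from wealth 1.0. Returns the final wealth."""
    rng = random.Random(seed)
    wealth = 1.0
    for _ in range(rounds):
        stake = wealth * fraction
        if rng.random() < p_win:
            wealth += stake  # win: stake is returned doubled
        else:
            wealth -= stake  # loss: stake is gone
    return wealth

# All-in maximizes single-round expected value, but one loss is total ruin.
print(simulate(fraction=1.0))
# A fractional stake (0.2 is the Kelly fraction for this coin: 2*p - 1)
# grows much more slowly per round, but survives losses and compounds.
print(simulate(fraction=0.2))
```

The all-in bettor goes to zero the first time the coin comes up tails; the fractional bettor ends far ahead. The qualitative lesson is the point: long-run growth comes from surviving, not from per-step optimality.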
