>> For some reason humans have a weird, irrational quirk where they miscalculate uncertain risks by assuming that the outcomes will fall on one side of the distribution until it becomes overwhelmingly obvious that they won't.

And this quirk may be the end of us, e.g. not reacting to AIs getting smarter and smarter until it's too late to do anything about it.



