The question is: do we want "human-like" intelligence, or "human-level" intelligence? I'd argue that they are two separate things, and that the term "AGI", as widely used, is closer to the latter. That is, we want something that can generalize and learn approximately as well as a human, but not necessarily something that will behave like a human.

Of course, if your definition of AGI involves the ability to mimic a human, or perhaps to display empathy for a human, then yeah, you probably do need the ability to experience lust, fear, suspicion, etc. And IMO, in order to do that, the AI would need to be embodied in much the same way a human is, since so much of our learning is experiential and is based on the way we physically experience the world.
