
Take a look at this: https://intelligence.org/files/CEV.pdf

But also, what in particular makes you think that experiencing human emotions is fundamentally required in order to optimize for human values?

A different mind might have different emotions, or it might have none at all. Emotions are a kludge produced by natural selection, and they may not be optimal safeguards for ensuring that an intelligence acts to optimize for the values we want.




I think that perhaps no one has stopped to consider what a life without emotions would be. In the same way that one can't grasp the concept of nothing, the concept of not having emotions is impossible for us to imagine. Perhaps you hide your emotions behind a rationalization, but the emotions are still there; perhaps they are the source of your thinking, the fountain of your ideas and vitality.


I am going to read the 38 pages of the Yudkowsky paper you link to, but for starters, using the word "Friendliness" to describe research into a safe AI that won't turn humans into slaves is a comical choice: it doesn't reflect the risks involved and waters down the problem.





