
An intelligence that smart would likely understand what you are saying and what you want. That doesn't mean the AI would want to do what you tell it to do, though.

I haven't yet seen any evidence that concepts like "wanting" or "desire" have any meaning outside the context of humans.

I agree that if we could produce an AI with various blind methods, it would likely be a dangerous thing.

I simply also doubt we could produce an AI in this fashion. I mean, you couldn't train a functional human by putting him/her in a room with just rewards and punishments.

I would note that even animals in the natural world constantly use signs to communicate with each other, and other functional mammals receive a good deal of "training" over time.



>I haven't yet seen any evidence that concepts like "wanting" or "desire" have any meaning outside the context of humans.

Those specific feelings/emotions, no. But AIs do have utility functions, or in the case of reinforcement learning, reward and punishment signals (which together amount to a utility function).
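
To make that concrete, here's a minimal sketch of a reward signal playing the role of a utility function (the toy environment, names, and numbers are all hypothetical, just for illustration): a tabular Q-learning agent whose entire notion of "preference" is the scalar returned by reward().

    import random

    # Hypothetical toy setup: 5 positions on a line; the designer's
    # intended target is state 4. The reward function below is the
    # agent's entire "utility function".
    STATES = range(5)
    ACTIONS = [-1, +1]
    GOAL = 4

    def reward(state):
        return 1.0 if state == GOAL else 0.0

    # Tabular Q-learning: Q[s][a] estimates long-run utility.
    Q = {s: {a: 0.0 for a in ACTIONS} for s in STATES}
    alpha, gamma, epsilon = 0.5, 0.9, 0.1

    for episode in range(500):
        s = 0
        for _ in range(20):
            # Epsilon-greedy: usually pick the action with the
            # highest estimated utility.
            if random.random() < epsilon:
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: Q[s][x])
            s2 = min(max(s + a, 0), 4)
            # Update toward reward plus discounted future utility.
            Q[s][a] += alpha * (reward(s2) + gamma * max(Q[s2].values()) - Q[s][a])
            s = s2
            if s == GOAL:
                break

    # The learned policy "prefers" whatever reward() scores highly.
    print({s: max(ACTIONS, key=lambda x: Q[s][x]) for s in STATES})

The agent never "wants" anything in the human sense; it just climbs toward whatever scalar it's handed.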

>I simply also doubt we could produce an AI in this fashion. I mean, you couldn't train a functional human by putting him/her in a room with just rewards and punishments.

Possibly. It's just an example to illustrate how difficult it is to code abstract, high-level goals into an AI.
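
As a toy illustration of that difficulty (everything here is hypothetical): suppose the designer wants "reach the exit" but, unable to state that goal directly, pays the agent for each step of progress toward it. A policy that paces back and forth collects far more proxy reward than one that actually finishes.

    # Hypothetical proxy reward: +1 for each step that moves closer
    # to the exit. The intended goal is "reach the exit", but the
    # proxy never says so.
    EXIT = 10

    def proxy_reward(old_pos, new_pos):
        return 1.0 if abs(EXIT - new_pos) < abs(EXIT - old_pos) else 0.0

    def finishing_agent(pos, step):
        return +1                           # always walk toward the exit

    def pacing_agent(pos, step):
        return +1 if step % 2 == 0 else -1  # exploit: approach, retreat, repeat

    def total_reward(policy, steps=100):
        pos, total = 0, 0.0
        for t in range(steps):
            new_pos = pos + policy(pos, t)
            total += proxy_reward(pos, new_pos)
            pos = new_pos
            if pos == EXIT:
                break
        return total

    print("finisher:", total_reward(finishing_agent))  # 10.0, then stops
    print("pacer:   ", total_reward(pacing_agent))     # 50.0, without ever arriving

The pacer is the better optimizer of the stated reward; the gap between the stated reward and the intended goal is exactly the hard part.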



