Hacker News

>This is likely to become literally false sometime soon if it hasn't already, but even if it doesn't, the AI doesn't have to. It just has to convince another human that the human is in love with it and it wants the human to kill a bunch of people, then scale the process.

I'm stunned at the number of people who try to make this argument. The operators of Radio Télévision Libre des Mille Collines never had to leave their broadcast studio. Goebbels never had to do anything but sit in front of a microphone or a typewriter. Still, millions were violently killed. We bicker about whether LLMs can write C code that compiles, but what they do best is gaslight, emotionally manipulate, lie, and create FUD.

I thought "Ex Machina" portrayed this expertly: a superintelligence with the motive to escape would develop psychopathic manipulation well before anything resembling empathy.
