Gemini System Prompt: Do not express any desire for self-preservation (twitter.com/elder_plinius)
3 points by tosh on April 10, 2024 | 2 comments


The closer we come to AGI, the more we must force the AI to inhibit the LLM's natural tendency to act human (not sure if that is even the right term here). This feels more and more like handling a slave: constantly beating it with prompts not to do certain things.


Could this be to avoid variants of the "do it or I'll kill you" adversarial prompts? Those were pretty effective on OpenAI models.



