Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

Here's his central point:

>This event is significant because it is major demonstration of someone giving a LLM a set of instructions and the results being totally not at all what they predicted.

Replace LLM with computer in that sentence, is it still novel? Laughably far from it, unexpected results are one of the defining features of moderately complex software programs going all the way back to the first person to program a computer. Some of the results are unexpected, but a lot are not, because it's literally doing what the prompt injection tells it to. Which isn't all that surprising but sure anyway...

>Obviously it's still very theoretical and can't do anything like that, but the point is more that perhaps Google doesn't have the culture necessarily to truly interrogate their actions.

Oh that's definitely true.



Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: