
I suppose with an LLM you could never know if it is hallucinating a supposed system prompt.
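
One rough check is consistency across independent sessions: a long string the model reproduces verbatim every time is less likely to be improvised on the spot than one that changes with each ask. Below is a minimal sketch of that heuristic, assuming a hypothetical query_model() wrapper around whatever chat API is in use (not any particular vendor's client):

    from collections import Counter

    def query_model(prompt: str) -> str:
        # Hypothetical stand-in for a real chat-completion call:
        # one fresh session per call, default sampling settings.
        raise NotImplementedError

    def extraction_consistency(prompt: str, trials: int = 5) -> float:
        """Fraction of trials returning the single most common answer."""
        answers = [query_model(prompt).strip() for _ in range(trials)]
        top_count = Counter(answers).most_common(1)[0][1]
        return top_count / trials

    # Near 1.0: the model repeats the same text verbatim, weak evidence
    # it is quoting something real. Near 1/trials: the answers diverge,
    # which points toward confabulation.

Even perfect agreement isn't proof, of course: a model can also memorize a plausible-sounding fake and repeat that consistently.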

