Hacker News

I think that is one of the most frustrating issues I currently face when using LLMs: one can send the same prompt in two separate chats and receive two drastically different responses.




It is frustrating that it’ll still give a bad response sometimes, but I consider the variation in responses a feature. If it’s going down the wrong path, it’s nice to be able to roll the dice again and get it back on track.
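The "roll the dice" behavior comes from how most LLM serving stacks pick the next token: they sample from a temperature-scaled softmax over the model's logits rather than always taking the most likely token, so two runs of the same prompt can diverge. A minimal sketch of that sampling step (the logit values and function name here are illustrative, not from any particular model):

```python
import math
import random

def sample_next_token(logits, temperature=1.0, rng=None):
    """Sample a token index from temperature-scaled softmax probabilities.

    Higher temperature flattens the distribution (more variation between
    runs); temperature near 0 approaches greedy argmax decoding.
    """
    rng = rng or random.Random()
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling: walk the cumulative distribution.
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1

# Same logits, different random seeds: the sampler can pick different
# tokens, which is the "two chats, two answers" effect compounded over
# every token in a response.
logits = [2.0, 1.8, 0.5]
print(sample_next_token(logits, rng=random.Random(1)))
print(sample_next_token(logits, rng=random.Random(2)))
```

Fixing the random seed (where the API exposes one) or lowering the temperature reduces this variation, at the cost of losing the ability to re-roll a bad response.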


