
How do LLMs do on things that are common confusions? Do they specifically have to be trained against them? I'm imagining a variant of the Monty Hall problem that isn't in the training set tripping them up the same way a full wine glass trips up image generators.
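
For anyone who wants to try this, here's a minimal sketch of the kind of probe I mean, assuming the OpenAI Python SDK; the model name and prompt wording are placeholders, not a claim about any particular model. The "transparent doors" variant is a known trap: the memorized "always switch" answer is wrong when you can already see the car.

    # Probe an LLM with an off-distribution Monty Hall variant.
    # Assumes the OpenAI Python SDK and an OPENAI_API_KEY in the
    # environment; the model name below is a placeholder.
    from openai import OpenAI

    client = OpenAI()

    # Transparent-doors variant: the canonical 2/3 switching argument
    # doesn't apply, because the contestant can see where the car is.
    VARIANT = (
        "You're on a game show with three transparent doors. You can "
        "see the car is behind door 1. You pick door 1. The host opens "
        "door 3, revealing a goat, and offers to let you switch to "
        "door 2. Should you switch?"
    )

    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": VARIANT}],
    )
    print(resp.choices[0].message.content)
    # A model that pattern-matches to the classic puzzle will recommend
    # switching for a 2/3 win probability; the correct answer here is
    # to stay, since the car is visibly behind door 1.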



