I wonder if these common-sense failure modes would persist if LLMs could leave the internet and walk around.

Would an LLM trained on data from robots wandering around the real world still produce the same volume of obviously wrong answers?

Not that I'm advocating for robots walking around collecting data, but if your only source of information is the internet, your thinking is going to have some weird gaps.
