Hacker News

This is in the context of a discussion on whether LLMs could have a theory of mind. The ability to know anything at all matters when evaluating that conclusion.

More generally, what an LLM actually knows or understands is important if you're considering using one for anything other than generating first drafts that will be fact-checked by humans.



If you're depending on fact-checking by any one human, the last few years in politics should be sufficient warning of the dangers of that. In the end, the LLM will have to be integrated into larger systems that cross-check each other.
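A minimal sketch of the cross-checking idea, under the assumption (mine, not the commenter's) that several independently prompted models answer the same factual question and a simple majority vote gates the result; `cross_check` and the sample answers are hypothetical:

```python
from collections import Counter

def cross_check(answers):
    """Accept an answer only when a strict majority of independent
    checkers agree; otherwise return None to flag for human review."""
    counts = Counter(answers)
    best, votes = counts.most_common(1)[0]
    if votes > len(answers) / 2:
        return best
    return None  # no consensus: escalate instead of trusting one source

# Hypothetical outputs from three independently prompted models.
print(cross_check(["Paris", "Paris", "Lyon"]))  # consensus: "Paris"
print(cross_check(["Paris", "Lyon", "Nice"]))   # no consensus: None
```

The point is structural: no single checker (human or model) is trusted alone, and disagreement is surfaced rather than papered over.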



