
Citing sources is not magic that makes what you say true; it just makes a statement more easily falsifiable.

LLMs can cite sources about as well as any human, which is to say with a non-trivial error rate.

LLMs are shit for a lot of things, but the problems are with the quality of the output. Whether they work by magic, soul-bending, matrix multiplication, or whatever is irrelevant.


