
Feels like we've got to the point of giving LLMs fMRIs just like human brains.



> fMRIs just like human brains.

or dead salmon :)


When LLMs are good enough, they will do the science of understanding LLMs for us. And make all our other decisions, too.


“I don't really understand the explanations, but GPT-5, Bard and LLaMa-aLp4ca all come to the same conclusion, so I guess it must be correct.”


Still better than most politicians today, or people influenced by populist media.



True, particularly when you consider what fMRIs tell us about a brain’s state of knowledge.



