
LLMs don't really exist physically (except in the most technical sense), so the point is somewhat moot, and obvious if you accept this particular definition of a feeling.

LLMs are not mammals or animals, so expecting them to feel in a mammalian or animal way is misguided. They might have a mammalian-feeling analog, just as they might have human-intelligence-analog circuitry in their billions (trillions nowadays) of parameters.





Yes, I think we’re agreeing?


