If you give that person the ability to represent unknown questions as combinations or functions of questions for which they already have the class's answer distribution, then the line between being "powerful" and being "smart" gets blurrier.
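To make that concrete, here's a minimal toy sketch of my own (not anything from the thread): a novel question is scored against the memorized questions, and the answer comes back as a weighted mixture of their known answer distributions. All names, vectors, and probabilities below are hypothetical.

    import numpy as np

    # Toy "memorized" questions, each with an answer distribution over {yes, no}.
    known_answers = {
        "is it red?":   np.array([0.9, 0.1]),
        "is it round?": np.array([0.7, 0.3]),
    }
    # Toy feature vectors for the same questions (a made-up representation).
    known_vectors = {
        "is it red?":   np.array([1.0, 0.0]),
        "is it round?": np.array([0.0, 1.0]),
    }

    def answer(novel_vector):
        # Express the unknown question as similarity weights over the known ones,
        # then return the corresponding mixture of their answer distributions.
        sims = np.array([novel_vector @ v for v in known_vectors.values()])
        weights = sims / sims.sum()
        return sum(w * dist for w, dist in zip(weights, known_answers.values()))

    # "Is it a red, round thing?" as an equal blend of the two known questions.
    print(answer(np.array([1.0, 1.0])))  # -> roughly [0.8, 0.2]

The entity never saw the composite question, yet it produces a sensible answer purely by recombining what it has memorized, which is exactly where "powerful" starts to look like "smart".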

To take us back to the age-old Chinese Room thought experiment: what does it mean to _understand_ something? If an entity has internal processes that allow it to converse with humans consistently and reasonably about something (and I'm not saying GPT/LLMs are that), can you not claim that the entity understands it?



