
No current LLM understands words or letters. They all operate on input and output tokens, which roughly correspond to syllables and letter groupings. Any task involving counting letters or words is outside their realistic capabilities.

LLMs are a tool, and like any other tool, they have strengths and weaknesses. Know your tools.
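To make the tokenization point concrete, here is a minimal sketch using OpenAI's tiktoken library. The exact token splits vary by model and vocabulary, so treat the commented output as illustrative, not guaranteed:

    # Show that a model receives token IDs, not letters.
    # "cl100k_base" is the encoding used by GPT-3.5/GPT-4-era models.
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")
    for word in ["strawberry", "банан"]:
        ids = enc.encode(word)
        # Decode each token ID separately to see the pieces the model works with.
        # (Multi-byte characters can be split across tokens, in which case a
        # single-token decode shows a replacement character.)
        pieces = [enc.decode([i]) for i in ids]
        print(word, "->", pieces)
    # "strawberry" splits into pieces like "str" / "aw" / "berry",
    # so the model never sees the individual letters it would need
    # in order to count them.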




I understand that, but the article we are discussing points out that LLMs are so good at many tasks, and so good at passing tests, that many people will be tricked into blindly taking their word for it -- even people who should know better. Our brain is a lazy machine: if something works almost always, it starts to assume it works always.

I mean, you can ask an LLM to count letters in thousands of words, and pretty much always it will come up with the correct answer! So far I don't know of any word other than "банан" (Russian for "banana") that breaks this function.
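For what it's worth, the reliable way to check a word like that is to count outside the model. A trivial sketch -- note that Cyrillic "а" and Latin "a" are different code points, which is exactly the kind of thing token-level "counting" stumbles over:

    word = "банан"             # Russian for "banana"
    print(len(word))           # 5 letters
    print(word.count("а"))     # 2 -- Cyrillic "а" (U+0430)
    print(word.count("a"))     # 0 -- Latin "a" (U+0061) never appears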



