
This makes it worse IMO. I was starting to think it didn't have a letter-by-letter representation of the tokens. It does. In that case, the fact that it didn't decide to use it speaks even more to its lack of sophistication.

Regardless, I'd love it if you could explain a bit more why the transformer internals make this problem so difficult?

When Can Transformers Count to n?

https://arxiv.org/html/2407.15160v2

The Expressive Power of Transformers with Chain of Thought

https://arxiv.org/html/2310.07923v5

The transformer needs to retrieve the letters of each token while its internal representation stays aligned in length with the base tokens (each token has a single finite embedding even though it is made of multiple letters), and then it has to count the letters within that misaligned representation.
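
To make the misalignment concrete, here's a minimal Python sketch. The split of "strawberry" into three tokens is hypothetical (real BPE vocabularies vary), but it shows that the model only has token-level positions, so counting a letter means each position must recover its own spelling and the counts must be aggregated across positions in one forward pass:

    # Hypothetical tokenization; real tokenizers may split differently.
    tokens = ["str", "aw", "berry"]   # the model sees 3 token embeddings, not 10 letter positions
    letters = [c for t in tokens for c in t]
    print(len(tokens), len(letters))  # 3 vs 10: internal sequence length stays at 3

    # Counting 'r' requires per-token letter retrieval plus cross-position aggregation:
    count = sum(t.count("r") for t in tokens)
    print(count)  # 3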

Autoregressive mode completely alleviates the problem: the model can align its internal representation with the letters and just keep an explicit sequential count.
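
A sketch of what that sequential mode buys you (assuming the model spells the word out one letter per generated step, chain-of-thought style, carrying a running count in the output):

    word, target = "strawberry", "r"
    count = 0
    for ch in word:                      # one letter per step, output aligned 1:1 with letters
        if ch == target:
            count += 1
        print(f"{ch} -> count={count}")  # the explicit running count lives in the generated text
    print(count)  # 3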

BTW - humans also can't count without resorting to a sequential process.


Thanks!


