
Such results are inherently limited because the same word can have different meanings depending on context.

The role of the attention layer in LLMs is to give each token a better embedding by accounting for context.
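To make that concrete, here is a minimal sketch of single-head scaled dot-product self-attention over toy embeddings (numpy, random weights; all names, the example sentence, and the dimensions are illustrative assumptions, not taken from any particular model). Each token's output vector is a weighted mix of the whole sequence, which is how a word like "bank" can end up with an embedding pulled toward "river":

    # Minimal, illustrative single-head self-attention over toy embeddings.
    # Weights are random here; in a real model they are learned.
    import numpy as np

    rng = np.random.default_rng(0)
    d_model = 8                                     # embedding size (assumed)
    tokens = ["the", "bank", "of", "the", "river"]  # toy sentence (assumed)
    X = rng.normal(size=(len(tokens), d_model))     # stand-in static embeddings

    # Query/key/value projections (learned in practice, random for the sketch).
    W_q = rng.normal(size=(d_model, d_model))
    W_k = rng.normal(size=(d_model, d_model))
    W_v = rng.normal(size=(d_model, d_model))
    Q, K, V = X @ W_q, X @ W_k, X @ W_v

    # Scaled dot-product attention: each token scores every token in context.
    scores = Q @ K.T / np.sqrt(d_model)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over positions

    contextual = weights @ V    # one context-aware embedding per token
    print(weights.round(2))     # e.g. how much "bank" attends to "river"
    print(contextual.shape)     # (5, 8)

The key point is the last matrix product: the output for each position is no longer a fixed per-word vector but a context-dependent blend, so the same surface word gets different representations in different sentences.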


