The context comes from the attention mechanism, not from word embeddings.



Run attention on an ordinal word embedding and see what happens.


Well yes, embeddings are necessary but not sufficient, obviously.
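
A minimal NumPy sketch of the experiment suggested above, assuming "ordinal word embedding" means each token is represented only by its vocabulary index; the attention uses identity projections (no learned W_q/W_k/W_v) for brevity, so this is an illustration rather than a real model:

    import numpy as np

    rng = np.random.default_rng(0)

    def attention(X):
        # X: (seq_len, d) token vectors. Scaled dot-product self-attention
        # with identity projections; real models learn W_q, W_k, W_v.
        d = X.shape[-1]
        scores = X @ X.T / np.sqrt(d)
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)   # row-wise softmax
        return w @ X                          # context-mixed outputs

    vocab = ["the", "river", "bank", "opened", "an", "account"]
    sent = [vocab.index(t) for t in ["the", "river", "bank"]]

    # Dense embeddings (random stand-ins for learned semantic vectors).
    E_dense = rng.normal(size=(len(vocab), 8))
    # "Ordinal" embeddings: a token is just its index, nothing more.
    E_ordinal = np.arange(len(vocab), dtype=float).reshape(-1, 1)

    print(attention(E_dense[sent])[2])    # "bank", mixed with its context
    print(attention(E_ordinal[sent])[2])  # a weighted average of raw indices

With dense embeddings the output for "bank" is a context-dependent blend of meaningful vectors; with ordinal inputs attention can only average raw indices, so no meaning comes out. That is the "necessary but not sufficient" point: attention supplies the context-dependence, but only over whatever the embeddings already encode.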



