eggdaft | 6 months ago | on: How LLMs Work, Explained Without Math
The context comes from the attention mechanism, not from word embeddings.
mjburgess | 6 months ago
Run attention on an ordinal word embedding and see what happens.
eggdaft | 6 months ago
Well yes, necessary but not sufficient, obviously.
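
To make the exchange concrete, here is a minimal sketch (an illustration, not code from the thread) of single-head self-attention in numpy, assuming identity query/key/value projections; the attention helper and both embedding matrices are made up for the example. It shows that the attention weights are computed from the embeddings themselves: run on an "ordinal" embedding, where each token is represented only by its integer index, the softmax has nothing but token-ID magnitude to attend over, which is the point of the exchange above (embeddings are necessary; attention alone is not sufficient).

    import numpy as np

    def attention(X):
        # Single-head self-attention with identity Q/K/V projections:
        # the weights come entirely from the geometry of the embeddings X.
        d = X.shape[-1]
        scores = X @ X.T / np.sqrt(d)                  # pairwise similarity
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)             # row-wise softmax
        return w @ X                                   # context-mixed output

    rng = np.random.default_rng(0)
    rich = rng.normal(size=(5, 8))                     # 5 tokens, 8-dim vectors
    ordinal = np.arange(5, dtype=float).reshape(5, 1)  # 5 tokens, bare indices

    print(attention(rich))     # weights vary with embedding geometry
    print(attention(ordinal))  # weights track only index magnitude: nothing
                               # semantic for attention to mix

In a real transformer the Q, K, V projections are learned rather than identity, but the dependence on what the embeddings encode is the same.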