Show HN: ELiTA: Linear-Time Attention Done Right (github.com/lahmacunbear)
2 points by acosharma on Aug 25, 2023
A novel Transformer architecture that is much cheaper and faster than the standard Transformer while matching or outperforming it. It handles sequence lengths of 100K+ tokens on a single GPU. Intuition, evaluation, and code are available in the repository.
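The post does not describe the ELiTA mechanism itself, so as background only, here is a minimal sketch of generic kernelized linear attention (in the style of linear-Transformer work), showing why replacing the n x n attention matrix with running summaries makes 100K+ token sequences feasible on one GPU. The function name, the ELU+1 feature map, and all shapes are assumptions for illustration, not the author's implementation.

    # Generic linear attention sketch (NOT the ELiTA mechanism; see the repo).
    # Cost is O(n * d^2) in the sequence length n, since no n x n matrix is built.
    import torch

    def linear_attention(q, k, v, eps=1e-6):
        # q, k, v: (batch, seq_len, dim)
        # Positive feature map; ELU + 1 is one common choice in the literature.
        q = torch.nn.functional.elu(q) + 1
        k = torch.nn.functional.elu(k) + 1
        # Summarize keys/values over the whole sequence first.
        kv = torch.einsum('bnd,bne->bde', k, v)    # (batch, dim, dim)
        k_sum = k.sum(dim=1)                       # (batch, dim)
        # Each query attends via the precomputed summaries.
        num = torch.einsum('bnd,bde->bne', q, kv)
        den = torch.einsum('bnd,bd->bn', q, k_sum)
        return num / (den.unsqueeze(-1) + eps)

    # A 100K-token sequence fits in memory because attention weights are never materialized.
    x = torch.randn(1, 100_000, 64)
    print(linear_attention(x, x, x).shape)  # torch.Size([1, 100000, 64])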