Hacker News
NTK-Aware RoPE allows Llama to have 8k+ context size without fine-tuning (reddit.com)
5 points by jpdus on June 30, 2023 | 1 comment



I am blown away by the pace of open-source progress in the LLM space; I've never witnessed anything like this before in tech. It's awesome to see individual enthusiasts pushing the field forward, and this shows again how much potential there still is, even without new fundamental breakthroughs...
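For context, the "NTK-aware" trick from the linked post stretches RoPE's low-frequency dimensions while leaving the high-frequency ones nearly intact, by rescaling the rotary base instead of linearly interpolating positions. A minimal sketch of the idea as I understand it from the post (the function names and the alpha parameter name are my own labels, not from the original code):

    # Sketch of NTK-aware RoPE scaling: rescale the rotary base so that
    # high-frequency dimensions barely change while low-frequency ones
    # are stretched to cover a longer context.
    import numpy as np

    def rope_inv_freq(head_dim, base=10000.0, alpha=1.0):
        # alpha = 1.0 recovers standard RoPE; alpha ~ 8 targets roughly
        # an 8x longer context for a model like Llama (head_dim = 128).
        base = base * alpha ** (head_dim / (head_dim - 2))
        return 1.0 / base ** (np.arange(0, head_dim, 2) / head_dim)

    def rotary_angles(positions, head_dim, alpha=1.0):
        # Angle for each (position, dimension-pair); the cos/sin of these
        # angles rotate the query/key vectors as in standard RoPE.
        return np.outer(positions, rope_inv_freq(head_dim, alpha=alpha))

The key point is that no weights change: only the inference-time frequency schedule is modified, which is why it reportedly extends context without fine-tuning.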



