For sure, bigger models are needed to compete with transformer LLMs; the same goes for Mamba. I was just bothered by the distrust toward something very reasonable, like not being able to fully train a 70B model.


