Hacker News new | past | comments | ask | show | jobs | submit login
Grok Weights Available via Torrent (twitter.com/grok)
27 points by kaliqt 63 days ago | hide | past | favorite | 9 comments



It's a 314-billion-parameter mixture-of-experts model, apparently trained with a custom framework built on Rust and JAX.


So this isn't something anyone without H100s is ever running, no matter how you quantize it.
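To put a rough number on that claim, here is a back-of-the-envelope estimate of the VRAM needed just to hold 314B weights at common precisions (simple arithmetic, not a measurement of any actual runtime):

```python
# Approximate weight storage for a 314B-parameter model at common precisions.
params = 314e9

for name, bytes_per_param in [("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    gb = params * bytes_per_param / 1e9
    print(f"{name}: ~{gb:.0f} GB")
# fp16/bf16: ~628 GB, int8: ~314 GB, int4: ~157 GB
```

Even at 4-bit, the weights alone are around 157 GB, well beyond any single consumer GPU, before counting activations or KV cache.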


As promised, the weights are available. For those who don't like Twitter / X, here are the details from the README:

- 314B parameter Mixture of Experts model

- Base model (not finetuned)

- 8 experts (2 active)

- 86B active parameters

- Apache 2.0 license

- Code: https://github.com/xai-org/grok

- Weights: https://github.com/xai-org/grok?tab=readme-ov-file#downloadi...
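The README numbers above are internally consistent. Assuming the usual MoE accounting (total = shared params + all 8 experts; active = shared params + the 2 routed experts; the shared/per-expert split below is an inference, not stated in the README):

```python
# Back-of-the-envelope check of the README figures.
TOTAL = 314e9     # total parameters
ACTIVE = 86e9     # active parameters per token
N_EXPERTS = 8
K_ACTIVE = 2

# Assume: TOTAL  = shared + N_EXPERTS * per_expert
#         ACTIVE = shared + K_ACTIVE  * per_expert
per_expert = (TOTAL - ACTIVE) / (N_EXPERTS - K_ACTIVE)  # ~38B per expert
shared = TOTAL - N_EXPERTS * per_expert                 # ~10B shared
print(f"per expert: ~{per_expert/1e9:.0f}B, shared: ~{shared/1e9:.0f}B")
```

Under those assumptions, each expert is roughly 38B parameters with about 10B shared, which lines up with 86B active out of 314B total.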

Even more open than OpenAI.


How many H100 does one need to run this?


8.
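The "8" checks out as a rough fit if you assume bf16 weights and 80 GB per H100 (a sketch, ignoring activations and KV cache):

```python
# Sanity check: bf16 weights vs. aggregate HBM of 8x H100-80GB.
weights_gb = 314e9 * 2 / 1e9   # bf16 = 2 bytes/param -> ~628 GB
hbm_gb = 8 * 80                # 640 GB total
print(weights_gb, hbm_gb)      # 628.0 640 -- a tight but plausible fit
```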


How many H100 does one need to run this?


Bold to make a joke about the tsunami of spam overrunning the site.



They made good on their promise, it would seem.


