- 314B-parameter Mixture-of-Experts (MoE) model
- Base model (not fine-tuned)
- 8 experts (2 active)
- 86B active parameters
- Apache 2.0 license
- Code: https://github.com/xai-org/grok
- Weights: https://github.com/xai-org/grok?tab=readme-ov-file#downloadi...
Even more open than OpenAI.
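The "8 experts (2 active)" bullets describe top-2 routing: a gating network scores all experts per token, but only the two highest-scoring experts actually run, so only a fraction of the 314B total parameters is active for any given token. A minimal sketch of that idea (this is illustrative numpy, not xAI's implementation; all names here are made up):

```python
import numpy as np

def top2_moe(x, gate_w, experts):
    """Route token vector x to the top 2 of len(experts) experts.

    gate_w: (d, n_experts) gating weights; experts: list of callables.
    Only the 2 selected experts run, so only their parameters are
    "active" for this token (2 of 8, matching the bullets above).
    """
    logits = x @ gate_w                     # one score per expert
    top2 = np.argsort(logits)[-2:]          # indices of the 2 best experts
    weights = np.exp(logits[top2])
    weights /= weights.sum()                # softmax over the selected pair
    return sum(w * experts[i](x) for w, i in zip(weights, top2))

rng = np.random.default_rng(0)
d, n_experts = 16, 8
gate_w = rng.normal(size=(d, n_experts))
# each "expert" here is just a small linear layer
expert_ws = [rng.normal(size=(d, d)) for _ in range(n_experts)]
experts = [lambda x, w=w: x @ w for w in expert_ws]
y = top2_moe(rng.normal(size=d), gate_w, experts)
print(y.shape)  # (16,)
```

With 2 of 8 experts firing per token, only about a quarter of the expert parameters are used on each forward pass; combined with the shared (non-expert) layers, that is how a 314B-parameter model ends up with roughly 86B active parameters.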