Public
checkpoints
1 item
About
This repository contains the open-source code and model weights for Grok-1, a 314-billion-parameter Mixture of Experts (MoE) language model developed by xAI. It is released under the Apache 2.0 license and includes example JAX code for loading and running the model.
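As background for the Mixture of Experts design mentioned above: an MoE layer routes each token to a small subset of expert networks rather than running every expert. Below is a minimal, illustrative sketch of top-k gate routing in plain NumPy. This is not the Grok-1 implementation; the function name and shapes are hypothetical, chosen only to show the routing idea.

```python
import numpy as np

def top_k_route(router_logits, k=2):
    """Pick the top-k experts per token and renormalize their gate weights.

    router_logits: array of shape (num_tokens, num_experts).
    Returns (indices, weights), each of shape (num_tokens, k).
    """
    # Softmax over the expert dimension gives gate probabilities.
    z = router_logits - router_logits.max(axis=-1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=-1, keepdims=True)
    # Indices of the k highest-probability experts for each token.
    top_idx = np.argsort(probs, axis=-1)[:, -k:]
    top_p = np.take_along_axis(probs, top_idx, axis=-1)
    # Renormalize so each token's selected gate weights sum to 1.
    top_p = top_p / top_p.sum(axis=-1, keepdims=True)
    return top_idx, top_p

# Toy example: 4 tokens routed across 8 experts, 2 experts per token.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 8))
idx, w = top_k_route(logits, k=2)
assert idx.shape == (4, 2) and w.shape == (4, 2)
assert np.allclose(w.sum(axis=-1), 1.0)
```

In a full MoE layer, each token's output is the weighted sum of the outputs of its selected experts, so only k expert forward passes run per token despite the large total parameter count.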
12 files
1 folder
2.21 MB total size
0 open issues
0 open pull requests
0 watchers
0 forks
0 stars
60 views
Updated Jan 21, 2026
Languages
Python: 86.9%
Text: 12.9%
TOML: 0.2%