Grok's Woke (Grok AI weights are released)

x.ai Open Release of Grok-1

We are releasing the weights and architecture of our 314 billion parameter Mixture-of-Experts model Grok-1.

This was obviously done so Elon looks like less of a hypocrite in his quixotic OpenAI lawsuit. Still, it's notable as the largest LLM to date with openly released, commercially licensed weights. At 314B parameters it's far too large for any consumer to actually run, but direct access to a model this big may benefit researchers studying safety and bias in AI.
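The "too large for consumers" point is easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming all 314B parameters must be held in memory (ignoring MoE routing details and activation memory):

```python
# Rough memory footprint of Grok-1's weights at common precisions.
# Assumption: 314e9 parameters, dense storage, no overhead.
params = 314e9

for name, bytes_per_param in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1)]:
    gib = params * bytes_per_param / 2**30
    print(f"{name}: ~{gib:,.0f} GiB")
```

Even at 8-bit precision that's hundreds of GiB of weights, well beyond any consumer GPU or workstation.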
