xAI's Grok 5 Trains on World's Largest AI Supercomputer as 10 of 12 Founders Exit
Key Takeaways
- Colossus 2 represents an unprecedented computational milestone: 555,000 NVIDIA GPUs in a 1-gigawatt cluster, with plans to scale to 1.5 GW by April 2026
- Ten of xAI's twelve original founders have left since March 2023, raising significant organizational questions despite the company's infrastructure advantages
- Grok 5 is a 6-trillion-parameter Mixture-of-Experts model currently in training, with a public beta release expected in Q2 2026
- xAI is pursuing space-based AI data centers through SpaceX integration, betting that orbital infrastructure reaches cost parity by 2035
Summary
xAI is training Grok 5, a 6-trillion-parameter model, on Colossus 2—the world's first gigawatt-scale AI supercomputer, housing over 555,000 NVIDIA GPUs in Memphis, Tennessee. The infrastructure reflects an $18 billion GPU investment and an unprecedented computational scale for AI model training. However, this technical achievement exists against a backdrop of significant organizational turbulence: 10 of xAI's original 12 founders have departed since March 2023.
Elon Musk acknowledged in March 2026 that "xAI was not built right first time around, so is being rebuilt from the foundations up"—a statement made while Tesla simultaneously invested $2 billion in xAI's Series E funding round. Musk has also claimed that Grok 5 has a 10% probability of achieving AGI; training began in late 2025, and a public beta release is expected in March or April 2026.
The tension between xAI's extraordinary computational resources and its significant loss of founding talent raises a critical question: whether raw compute power can compensate for the institutional knowledge and organizational expertise that departed with the founder exodus. xAI is also pursuing ambitious long-term infrastructure plays, including the Tesla Terafab project and plans for AI data centers in low Earth orbit through its merger with SpaceX.
Editorial Opinion
xAI's Colossus 2 supercomputer is genuinely world-class infrastructure—the scale is staggering and the long-term vision of orbital AI data centers is audacious. Yet the founder exodus casts a shadow that no GPU count can fully dispel. Compute is necessary but not sufficient for breakthrough AI development; the institutional knowledge, research intuition, and organizational culture built by founding teams matter enormously. Whether Grok 5 can compete with Claude, GPT-5, and Gemini may ultimately depend less on NVIDIA's inventory than on whether xAI can rebuild the human capital it lost.



