Alibaba Admits Its 470,000 AI Chips Are Inferior to Rivals, Betting on Software Stack Integration to Compete
Key Takeaways
- Alibaba has shipped 470,000 AI chips but admits they underperform Nvidia and AMD products in raw performance metrics
- The company is betting on vertical integration—combining inferior chips with optimized cloud infrastructure and Qwen models—to compete on cost-effectiveness rather than raw performance
- Custom silicon provides Alibaba with supply chain independence amid US export restrictions on advanced AI accelerators to China
Summary
Alibaba has manufactured 470,000 AI chips through its T-Head chipmaking division and publicly acknowledged they currently lag behind competitors from Nvidia and AMD in performance. CEO Yongming Wu revealed the figure during the company's Q3 2026 earnings call, stating that Alibaba's chips—likely referring to the Zhenwu 810E accelerator that debuted in January—cannot match foreign counterparts across various performance metrics.
Rather than viewing this as a fundamental weakness, Alibaba is adopting a differentiated strategy: optimizing its entire cloud infrastructure and Qwen AI models around its custom silicon to deliver superior cost-effectiveness and value for money. Wu emphasized that this co-design approach is what sets T-Head apart from traditional chip companies. The strategy also provides Alibaba with supply chain security amid US export bans on advanced accelerators targeting China, and positions the company to lower AI inferencing costs as demand grows.
Alibaba Cloud continues to drive the company's growth, with quarterly revenue up 36% year-over-year to $6.2 billion, and the company projects it can reach $100 billion in annual cloud and AI revenue within five years. Wu noted a 6x increase in token consumption on Alibaba Cloud's Model Studio platform over the past six months, signaling accelerating demand for AI services that could support those aggressive expansion targets.
Editorial Opinion
Alibaba's candid admission that its chips lag behind Nvidia and AMD is refreshingly honest, but also revealing of the structural disadvantages Chinese chipmakers face in competing globally. By pivoting to a 'full-stack optimization' strategy, Alibaba acknowledges a hard truth: raw silicon performance may not be the decisive factor if software, models, and infrastructure can be tuned together for cost and efficiency. This approach mirrors how some semiconductor startups have competed against incumbents, but it requires Alibaba to execute flawlessly across multiple layers simultaneously—a high-stakes bet that cloud and AI demand will grow fast enough to offset the performance gap.