Google is building a four-partner chip supply chain to challenge Nvidia in AI inference

Google is building the AI industry’s most diversified custom chip supply chain, with four design partners (Broadcom, MediaTek, Marvell, Intel) and a roadmap stretching from the Ironwood TPU now shipping in the millions to TPU v8 chips at TSMC 2nm in late 2027.

Summary

Google is assembling a diverse supply chain for its custom silicon, aiming to challenge Nvidia’s dominance in AI inference. This strategy includes:

  • Four Design Partners: Broadcom, MediaTek, Marvell, and Intel.
  • Fabrication Partner: TSMC.
  • Product Roadmap: Extending from current inference chips to 2nm processors in late 2027.

The key pieces are:

  • Ironwood TPU: Google’s seventh-generation TPU designed for inference, offering ten times the performance of its predecessor and scalable up to 9,216 liquid-cooled chips.
  • Sunfish (Broadcom): Next-generation TPU v8 training chip for TSMC's 2nm node.
  • Zebrafish (MediaTek): Cost-optimized inference variant of TPU v8.
  • Potential Marvell Addition: Discussions to add a memory processing unit and an additional inference TPU.

Details from Bloomberg Feature

The Bloomberg feature highlights the following:

  • Ironwood is now generally available to Google Cloud customers, with plans to produce millions of units this year.
  • Meta has a rental arrangement for TPUs.
  • Broadcom commands over 70% of the custom AI accelerator market and projects $100 billion in AI chip revenue by 2027.
  • MediaTek’s involvement began with I/O modules on Ironwood, offering 20-30% cost savings.
  • Google aims to use each partner’s knowledge of the others as negotiating leverage.