Particle.news

Alphabet Steps Up AI Buildout With A5X Rollout and Reported Marvell Chip Talks

The strategy seeks lower compute costs through custom silicon, with 2026 capex guided to $175–$185 billion.

Overview

  • Google introduced its A5X AI infrastructure, which uses Nvidia rack-scale systems to deliver 10 times the computing power of the prior generation and can link up to 960,000 GPUs across sites.
  • Reports from The Information, cited by Reuters, say Google is in talks with Marvell to make two AI chips: a memory processing unit to support its tensor processing units (TPUs) and a new TPU design.
  • Earlier this month Google Cloud unveiled TPU v8, with separate 8t chips for training and 8i chips for inference, a step analysts say can cut the cost of running large AI models compared with relying only on third-party GPUs.
  • Alphabet guided full-year 2026 capital spending to $175 billion to $185 billion to expand AI compute for DeepMind research, Google services, and rising Google Cloud demand.
  • Berkshire Hathaway disclosed a 2025 purchase of 17.85 million Alphabet shares, Pershing Square held 6.1 million shares as of Q4 2025, and Jim Cramer said he sees the stock reaching $400.