
Qualcomm Launches AI200 and AI250 Inference Chips, Secures 200 MW Saudi Deployment

Analysts frame the bet as a diversification play that will test Qualcomm against entrenched GPU rivals.

Overview

  • Qualcomm introduced the AI200 and AI250 accelerators for data‑center inference, with commercial availability targeted for 2026 and 2027, respectively.
  • Riyadh-based Humain agreed to deploy 200 MW of Qualcomm rack solutions starting in 2026, a deal Wells Fargo estimates could be worth about $2 billion.
  • The products emphasize rack‑scale inference using Qualcomm’s Hexagon NPU technology, with the AI250 claiming up to 10x memory‑bandwidth gains versus current‑generation accelerators.
  • Qualcomm shares jumped roughly 20% intraday to new highs after the announcements and were modestly higher in early Wednesday trading.
  • Bank of America reiterated a Buy rating with a $200 price target, while Wells Fargo kept an Underweight rating at $140, citing limited near‑term revenue visibility and intense competition; one estimate pegs the non‑GPU accelerator market at about $114 billion by 2030.