Micron Scales HBM4 Production for AI as Demand Outlook Strengthens
Investor focus now turns to forward guidance and whether tight supply will keep prices and margins elevated.
Overview
- Micron says its HBM4 36GB 12‑high memory is now shipping in volume, with its SOCAMM2 memory modules and Gen6 solid‑state drives also in high‑volume production.
- The company cites close engineering collaboration with NVIDIA on next‑generation AI systems, positioning HBM4 for GPUs built on NVIDIA's Vera Rubin platform.
- High‑bandwidth memory (HBM) is stacked DRAM placed next to the processor to feed it data at very high rates; it is crucial for training and running large AI models and has been in short supply.
- Yahoo Finance reported that Micron posted Q2 2026 revenue of $23.86 billion with a non‑GAAP gross margin of 74.9%, and that the company's 2026 HBM4 supply is already sold out.
- Aletheia Capital projects a sharp capex rise at major cloud providers, with a shipment inflection starting in Q2 2026 and faster builds in the second half.
- Insider Monkey notes MU shares are up about 565% year over year and 45% year to date.