Particle.news

Micron Ships First 256GB SOCAMM2 LPDRAM Samples for AI Data Centers

JEDEC backing, together with an NVIDIA endorsement, signals momentum toward a low‑power, high‑capacity memory shift.

Overview

  • Micron began shipping customer samples of a 256GB SOCAMM2 module that targets AI and HPC servers with higher-capacity, low-power LPDRAM.
  • The module is enabled by a monolithic 32Gb LPDDR5X die and is described as allowing up to 2TB of LPDRAM per 8‑channel CPU.
  • Micron claims one‑third the power draw and one‑third the footprint versus equivalent DDR5 RDIMMs, with a horizontal, serviceable design compatible with liquid cooling.
  • Company tests report more than 2.3x faster time‑to‑first‑token for long‑context LLM inference and over 3x better performance‑per‑watt in CPU HPC workloads, pending independent validation.
  • NVIDIA publicly endorsed the approach, and industry analysts report that SOCAMM2 is rapidly moving toward a JEDEC‑backed standard with offerings from major memory vendors; a showcase is planned for GTC 2026.
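The 2TB-per-CPU figure follows from the stated module size and channel count. A minimal back-of-envelope check, assuming one 256GB SOCAMM2 module per memory channel (the article does not state the modules-per-channel configuration, so that is an illustrative assumption):

```python
# Capacity check for the "up to 2TB of LPDRAM per 8-channel CPU" claim.
# Assumption (not stated in the article): one module populates each channel.
MODULE_CAPACITY_GB = 256  # per SOCAMM2 module, per the article
CHANNELS = 8              # 8-channel CPU, per the article

total_gb = MODULE_CAPACITY_GB * CHANNELS
total_tb = total_gb / 1024  # binary terabytes

print(f"{total_gb} GB = {total_tb:.0f} TB per CPU")  # → 2048 GB = 2 TB per CPU
```

If more than one module per channel were supported, the per-CPU ceiling would scale accordingly; the article's figure is consistent with a one-module-per-channel layout.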