Overview
- Samsung says it is the first to commercialize sixth‑generation HBM4, moving its schedule up by about a week and sending initial batches to major customers, with reports identifying Nvidia as a recipient.
- HBM4 delivers 11.7 Gbps per pin with headroom to 13 Gbps; on the generation's widened 2048‑bit interface, the top speed works out to roughly 3.3 TB/s per stack. Capacities span 24–36 GB in 12‑high stacks, with a roadmap to 48 GB in 16‑high versions.
- The chips pair 1c (10nm‑class) DRAM with a 4nm logic base die, which Samsung says enabled stable yields at the start of mass production without redesigns.
- Samsung highlights lower power use and improved thermals versus HBM3E, citing a 40% efficiency gain plus better thermal resistance and heat dissipation for data‑center deployments.
- Competition is intensifying: Micron has disclosed high‑volume HBM4 production and shipments, and SK hynix is vying for orders. Samsung, for its part, targets HBM4E sampling in the second half of 2026 and custom HBM samples in 2027.
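The bandwidth figures above can be sanity‑checked with simple arithmetic. A minimal sketch, assuming the JEDEC HBM4 convention of a 2048‑bit per‑stack interface (an assumption drawn from the standard, not stated in this article):

```python
# Sanity-check the reported HBM4 per-stack bandwidth figures.
# Assumption: an HBM4 stack exposes a 2048-bit interface
# (double the 1024 bits of prior HBM generations).
BUS_BITS = 2048

def stack_bandwidth_gbs(pin_gbps: float, bus_bits: int = BUS_BITS) -> float:
    """Per-stack bandwidth in GB/s for a given per-pin data rate."""
    return pin_gbps * bus_bits / 8  # divide by 8 to convert bits to bytes

base = stack_bandwidth_gbs(11.7)  # shipping per-pin speed
peak = stack_bandwidth_gbs(13.0)  # headroom per-pin speed

print(f"11.7 Gbps/pin -> {base:.0f} GB/s (~{base / 1000:.2f} TB/s)")
print(f"13.0 Gbps/pin -> {peak:.0f} GB/s (~{peak / 1000:.2f} TB/s)")
```

Under this assumption, the 13 Gbps headroom speed lands at 3328 GB/s, matching the article's "roughly 3.3 TB/s per stack"; the shipping 11.7 Gbps rate corresponds to just under 3 TB/s.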