Overview
- Dataconomy reports that soaring AI workloads are diverting RAM supply into data centers, a trend some industry participants dub “RAMaggedon,” and warns that gaming innovation is at risk as a result.
- Samsung and SK Hynix are positioned to dominate next‑generation HBM4 supply for Nvidia’s Vera Rubin platform, intensifying competition for limited high‑bandwidth memory.
- Developers face potential limits on world‑building scope and technical progress, and some studios are encountering fan pushback against generative‑AI features in games.
- Memory vendors have shifted output from consumer DRAM to higher‑margin HBM for AI, while distributors ration hardware and cloud providers pass costs on through pricing or usage controls.
- Analysts cited by TechTarget say relief may take up to two years, prompting enterprises to lean on cloud, stockpile components, or purchase used hardware to keep AI plans on track.