Overview
- Google’s TurboQuant claims to cut model memory requirements roughly sixfold and speed up training by as much as eight times; the news knocked Micron and Sandisk shares lower on fears of weaker demand for high-bandwidth memory (HBM).
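The memory arithmetic behind claims like this can be sketched with a back-of-the-envelope calculation. Note the figures below are illustrative assumptions, not a description of TurboQuant's actual technique, which is not detailed here; a ~6x reduction would imply either fewer than 3 bits per weight on average or savings beyond the weights themselves (activations, KV cache, optimizer state).

```python
# Back-of-the-envelope memory arithmetic for weight quantization.
# All numbers are hypothetical illustrations, not TurboQuant's method.

def model_weight_bytes(n_params: float, bits_per_weight: float) -> float:
    """Memory needed to store the model weights alone, in bytes."""
    return n_params * bits_per_weight / 8

n_params = 70e9  # a hypothetical 70B-parameter model

fp16 = model_weight_bytes(n_params, 16)  # half-precision baseline
int4 = model_weight_bytes(n_params, 4)   # aggressive 4-bit quantization

print(f"fp16 weights: {fp16 / 1e9:.0f} GB")      # 140 GB
print(f"int4 weights: {int4 / 1e9:.0f} GB")      # 35 GB
print(f"reduction:    {fp16 / int4:.0f}x")       # 4x from weights alone
```

Quantizing weights from 16 bits to 4 yields 4x on its own, which is why a sixfold overall claim would have to draw on more than weight storage.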
- Some analysts cite the Jevons paradox to argue that lower effective memory costs could spur more AI usage overall; Micron itself projects the HBM market growing from about $35 billion in 2025 to roughly $100 billion by 2028.
- Investors are rotating out of high-growth tech as software stocks are repriced for AI-driven disruption, with a major software ETF down about 25% this year and the Magnificent Seven off roughly 11% on average.
- Nvidia still holds roughly 90% share in AI GPUs and reports data center chips selling out, yet its leaders warn results hinge on cloud spending levels and the risk that big customers shift to custom chips.
- Retirement portfolios heavily concentrated in large-cap tech now face sequence-of-returns risk as withdrawals coincide with the drawdown, and advisors are split between trimming tech exposure for safety and buying the dip for long-term gains.
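Sequence-of-returns risk is easy to see with a small simulation: the same set of annual returns, applied in a different order, leaves a retiree with very different balances once fixed withdrawals begin. The figures below are hypothetical, not a market forecast.

```python
# Minimal sketch of sequence-of-returns risk with hypothetical numbers.
# Without withdrawals, return order is irrelevant (multiplication commutes);
# with withdrawals, early losses lock in damage the later gains can't repair.

def ending_balance(start: float, returns: list[float], withdrawal: float) -> float:
    """Withdraw at the start of each year, then apply that year's return."""
    balance = start
    for r in returns:
        balance = (balance - withdrawal) * (1 + r)
    return balance

gains_first = [0.20, 0.10, 0.05, -0.10, -0.25]  # same returns either way,
losses_first = list(reversed(gains_first))       # only the order differs

start, withdrawal = 1_000_000, 50_000
print(f"gains first:  ${ending_balance(start, gains_first, withdrawal):,.0f}")
print(f"losses first: ${ending_balance(start, losses_first, withdrawal):,.0f}")
```

With these assumed numbers the losses-first retiree ends up with roughly $631,000 versus roughly $743,000 for the gains-first sequence, despite identical average returns, which is why heavy tech concentration is riskiest for those already drawing down.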