Overview
- MTIA 300 is already in production, powering Meta’s ranking and recommendation systems across Facebook and Instagram.
- MTIA 400 has completed testing and is on track for data center deployment, with rack designs that group 72 devices and incorporate liquid cooling and scale‑up networking.
- MTIA 450 and MTIA 500 are slated for mass deployment in 2027 with substantially more high‑bandwidth memory to handle generative AI inference; they are not intended for training large language models.
- Meta is targeting a roughly six‑month release cadence using modular chiplets and a shared chassis, rack, and network infrastructure to speed upgrades and improve cost efficiency at scale.
- Even with MTIA, Meta is locking in external capacity through multiyear Nvidia and AMD purchase agreements and a Google TPU rental deal, alongside projected 2026 capital spending of $115–$135 billion.