Overview
- In a GTC 2026 session, Nvidia showed a Tuscan Villa scene whose VRAM use dropped from about 6.5GB to 970MB, roughly an 85% reduction, with image quality holding steady.
- The method stores each texture as compact learned data, and a small GPU-side neural network reconstructs pixels on demand; decoding is deterministic, so the same inputs always produce the same pixels (a minimal sketch follows this list).
- Decoding runs on matrix engines such as Tensor Cores rather than the main shader cores, which leaves the conventional graphics units free for rendering work (see the second sketch below).
- Nvidia also presented a related Neural Materials demo that shrank material data footprints and sped up shading by up to 7.7x at 1080p.
- The NTC SDK is available in beta on GitHub, and rival GPU vendors are aligning behind common DirectX standards, though no shipped games support these systems yet.
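
To make the on-demand decoding idea concrete, here is a minimal sketch assuming a hypothetical NTC-like layout: a low-resolution grid of learned latent features plus a tiny per-texture MLP. All names, sizes, and weights here are illustrative placeholders, not Nvidia's actual format.

```python
# Sketch of on-demand neural texture decoding (hypothetical layout).
import numpy as np

rng = np.random.default_rng(0)

# Learned assets that would ship instead of raw texels:
LATENT_RES = 256   # latent grid is far smaller than the source texture
LATENT_DIM = 8     # features stored per latent grid cell
latent_grid = rng.standard_normal((LATENT_RES, LATENT_RES, LATENT_DIM)).astype(np.float32)
w0 = rng.standard_normal((LATENT_DIM, 16)).astype(np.float32)  # tiny MLP weights
b0 = np.zeros(16, dtype=np.float32)
w1 = rng.standard_normal((16, 4)).astype(np.float32)           # 4 outputs: RGBA
b1 = np.zeros(4, dtype=np.float32)

def sample_latents(u: float, v: float) -> np.ndarray:
    """Bilinearly interpolate the latent grid at texture coordinate (u, v)."""
    x, y = u * (LATENT_RES - 1), v * (LATENT_RES - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, LATENT_RES - 1), min(y0 + 1, LATENT_RES - 1)
    fx, fy = x - x0, y - y0
    top = latent_grid[y0, x0] * (1 - fx) + latent_grid[y0, x1] * fx
    bot = latent_grid[y1, x0] * (1 - fx) + latent_grid[y1, x1] * fx
    return top * (1 - fy) + bot * fy

def decode_texel(u: float, v: float) -> np.ndarray:
    """Reconstruct one RGBA texel on demand; a pure function, so deterministic."""
    h = np.maximum(sample_latents(u, v) @ w0 + b0, 0.0)  # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ w1 + b1)))          # sigmoid -> [0, 1] RGBA

print(decode_texel(0.25, 0.75))  # same (u, v) always yields the same texel
```

Only the latent grid and the small weight matrices are stored, which is where the memory savings come from; the full-resolution texture never exists in VRAM.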
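The second sketch shows why matrix engines suit this workload: decoding many texels at once turns the tiny per-texel MLP into a few dense matrix multiplies, the shape of work that Tensor-Core-style hardware accelerates. The shapes below are assumptions for illustration, not measured NTC dimensions.

```python
# Batched neural texture decode as dense matrix multiplies (illustrative shapes).
import numpy as np

rng = np.random.default_rng(1)
N_PIXELS, LATENT_DIM, HIDDEN = 1920 * 1080, 8, 16

# Latent features already sampled per on-screen pixel:
latents = rng.standard_normal((N_PIXELS, LATENT_DIM)).astype(np.float32)
w0 = rng.standard_normal((LATENT_DIM, HIDDEN)).astype(np.float32)
w1 = rng.standard_normal((HIDDEN, 4)).astype(np.float32)

# One large GEMM per layer instead of millions of scalar shader ops; on real
# hardware these matmuls map onto the matrix units, leaving shader ALUs free.
hidden = np.maximum(latents @ w0, 0.0)
rgba = hidden @ w1
print(rgba.shape)  # (2073600, 4): a full 1080p frame of decoded texels
```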