Overview
- Cortical Labs released a demo showing about 200,000 living human neurons on a CL1 multi‑electrode array learning to play Doom.
- The system converts on‑screen events into spatial patterns of electrical stimulation, with neural spikes decoded as actions such as moving or shooting.
- Researchers say the cultures exhibit adaptive, goal‑directed behavior, though performance remains at a novice rather than an expert level.
- Independent collaborator Sean Cole built a working Doom interface using the Cortical Labs API and reports that training the setup took less than a week on the cloud platform.
- The demonstration builds on an earlier Pong experiment and helped drive the creation of the Cortical Cloud for more complex training tasks.
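The encode/stimulate/decode loop described above can be sketched in miniature. This is an illustrative toy, not the Cortical Labs API: the electrode grid size, the event encoding, the quadrant-vote decoder, and the `MockCulture` stand-in are all assumptions made for the sketch.

```python
import random

GRID = 8  # hypothetical 8x8 electrode grid
ACTIONS = ["turn_left", "turn_right", "move_forward", "shoot"]

def encode_event(enemy_bearing):
    """Map an on-screen event (enemy bearing in degrees, -90..90) to a
    spatial stimulation pattern: one electrode column per bearing bin."""
    col = min(GRID - 1, int((enemy_bearing + 90) / 180 * GRID))
    return [(row, col) for row in range(GRID)]  # stimulate the whole column

def decode_spikes(spike_counts):
    """Decode recorded spikes into a game action: each quadrant of the
    array 'votes' for one action, and the most active quadrant wins."""
    totals = [0] * len(ACTIONS)
    for (row, col), count in spike_counts.items():
        quadrant = (row >= GRID // 2) * 2 + (col >= GRID // 2)
        totals[quadrant] += count
    return ACTIONS[totals.index(max(totals))]

class MockCulture:
    """Stand-in for a living culture: spiking is elevated at stimulated
    sites and low elsewhere. A real culture's response is far richer."""
    def __init__(self, seed=0):
        self.rng = random.Random(seed)

    def stimulate_and_record(self, pattern):
        stim = set(pattern)
        return {(r, c): self.rng.randint(5, 15) if (r, c) in stim
                else self.rng.randint(0, 3)
                for r in range(GRID) for c in range(GRID)}

culture = MockCulture()
pattern = encode_event(enemy_bearing=30.0)   # enemy slightly to the right
spikes = culture.stimulate_and_record(pattern)
action = decode_spikes(spikes)
print(action)
```

In the real system, closing this loop (stimulation shaped by game state, actions read back from activity) is what lets the culture adapt toward goal-directed play; the toy decoder here stands in for that feedback path only structurally.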