Human Brain Cells on a Chip Master Doom in One Week, Advancing Biological Computing
Key Takeaways
- Cortical Labs' human brain cells on a chip learned to play Doom in one week using a new Python programming interface, dramatically reducing the time and expertise required compared with the company's 2021 Pong demonstration
- The biological chip used approximately 200,000 neurons and learned significantly faster than traditional silicon-based AI systems, though it performed below human level
- The breakthrough brings biological computers closer to practical applications such as controlling robotic arms, because Doom's complexity better simulates real-world control challenges
Summary
Australian biotech company Cortical Labs has achieved a significant milestone in biological computing by teaching human brain cells grown on a chip to play the classic video game Doom in just one week. The breakthrough follows the company's 2021 demonstration of neuron-powered chips playing Pong, but represents a major leap in accessibility and complexity. Using approximately 200,000 living brain cells integrated with microelectrode arrays, an independent developer programmed the biological system in Python, a process that took only days, compared with the years of painstaking work behind the earlier Pong demonstration.
The neuronal chip outperformed random play but fell short of human-level performance. However, researchers emphasize the system's remarkable learning speed compared to traditional silicon-based machine learning systems. Brett Kagan of Cortical Labs highlighted that the development of a Python interface has democratized access to biological computing, allowing developers without extensive biology expertise to program these living systems. The chips function by sending and receiving electrical signals through microelectrode arrays, enabling the neurons to interact with digital environments.
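The article does not describe Cortical Labs' actual software, but the closed loop it outlines (encode the game state as electrical stimulation, read spike activity back from the microelectrode array, decode that activity into a game action) can be sketched in plain Python. Everything below, including the class name, electrode count, encoding scheme, and the simulated spike counts, is a hypothetical illustration for intuition, not the company's real interface.

```python
import random


class ClosedLoopInterface:
    """Toy model of a closed-loop neuron-chip interface.

    All names and parameters here are hypothetical; spike activity is
    simulated with random counts rather than recorded from electrodes.
    """

    def __init__(self, n_electrodes=8, seed=0):
        self.n_electrodes = n_electrodes
        self.rng = random.Random(seed)

    def stimulate(self, game_state):
        # Encode the game state as a per-electrode stimulation pattern
        # (1 = pulse, 0 = silent), derived from a hash of the state.
        h = hash(game_state)
        return [(h >> i) & 1 for i in range(self.n_electrodes)]

    def read_spikes(self):
        # A real system would return spike counts recorded from the
        # microelectrode array; here we draw simulated random counts.
        return [self.rng.randint(0, 5) for _ in range(self.n_electrodes)]

    def decode_action(self, spikes, actions=("turn_left", "turn_right", "fire")):
        # Map the most active electrode to one of the game actions.
        busiest = spikes.index(max(spikes))
        return actions[busiest % len(actions)]


iface = ClosedLoopInterface()
pattern = iface.stimulate("enemy_ahead")   # stimulation pattern sent to the chip
action = iface.decode_action(iface.read_spikes())  # action decoded from activity
print(pattern, action)
```

In a real training loop, the game's feedback (reward or penalty) would shape how the neurons respond to repeated stimulation; this sketch only shows the encode-read-decode cycle, not learning.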
Experts in the field view this advancement as a crucial step toward practical applications of biological computers. The ability to handle Doom's complexity, including real-time decision-making, uncertainty, and multiple simultaneous variables, demonstrates capabilities that could translate to real-world tasks such as controlling robotic arms. Researchers at the University of Reading are already working on similar applications using hydrogel-based biological computers. Fundamental questions remain about how these neurons process visual information and pursue objectives without traditional sensory organs. Even so, the rapid progress suggests biological computing may soon move from laboratory curiosity to functional technology, potentially offering unique advantages in processing information that silicon-based systems cannot replicate.
Editorial Opinion
This development marks a pivotal moment in the convergence of biology and computing, transforming biological neural networks from research curiosities into potentially programmable substrates. The dramatic reduction in development time—from years to days—through software abstraction suggests we're approaching a tipping point where biological computing could become a mainstream tool rather than an exotic laboratory technique. However, the fundamental mysteries of how neurons 'see' and 'understand' digital environments without traditional sensory mechanisms highlight how much we still have to learn about interfacing biological and digital systems.