For decades, the idea that the brain functions like a digital computer has shaped cognitive science, artificial intelligence, and even our understanding of consciousness itself. But a new paper by Andrew F. Knight challenges this comparison in a way that feels less like a philosophical debate and more like a mathematical takedown.
His argument? If consciousness operates through distinct, sequential experiences—each requiring vast amounts of information—the brain simply doesn’t have the storage capacity to function as a digital system. If that’s true, the implications stretch far beyond AI models and cognitive science: it might mean that our understanding of the mind itself is fundamentally flawed.
The brain’s memory problem
Knight’s argument hinges on an information-theoretic analysis of conscious experience. He starts with a simple but provocative premise: if two conscious states are subjectively different, they must also be physically different in the brain. This means that for every distinguishable moment of awareness, there must be a corresponding brain state—encoded, stored, and recalled in some way.
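To see the counting logic behind this premise, here’s a minimal sketch (the numbers are mine, purely illustrative, not Knight’s): a digital store with C bits can occupy at most 2^C distinct configurations, so telling apart N subjectively different states requires at least log2(N) bits of physical state.

```python
import math

# The counting premise in miniature: a store with C bits has at most 2**C
# distinct configurations, so distinguishing N subjectively different
# states demands at least ceil(log2(N)) bits of physical state.
def min_bits(n_distinguishable_states: int) -> int:
    return math.ceil(math.log2(n_distinguishable_states))

print(min_bits(1_000_000))  # a million distinct moments -> at least 20 bits
```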
But here’s the problem: when you crunch the numbers, the required memory vastly exceeds anything the brain could hold. Knight calculates the minimum bit-length necessary to encode a single distinguishable sensory experience (a “stimulus frame”) across all sensory modalities: sight, sound, touch, taste, and smell. He conservatively estimates that one second of conscious experience requires about 200,000 bits of data. Multiply that by the number of moments you experience in a lifetime, and the brain’s storage demand explodes into the quadrillions of bits, many times greater than even the most optimistic estimates of its neural capacity.
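For a rough sense of scale, here’s a back-of-envelope version of that arithmetic. The 200,000 bits-per-second figure is Knight’s estimate as reported above; the 80-year lifespan and 16 waking hours per day are my own illustrative assumptions, not parameters from the paper.

```python
# Back-of-envelope lifetime storage demand for frame-by-frame experience.
# BITS_PER_SECOND is Knight's conservative estimate (per the article);
# the lifespan and waking-hours figures are illustrative assumptions.
BITS_PER_SECOND = 200_000
WAKING_HOURS_PER_DAY = 16   # assumed
LIFESPAN_YEARS = 80         # assumed

waking_seconds = LIFESPAN_YEARS * 365.25 * WAKING_HOURS_PER_DAY * 3600
lifetime_bits = BITS_PER_SECOND * waking_seconds

print(f"{lifetime_bits:.1e} bits")               # ~3.4e+14 bits
print(f"~{lifetime_bits / 8e12:.0f} terabytes")  # ~42 TB for raw frames alone
```

This crude frame-by-frame count is only a floor; the history-dependence discussed in the next section is what drives the total far higher.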
The takeaway? If the brain were a classical digital system, it couldn’t store even a single lifetime of experience. Your memory would hit its capacity ceiling long before your stream of awareness ran out.
The fatal flaw in computational theories of mind
Beyond the sheer volume of information, Knight introduces another dimension: history-dependence. Consciousness isn’t just a sequence of independent frames—it builds upon prior experiences in a nested, recursive manner. What you experience now is influenced not just by your immediate past but by an entire history of perceptions and contextual associations.
Think of it like watching a movie. If you suddenly jump to the last scene without watching the rest, you’ll perceive it very differently than if you had followed the narrative from the start. In the same way, conscious states are deeply embedded in prior states—creating an exponential increase in the amount of information required to represent them. The more you experience, the more layered and context-dependent your future experiences become.
For a digital system to replicate this, it would need to store not just the raw data of past experiences but also their relational structure—the way each moment interacts with prior ones. This recursive embedding sends storage demands skyrocketing, reinforcing Knight’s central claim: the brain simply cannot function as a classical computational device.
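To see how fast this blows up, consider a deliberately crude toy model (mine, not Knight’s formalism): suppose each conscious state must re-encode its entire past, so moment n costs roughly n frames rather than one. Total storage then grows quadratically with the number of moments, and even this tame version swamps a generous capacity estimate.

```python
# Toy model of history-dependent storage (illustrative; not Knight's formalism).
# If state n re-encodes all n prior frames, total cost over n moments is
# roughly b * n * (n + 1) / 2 bits instead of b * n.
BITS_PER_FRAME = 200_000  # per-second figure reported in the article
WAKING_SECONDS = 1.7e9    # assumed: ~80 years at 16 waking hours/day
BRAIN_CAPACITY = 2e16     # assumed: an optimistic ~2.5-petabyte estimate

flat_total = BITS_PER_FRAME * WAKING_SECONDS
nested_total = BITS_PER_FRAME * WAKING_SECONDS * (WAKING_SECONDS + 1) / 2

print(f"independent frames: {flat_total:.1e} bits")    # ~3.4e+14
print(f"nested history:     {nested_total:.1e} bits")  # ~2.9e+23
print(f"capacity shortfall: {nested_total / BRAIN_CAPACITY:.1e}x over")
```

Even this quadratic toy, far tamer than the exponential embedding Knight describes, overshoots a generous capacity estimate by millions of times.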
If Knight is correct, the idea that the mind can be reduced to an algorithm—or that AI can achieve human-like general intelligence through classical computing—starts to look shaky. This isn’t a critique of AI’s ability to perform intelligent tasks. Machine learning models already outperform humans in pattern recognition, prediction, and optimization. But if consciousness requires an information structure that classical computation can’t support, then no amount of data or processing power will make AI sentient.
This raises deeper questions:
- Are we even asking the right questions about AI? If consciousness is not computational, focusing on bigger models and more data might be a dead end.
- Does the brain rely on a different kind of computation? Some researchers speculate that quantum mechanics, analog processing, or other yet-unknown principles might be at play.
- Is neuroscience missing something fundamental? If memory and perception function in ways that exceed classical computation, then understanding the brain might require an entirely new framework.
Knight’s argument is compelling, but it’s also a challenge to long-held assumptions in cognitive science. The brain-as-computer metaphor has driven research for decades, and shifting away from it means confronting an unsettling reality: we still don’t know what kind of machine the brain actually is.
If the brain isn’t digital, then what is it? The search for an answer may redefine how we think about consciousness, intelligence, and what it means to be human.
Featured image credit: Kerem Gülen/Imagen 3