A slew of recent articles have likened the human brain to a quantum computer, including Tristan Greene’s piece at The Next Web titled “Your brain might be a quantum computer that hallucinates math.” Quantum mechanics has changed everything, continues to change everything and will likely keep changing everything. As the “math that explains matter,” it has played a role in everything from “cell phones to supercomputers, DVDs to PDFs.”

Reality is neither absolute nor predictable in the microworld: uncertainty is its fundamental condition. In fact, science is far better at predicting odds than guaranteeing outcomes. In 1921 chemist William D. Harkins wrote: “Since it concerns itself with the relations between matter and radiation…[quantum theory]… is of fundamental significance in connection with almost all processes which we know.”

In 1927 Werner Heisenberg revealed “that deterministic cause-and-effect physics failed when applied to atoms.” He determined it was impossible “to measure both the location and velocity of a subatomic particle at the same time. If you measured one precisely, some uncertainty remained for the other.” This insight heralded the revolution then remaking physics.
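For reference, the uncertainty principle has a compact textbook form (a standard statement of the relation, not a quotation from the article):

```latex
% Heisenberg's uncertainty relation: the product of the uncertainty in
% position (\Delta x) and the uncertainty in momentum (\Delta p) can never
% drop below a fixed floor set by Planck's constant.
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}
```

Pin the position down more tightly (shrink Δx) and the spread in momentum Δp must grow, which is the precise sense in which “some uncertainty remained for the other.”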

Quantum theory frames our physical reality in ways that sound more like a philosophical or psychological thought experiment than like physics. The observer is inseparable from the observed. Everything appears to behave as if interconnected, even if we don’t know why. You can’t predetermine how a photon will behave; you can only observe it from one perspective and watch as it changes, almost in response to your observation. Things can be off and on at the same time. That ability to be off and on, 0 and 1, simultaneously is what lets quantum computers attack certain problems exponentially faster than classical machines. If these were human traits, they would indicate a refusal to recognize limits, a willingness to live with ambiguity and an openness to the dynamics of perpetual change. Recent research into the quantum attributes of our brains suggests that quantum traits are human traits; at the very least, our brains can be accurately described as (slightly slimy and meaty) quantum computers.
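To make the “0 and 1 at the same time” idea concrete, here is a minimal sketch (plain NumPy, invented variable names, and not a claim about how brains compute) of a single qubit in superposition and of why the description grows exponentially with the number of qubits:

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is a 2-component vector of
# amplitudes; the squared magnitudes give the probabilities of measuring 0 or 1.
plus = np.array([1.0, 1.0]) / np.sqrt(2)   # equal superposition of |0> and |1>
probabilities = np.abs(plus) ** 2          # -> [0.5, 0.5]
print("P(0), P(1):", probabilities)

# The "exponential" part: describing n qubits takes 2**n amplitudes,
# so the state space doubles with every qubit added.
for n in (1, 2, 10, 30):
    print(f"{n} qubits -> {2**n:,} amplitudes")

# Measurement collapses the superposition to one definite outcome,
# sampled according to those probabilities.
print("measured:", np.random.choice([0, 1], p=probabilities))
```

The exponential blow-up in the number of amplitudes is exactly what a classical simulation has to pay for, and what quantum hardware, for certain problems, gets “for free.”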

Theories about the quantum nature of the brain have existed for a few decades. For some theoreticians, consciousness is itself a quantum process. For others, quantum concepts are more aptly understood as rhetorical or metaphorical ways of describing consciousness. Still others hold that matter and consciousness are “dual” aspects of the same underlying reality.

Most recently, research teams at the Universities of Bonn and Tübingen have tied “simple processes” to our identities as quantum computers. The researchers found “abstract codes” for processing arithmetic, specifically addition and subtraction, in the brain. They discovered that the neurons firing during addition were different from those firing during subtraction, and that different parts of the brain were deployed for diagnosing problems and for solving them. One of the researchers explained: “We found that different neurons fired during additions than during subtractions . . . it is as if the plus key on the calculator were constantly changing its location. It was the same with subtraction.” The study shows that different neurons fire for different cognitive functions, and that the brain is capable of learning the difference between those functions.
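The logic of that finding can be illustrated with a toy simulation (entirely invented numbers and a hypothetical decoder, not the researchers’ data or methods): if one population of simulated neurons fires more during addition and another during subtraction, the operation being performed can be read back from the firing pattern alone.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: 20 simulated neurons, half tuned to addition, half to subtraction.
# Each trial is a vector of noisy firing rates; the tuned half fires more.
def trial(operation):
    rates = rng.normal(5, 1, 20)               # baseline firing for all neurons
    if operation == "add":
        rates[:10] += rng.normal(4, 1, 10)     # "addition" neurons ramp up
    else:
        rates[10:] += rng.normal(4, 1, 10)     # "subtraction" neurons ramp up
    return rates

# A crude decoder: whichever population fires harder names the operation.
def decode(rates):
    return "add" if rates[:10].mean() > rates[10:].mean() else "subtract"

operations = ["add" if i % 2 == 0 else "subtract" for i in range(100)]
correct = sum(decode(trial(op)) == op for op in operations)
print(f"decoded {correct}/100 trials correctly")
```

The point is not the arithmetic itself but the separability: distinct, learnable activity patterns stand in for distinct mental operations.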

In reviewing older research, we find further linkages between human brain activity and quantum mechanics. In learning tasks, recurrent neural networks “not only learn the prescribed input-output relation but also the sequence in which inputs have been presented.” They also learn to interpret what to do “if the sequence of presentation is changed.” Meaning in natural languages transcends the mere matching and calculation of symbols; the entanglement of language is quantum entanglement.
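A small sketch (plain NumPy, random untrained weights, purely illustrative) shows why a recurrent network is sensitive to order: the same items presented in a different sequence leave the hidden state in a different place.

```python
import numpy as np

rng = np.random.default_rng(1)

# A minimal recurrent step: the hidden state mixes the current input
# with a compressed memory of everything seen so far.
W_in = rng.normal(size=(8, 3))    # input -> hidden weights (untrained)
W_rec = rng.normal(size=(8, 8))   # hidden -> hidden (recurrent) weights

def run(sequence):
    h = np.zeros(8)
    for x in sequence:
        h = np.tanh(W_in @ x + W_rec @ h)
    return h

a, b, c = np.eye(3)               # three one-hot "symbols"
h_abc = run([a, b, c])
h_cba = run([c, b, a])

# Same inputs, different order -> different final states, so the network
# carries information about the sequence, not just its contents.
print("distance between orderings:", np.linalg.norm(h_abc - h_cba))
```

The nonlinearity (tanh) and the recurrent weight matrix are what make the final state depend on history rather than on a simple sum of the inputs.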

In fact, the case is so compelling and consistent that it might be worth asking whether the brain is a quantum computer, or whether quantum mechanics, our systematic perception and interpretation of quantum reality, is modeled after our own brains. After all, we have been able to perform sophisticated and nuanced calculations and to synthesize seemingly inconsistent data for as long as our anatomically modern brains have existed. The first solid evidence of counting dates to about 20,000 years ago: a baboon’s fibula bone with a series of symmetrical lines cut into it, which archaeologists guess was used to tally something. Much later, around 4000 BC, when people were living in cities, formal number and counting systems, that is, mathematics, were developed to keep track of all the people, animals and things (foodstuffs, for example). The invention of zero added another layer of complexity.

Our brains have long been able to tally and to keep track of things through a demarcation that we can safely call counting, whether or not we had a language for it as such. Our brains could already calculate, and could even grasp the idea of nothing without the concept of zero; functionally, they could perform calculations as needed. The question of when we developed numerical systems, and therefore math, is thus a material, historical, collective question.

Math isn’t just counting; and even within counting, the ability to categorize and create sets implies an interpretive power that cannot be reduced to quantification. Somewhere along the way, quantifying becomes qualifying, quantity becomes quality, and that is where the quantum function comes in: the reality that the whole is more than the sum of its parts, that cause and effect might be blurred, and that things can (must) be interpreted and predicted. After all, the presence of this interpretive capacity is the criterion for designating a computer program “artificial intelligence.”

Finally, consider the set of phenomena we call communication. We interpret and exchange symbols in a web of signification that is as predictive, uncertain and inseparable from perception as photons are. We don’t simply store information and then process, modify and retrieve it. (In fact, we often need help with such activities, especially when organizing large quantities of data; that is why organizations, campaigns and businesses might turn to services like Accurate Append’s data appending to fill gaps in their data.) Rather, we, as humans, figure things out through often-clumsy and always-imperfect communication. The philosopher Jacques Derrida points out that all language is “problematic” in a deeply fundamental way, and that this is a “stroke of luck” for those who appreciate communication, because absent that “problematicity” we would have no reason to speak, to discuss. “How else,” Derrida asks, “would what we call ‘misunderstanding’ be possible?”

In many ways, likening the brain to a quantum computer could be getting it “backwards”; but as Feynman diagrams remind us (antiparticles in those diagrams can be read as particles moving backward in time), at the quantum level there is no privileged forward or backward. Maybe the model of the quantum brain is both cause and effect simultaneously.