In a room at the United Kingdom’s University of Plymouth, a Ph.D. student is sitting at a computer, eyes closed as if he’s meditating. On his head is what looks like a black swimming cap, but is actually an electroencephalogram (EEG) reader that’s sensing the electrical activity passing over his scalp. In front of him, on the monitor, there’s an image of a wireframe globe with two points marked “1” and “0.” In the center of the globe, like a clock with a single hand, is an arrow that oscillates between the two points. As the student changes his expression from one of relaxation to one of wide-eyed agitation, the arrow twitches and moves. Every several seconds, he enters a new digit.
It might not look like much (and right now, it’s still very early days for this work), but it’s nonetheless fascinating stuff. As the student changes his brain patterns from calm to energized and back again, he produces alpha and beta waves that are then used to manipulate simulated qubits – the basic units of information in quantum computing, governed by the math of quantum physics – using nothing more than the power of thought.
“If you train yourself to produce these two kinds of waves, then you can send some sort of Morse code to the computer,” professor Eduardo Miranda of the University of Plymouth told Digital Trends. “The problem is that it takes eight seconds to generate one command at the moment because the EEG is very slow. We need a lot of processing to analyze it. And this analysis is not so accurate, so we need to keep checking many times to see if the code really is what the person wants to produce.”
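The Plymouth team hasn’t published the code behind this demo, but the recipe Miranda describes – two distinguishable brain rhythms, a noisy reading, lots of re-checking – can be sketched in a few lines. The toy Python example below is a hypothetical reconstruction rather than the actual software: it estimates alpha and beta band power in a synthetic EEG window, maps their balance onto a single simulated qubit (the “arrow” tilting between 0 and 1), and takes a majority vote over repeated measurements, much as the real system has to re-check its unreliable readings. The sampling rate, frequency bands, and window length are all assumptions.

```python
# Hypothetical sketch of the demo's pipeline, NOT the Plymouth team's code.
# Synthetic EEG window -> alpha/beta band power -> simulated qubit -> majority-vote bit.
import numpy as np
from scipy.signal import welch

FS = 256            # assumed EEG sampling rate (Hz)
WINDOW_SECONDS = 8  # roughly the ~8 seconds per command Miranda mentions


def band_power(signal, fs, lo, hi):
    """Sum the power spectral density between lo and hi Hz (a simple band-power proxy)."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    return psd[(freqs >= lo) & (freqs <= hi)].sum()


def window_to_bit(signal, fs=FS, shots=200, rng=None):
    """Turn one EEG window into a 0 or 1 via a single simulated qubit."""
    rng = rng or np.random.default_rng()
    alpha = band_power(signal, fs, 8.0, 12.0)   # relaxation rhythm
    beta = band_power(signal, fs, 13.0, 30.0)   # alertness/agitation rhythm
    # Map the alpha/beta balance to a rotation angle: all-alpha -> |0>, all-beta -> |1>.
    theta = np.pi * beta / (alpha + beta)
    p_one = np.sin(theta / 2) ** 2              # Born-rule probability of measuring 1
    # Noisy readout: measure repeatedly and take a majority vote, much as the real
    # system must keep re-checking its unreliable EEG-derived commands.
    ones = rng.binomial(shots, p_one)
    return 1 if ones > shots / 2 else 0


if __name__ == "__main__":
    rng = np.random.default_rng(42)
    t = np.arange(0, WINDOW_SECONDS, 1.0 / FS)
    relaxed = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)   # alpha-heavy
    agitated = np.sin(2 * np.pi * 20 * t) + 0.3 * rng.standard_normal(t.size)  # beta-heavy
    print(window_to_bit(relaxed, rng=rng), window_to_bit(agitated, rng=rng))   # -> 0 1
```

At roughly eight seconds per window, even a perfect classifier would only manage a handful of bits per minute – which is exactly the bottleneck Miranda describes.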
Welcome to the shaky, tentative first steps of quantum programming by way of brain-computer interface. According to its creators, it marks the start of what the team calls the Quantum Brain Network (abbreviated to QBraiN). And it has the potential to do a bunch of things that are worth getting excited about.
More than the sum of its parts or a toaster-fridge?
If you’ve seen any list of the most exciting technologies currently shimmering on the tech horizon, you’ve almost certainly come across the terms brain-computer interface (BCI) and quantum computer.
A BCI is fancy terminology for a way of controlling a computer using brain signals. While every device with a manual input is technically controlled by the brain – albeit usually via an intermediary like fingers or voice – a BCI makes it possible to send commands to the outside world without first routing them from the brain through peripheral nerves or muscles.
Quantum computers, meanwhile, represent the Next Big Thing in computing. First proposed in the 1980s, though only now starting to become a technical reality, quantum computing is a fundamentally different approach to computer architecture. For the right kinds of problems, it promises to be not only far more powerful than existing classical computers, but capable of things that would be impossible even with millions of today’s supercomputers chained together. Quantum computers could, if you believe their proponents, be the answer to the inevitable end of Moore’s Law as we know it.
However, while BCIs and quantum computers are undoubtedly promising technologies emerging at the same point in history, the question is why bring them together – which is exactly what a consortium of researchers from the U.K.’s University of Plymouth, Spain’s University of Valencia and University of Seville, Germany’s Kipu Quantum, and China’s Shanghai University is seeking to do.
Technologists love nothing more than mashing together promising concepts or technologies in the belief that, when united, they will represent more than the sum of their parts. Sometimes this works gloriously. As the venture capitalist Andrew Chen describes in his book The Cold Start Problem, Instagram leveraged the emergence of camera-equipped smartphones and the powerful network effects of social media to become one of the fastest-growing apps in history.
Taking two must-have technologies and combining them doesn’t always work, though. Apple CEO Tim Cook once quipped that “you can converge a toaster and a refrigerator, but, you know, those things are probably not going to be pleasing to the user.”
So what makes brain-controlled quantum computing an example of the former, a member of the more-than-the-sum-of-its-parts club, and not symptomatic of the toaster-fridge problem? In a paper published in early 2022, the aforementioned consortium of researchers writes: “We foresee the development of highly connected networks of wetware and hardware devices, processing classical and quantum computing systems, mediated by brain-computer interfaces and A.I. Such networks will involve unconventional computing systems and new modalities of human-machine interaction.”
Use cases galore
The most significant – and, if it works, immediately transformative – application of the Quantum Brain Network is that it will help BCIs to work better. Our brains are incredibly complex. They boast roughly 100 billion neurons, forming giant networks with as many as a quadrillion connections in constant communication with one another via tiny electrical impulses. Today, science is capable of recording the way parts of the brain communicate, from the smallest neuron-to-neuron interaction to larger-scale communication between whole networks of neurons.
But doing this typically involves highly specialized technology, such as functional magnetic resonance imaging (fMRI), that’s only available in top research labs. The BCI experiments that rely on the blunt instrument of EEG tend to be comparatively simplistic in what they can do: say, deciding whether a person is thinking of the color blue or red, or making a drone move up and down or left and right. They lack nuance.
That’s now changing, Miranda explained. “We are beginning to have access to good hardware. Increasingly better EEG scanning is coming out.”
Better brain wave-sensing hardware is only one piece of the puzzle, though. As an analogy, imagine having an extraordinarily accurate microphone placed in the middle of a football stadium. The microphone is so powerful that it’s able to pick up every sound made by the thousands of fans in the stadium, regardless of whether they’re cheering loudly or quietly munching on a hotdog. However, as impressive as this would be, without the right audio-filtering software, you would be unable to do more than listen to an aggregated, shapeless mass of crowd noise. On its own, such a microphone wouldn’t help you determine, for example, what’s being said by the person in seat 77A.
What you need isn’t just the ability to record this information, but also to decode it and make it useful. And quickly. This is where quantum computing could come in, by processing the torrent of electrical brain signals fast enough to work out intentions and thoughts as they occur.
“BCI needs real-time control,” Miranda continued. “I think quantum computing can provide the speed we need to do this processing… [Right now] we can’t figure out what all this messy information we are getting with the EEG means. If we could, then we could begin to classify the signals and label certain behaviors that we force ourselves to produce.”
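Strip away the quantum hardware for a moment and the “classify the signals and label certain behaviors” step Miranda describes is, at heart, a pattern-recognition problem over noisy features extracted from the EEG. The sketch below shows the classical shape of that problem with a minimal nearest-centroid classifier over made-up band-power features – purely an illustration of the step the researchers hope quantum processors will one day perform over far richer signals, in real time; it is not QBraiN’s algorithm.

```python
# Minimal illustration of the classification step: label a new EEG window by
# comparing its feature vector to per-behavior "calibration" centroids.
# A classical stand-in for the processing the team hopes quantum hardware will
# eventually accelerate; the features and labels here are made up.
import numpy as np

# Hypothetical calibration data: (alpha power, beta power) per labeled window.
CALIBRATION = {
    "relaxed":  np.array([[0.9, 0.2], [0.8, 0.3], [0.85, 0.25]]),
    "agitated": np.array([[0.2, 0.9], [0.3, 0.8], [0.25, 0.85]]),
}
CENTROIDS = {label: windows.mean(axis=0) for label, windows in CALIBRATION.items()}


def classify(features):
    """Return the behavior label whose centroid is closest to this window."""
    return min(CENTROIDS, key=lambda label: np.linalg.norm(features - CENTROIDS[label]))


print(classify(np.array([0.7, 0.35])))  # -> relaxed
print(classify(np.array([0.1, 0.95])))  # -> agitated
```

With two tidy features, this is trivial; the hard part is doing something like it over thousands of messy, overlapping signals quickly enough for real-time control – which is where the researchers believe quantum processing could earn its keep.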
Perhaps straining to produce these behaviors wouldn’t even be necessary. As Azeem Azhar writes in his 2021 book Exponential, the promise of brain-computer interfaces is to be able to “pluck neural activity from our heads even before it forms into thought.” Just like recommender systems – such as those employed by Spotify, Netflix, and Amazon – seek to show us what we want to consume before we’ve even decided for ourselves, so too could BCIs read our barely conscious thought patterns and extrapolate useful information from them.
That could be controlling a smart home or a robot, popping up the right contextual information at the right moment, or providing more fine-grained movement to a neurally controlled prosthesis. In Miranda’s pet use case, one he’s been working on for years, it could help people with locked-in syndrome communicate more rapidly with the outside world.
The quantum metaverse?
Then there’s the possibility of using the brain to interact with a quantum computer itself, rather than just using the quantum machine to turbocharge BCI signal processing. “In the future, it may be possible to affect quantum states in a quantum machine with mental states,” said Miranda. “I will not go as far as saying that we’ll be able to entangle our brain with quantum computers, but we will be able to have a more direct communication with quantum states.”
That could mean programming a quantum computer not in the clunky way of the demonstration, but simply by thinking of a desired output and letting the machine generate the right code instantly. Picture it like evolutionary computing (where you state a desired output and let the machine figure out the creative path to it) on superposition steroids.
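The evolutionary-computing half of that analogy is easy to illustrate. The toy sketch below – purely illustrative, and with nothing quantum about it – “evolves” a random bit string toward a stated target through mutation and selection: you declare the desired output, and the machine finds its own path there. The thought-programming idea Miranda sketches would, very loosely, replace that hard-coded target with an intention decoded from the brain.

```python
# Toy evolutionary search: state a desired output and let mutation + selection
# find it. Purely illustrative of the analogy in the text; nothing quantum here.
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]  # the "desired output" we state up front


def fitness(candidate):
    """Count how many bits already match the target."""
    return sum(c == t for c, t in zip(candidate, TARGET))


def evolve(population_size=20, mutation_rate=0.1, max_generations=1000):
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(population_size)]
    for generation in range(max_generations):
        population.sort(key=fitness, reverse=True)
        best = population[0]
        if fitness(best) == len(TARGET):
            return best, generation
        # Keep the fittest half, refill the rest with mutated copies of the survivors.
        survivors = population[: population_size // 2]
        children = [[1 - bit if random.random() < mutation_rate else bit for bit in parent]
                    for parent in survivors]
        population = survivors + children
    return population[0], max_generations


best, generations = evolve()
print(best, "found in", generations, "generations")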
Some of the researchers on the project are also excited at the prospect of creating what they term a quantum metaverse. (And if you think the current concept of the regular metaverse is fuzzy around the edges, try and wrap your head around its quantum equivalent!) Somehow, though, the idea makes a lot of sense. A.I. researchers have long imagined – and, really, this underpins the entire notion of true artificial intelligence – that the wetware of the brain could be recreated through hardware and software. Since at least the 1990s, some leading physicists and mathematicians have been arguing that the nature of consciousness is, in fact, quantum.
For example, a 2011 paper co-authored by the world-renowned Oxford mathematical physicist Roger Penrose argues that “consciousness depends on biologically orchestrated quantum computations in collections of microtubules within brain neurons, that these quantum computations correlate with and regulate neuronal activity, and that the continuous Schrödinger evolution of each quantum computation terminates in accordance with the specific Diósi–Penrose (DP) scheme of ‘objective reduction’ of the quantum state.”
“There is a lot of philosophical debate going on saying that the brain functions as a quantum computer,” Miranda explained. “People are dreaming that perhaps it’s possible that if we managed to connect our brains with a quantum machine, then we become an extension of the machine or the machine becomes an extension of our brain.”
(Miranda said that he is not personally “entirely convinced” by the argument that brains act like quantum computers.)
Step one in a long journey
For now, much of this is far-off – and far-out. Advances will need to be made in multiple areas: the availability of quantum computers (the demo described earlier was carried out using a simulated quantum computer), the usefulness of quantum algorithms, continued improvements in brain-reading technology, and much more.
The next step, said project participant professor Enrique Solano, director of the research group Quantum Technologies for Information Science (QUTIS), is “to go for a trapped-ion [quantum computer] or one based in spin qubits, which work at room temperature, and assure that latency and coherence times become compatible.”
Opening this Pandora’s Box of brain-controlled quantum computing is going to be difficult. We’re talking about years before this becomes practical for more than just a few promising demos. But the biggest innovations often take time.
“The brain is the most complex object we know up to now in the universe,” Solano told Digital Trends. “In this sense, if you connect it with a primitive interface, you have to accept an oversimplified model of it with minimal biological and intelligent features.”
Quantum computing may be the solution to that problem. Welcome to the Quantum Brain Network, indeed.