MIT News reports that researchers have introduced a quantum computing architecture that can perform low-error quantum computations while also rapidly sharing quantum information between processors. The work represents a key advance toward a complete quantum computing platform.
Previously, small-scale quantum processors have successfully performed tasks exponentially faster than classical computers can. However, it has been difficult to controllably communicate quantum information between distant parts of a processor. In classical computers, wired interconnects are used to route information back and forth throughout a processor during the course of a computation. In a quantum computer, however, the information itself is quantum mechanical and fragile, requiring fundamentally new strategies to simultaneously process and communicate quantum information on a chip.
“One of the main challenges in scaling quantum computers is to enable quantum bits to interact with each other when they are not co-located,” says William Oliver, an associate professor of electrical engineering and computer science, MIT Lincoln Laboratory fellow, and associate director of the Research Laboratory of Electronics. “For example, nearest-neighbor qubits can easily interact, but how do I make ‘quantum interconnects’ that connect qubits at distant locations?”
The answer lies in going beyond conventional light-matter interactions.
Natural atoms are small and point-like relative to the wavelength of the light they interact with. In a paper published today in the journal Nature, however, the researchers show that this need not be the case for superconducting “artificial atoms.” Instead, they have constructed “giant atoms” from superconducting quantum bits, or qubits, connected in a tunable configuration to a microwave transmission line, or waveguide.
This allows the researchers to adjust the strength of the qubit-waveguide interactions so the fragile qubits can be protected from decoherence, a kind of natural decay that would otherwise be hastened by the waveguide, while they perform high-fidelity operations. Once those computations are carried out, the strength of the qubit-waveguide couplings is readjusted, and the qubits are able to release quantum data into the waveguide in the form of photons, or light particles.
“Coupling a qubit to a waveguide is usually quite bad for qubit operations, since doing so can significantly reduce the lifetime of the qubit,” says Bharath Kannan, MIT graduate fellow and first author of the paper. “However, the waveguide is necessary in order to release and route quantum information throughout the processor. Here, we’ve shown that it’s possible to preserve the coherence of the qubit even though it’s strongly coupled to a waveguide. We then have the ability to determine when we want to release the information stored in the qubit. We have shown how giant atoms can be used to turn the interaction with the waveguide on and off.”
The system realized by the researchers represents a new regime of light-matter interactions, the researchers say. Unlike models that treat atoms as point-like objects smaller than the wavelength of the light they interact with, the superconducting qubits, or artificial atoms, are essentially large electrical circuits. When coupled with the waveguide, they create a structure as large as the wavelength of the microwave light with which they interact.
The giant atom emits its information as microwave photons at multiple locations along the waveguide, such that the photons interfere with each other. This process can be tuned to complete destructive interference, meaning the information in the qubit is protected. Furthermore, even when no photons are actually released from the giant atom, multiple qubits along the waveguide are still able to interact with each other to perform operations. Throughout, the qubits remain strongly coupled to the waveguide, but because of this type of quantum interference, they can remain unaffected by it and be protected from decoherence, while single- and two-qubit operations are performed with high fidelity.
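To see how the protection works, consider a toy model (an illustration, not a calculation from the Nature paper): a giant atom coupled to the waveguide at two points, each of which alone would give the qubit a decay rate γ. The two emission amplitudes pick up a relative propagation phase φ between the coupling points, so the effective decay rate is Γ_eff(φ) = 2γ(1 + cos φ). A short Python sketch with illustrative parameters:

```python
import numpy as np

# Toy model of a two-point "giant atom" coupled to a waveguide.
# gamma is the bare decay rate per coupling point; phi is the
# propagation phase acquired between the two coupling points.
# Both are illustrative parameters, not values from the paper.

gamma = 1.0  # arbitrary units

def effective_decay_rate(phi: float) -> float:
    """Interference of the two emission amplitudes: gamma * |1 + exp(i*phi)|^2."""
    return 2.0 * gamma * (1.0 + np.cos(phi))

for phi in (0.0, np.pi / 2, np.pi):
    print(f"phi = {phi:5.3f} rad -> Gamma_eff = {effective_decay_rate(phi):.3f}")
```

At φ = π the two amplitudes cancel completely, so the qubit stops radiating into the waveguide even though it remains strongly coupled to it; shifting the qubit frequency changes φ and turns emission back on, which is the on/off switch the researchers describe.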
“We use the quantum interference effects enabled by the giant atoms to prevent the qubits from emitting their quantum information to the waveguide until we need it,” says Oliver.
“This allows us to experimentally probe a novel regime of physics that is difficult to access with natural atoms,” says Kannan. “The effects of the giant atom are extremely clean and easy to observe and understand.”
The work appears to have much potential for further research, Kannan adds.
“I think one of the surprises is actually the relative ease by which superconducting qubits are able to enter this giant atom regime,” he says. “The tricks we employed are relatively simple and, as such, one can imagine using this for further applications without a great deal of additional overhead.”
Andreas Wallraff, professor of solid-state physics at ETH Zurich, says the research “investigates a piece of quantum physics that is hard or even impossible to fathom for microscopic objects such as electrons or atoms, but that can be studied with macroscopic engineered superconducting quantum circuits. With these circuits, using a clever trick, they are able both to protect their giant atom from decay and simultaneously to allow for coupling two of them coherently. This is very nice work exploring waveguide quantum electrodynamics.”
The coherence time of the qubits incorporated into the giant atoms, meaning the time they remained in a quantum state, was approximately 30 microseconds, comparable to that of qubits not coupled to a waveguide, which typically ranges from 10 to 100 microseconds, according to the researchers.
Additionally, the research demonstrates two-qubit entangling operations with 94 percent fidelity. This represents the first time researchers have quoted a two-qubit fidelity for qubits strongly coupled to a waveguide, because the fidelity of such operations using conventional small atoms is often low in such an architecture. With additional calibration, operation tune-up procedures, and optimized hardware design, Kannan says, the fidelity can be improved further.
A team of researchers from the California Institute of Technology recently introduced a new method that can be used to predict multiple properties of complex quantum systems from a limited number of measurements, according to Phys.org. The method, outlined in a paper published in Nature Physics, has been found to be highly efficient and could open up new possibilities for studying the ways in which machines process quantum information.
“During my undergraduate, my research centered on statistical machine learning and deep learning,” Hsin-Yuan Huang, one of the researchers who carried out the study, told Phys.org. “A central basis for the current machine-learning era is the ability to use highly parallelized hardware, such as graphical processing units (GPU) or tensor processing units (TPU). It is natural to wonder how an even more powerful learning machine capable of harnessing quantum-mechanical processes could emerge in the far future. This was my aspiration when I started my Ph.D. at Caltech.”
The first step toward the development of more advanced machines based on quantum-mechanical processes is to gain a better understanding of how current technologies process and manipulate quantum systems and quantum information. The standard method for doing this, known as quantum state tomography, works by learning the entire description of a quantum system. However, this requires a number of measurements that grows exponentially with the number of qubits (an n-qubit state is specified by 4^n − 1 real parameters), as well as considerable computational memory and time.
As a result, when using quantum state tomography, machines are currently unable to support quantum systems with more than a few tens of qubits. In recent years, researchers have proposed a number of techniques based on artificial neural networks that could significantly enhance the quantum information processing of machines. Unfortunately, however, these techniques do not generalize well across all cases, and the specific requirements that allow them to work are still unclear.
“To build a rigorous foundation for how machines can perceive quantum systems, we combined my previous knowledge about statistical learning theory with Richard Kueng and John Preskill’s expertise on a beautiful mathematical theory known as unitary t-design,” Huang said. “Statistical learning theory is the theory that underlies how the machine could learn an approximate model about how the world behaves, while unitary t-design is a mathematical theory that underlies how quantum information scrambles, which is central to understand quantum many-body chaos, in particular, quantum black holes.”
By combining statistical learning and unitary t-design theory, the researchers were able to devise a rigorous and efficient procedure that allows classical machines to produce approximate classical descriptions of quantum many-body systems. These descriptions can be used to predict several properties of the quantum systems that are being studied by performing a minimal number of quantum measurements.
“To construct an approximate classical description of the quantum state, we perform a randomized measurement procedure given as follows,” Huang said. “We sample a few random quantum evolutions that would be applied to the unknown quantum many-body system. These random quantum evolutions are typically chaotic and would scramble the quantum information stored in the quantum system.”
The random quantum evolutions sampled by the researchers ultimately enable the use of the mathematical theory of unitary t-design to study such chaotic quantum systems as quantum black holes. In addition, Huang and his colleagues examined a number of randomly scrambled quantum systems using a measurement tool that elicits a wave function collapse, a process that turns a quantum system into a classical system. Finally, they combined the random quantum evolutions with the classical system representations derived from their measurements, producing an approximate classical description of the quantum system of interest.
“Intuitively, one could think of this procedure as follows,” Huang explained. “We have an exponentially high-dimensional object, the quantum many-body system, that is very hard to grasp by a classical machine. We perform several random projections of this extremely high-dimension object to a much lower dimensional space through the use of random/chaotic quantum evolution. The set of random projections provides a rough picture of how this exponentially high dimensional object looks, and the classical representation allows us to predict various properties of the quantum many-body system.”
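To make the procedure concrete, here is a minimal single-qubit sketch in Python. It follows the general classical-shadow recipe (randomized Pauli-basis measurements, followed by inversion of the measurement channel via ρ̂ = 3U†|b⟩⟨b|U − I), but the test state, sample count, and observables are illustrative assumptions, not details drawn from the study:

```python
import numpy as np

# Minimal single-qubit "classical shadow" sketch using randomized
# Pauli-basis measurements. The state, sample count, and observables
# below are illustrative, not taken from the paper.

rng = np.random.default_rng(0)

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
Sdg = np.diag([1, -1j]).astype(complex)

# Rotations mapping each Pauli's eigenbasis onto the computational (Z) basis.
BASIS_CHANGE = {"X": H, "Y": H @ Sdg, "Z": I2}

# "Unknown" state to be learned: |psi> = cos(t)|0> + sin(t)|1>.
t = 0.4
psi = np.array([np.cos(t), np.sin(t)], dtype=complex)
rho = np.outer(psi, psi.conj())

def snapshot():
    """One randomized measurement -> one classical-shadow snapshot."""
    U = BASIS_CHANGE[rng.choice(list("XYZ"))]  # random scrambling unitary
    probs = np.clip(np.real(np.diag(U @ rho @ U.conj().T)), 0, None)
    b = rng.choice(2, p=probs / probs.sum())   # wave-function collapse
    ket = np.zeros(2, dtype=complex)
    ket[b] = 1.0
    # Invert the measurement channel: rho_hat = 3 * U^dag |b><b| U - I.
    return 3.0 * (U.conj().T @ np.outer(ket, ket.conj()) @ U) - I2

shadows = [snapshot() for _ in range(20000)]

# The same shadow data predicts several properties at once.
for name, O in (("<X>", X), ("<Y>", Y), ("<Z>", Z)):
    est = np.mean([np.real(np.trace(O @ s)) for s in shadows])
    print(f"{name}: estimated {est:+.3f}, exact {np.real(np.trace(O @ rho)):+.3f}")
```

Each snapshot is an unbiased but wildly fluctuating guess at ρ; averaging many of them reproduces expectation values, which is the “rough picture from random projections” Huang describes.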
Huang and his colleagues proved that by combining statistical learning constructs and the theory of quantum information scrambling, they could accurately predict M properties of a quantum system from only on the order of log(M) measurements. In other words, their method can predict an exponential number of properties from a randomized measurement repeated only a logarithmic number of times.
“The traditional understanding is that when we want to measure M properties, we have to measure the quantum system M times,” Huang said. “This is because after we measure one property of the quantum system, the quantum system would collapse and become classical. After the quantum system has turned classical, we cannot measure other properties with the resulting classical system. Our approach avoids this by performing randomly generated measurements and inferring the desired properties by combining these measurement data.”
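The log(M) guarantee relies on a robust estimator: rather than a plain average, the snapshots are split into batches and the median of the batch means is reported, which suppresses rare outlier snapshots. A hedged sketch, reusing the hypothetical `shadows` list and observable `X` from the example above:

```python
import numpy as np

def median_of_means(values, n_batches: int = 10) -> float:
    """Median of per-batch means: robust to the heavy-tailed
    fluctuations of individual shadow snapshots."""
    batches = np.array_split(np.asarray(values, dtype=float), n_batches)
    return float(np.median([b.mean() for b in batches]))

# Usage with the snapshots from the previous sketch:
# per_snapshot = [np.real(np.trace(X @ s)) for s in shadows]
# print(median_of_means(per_snapshot))
```

Because the failure probability of each such estimate decays exponentially in the number of batches, guaranteeing all M target properties simultaneously costs only a log(M) factor in the number of snapshots.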
The study partly explains the excellent performance achieved by recently developed machine learning (ML) techniques in predicting properties of quantum systems. In addition, its unique design makes the method significantly faster than existing ML techniques, while also allowing it to predict properties of quantum many-body systems with greater accuracy.
“Our study rigorously shows that there is much more information hidden in the data obtained from quantum measurements than we originally expected,” Huang said. “By suitably combining these data, we can infer this hidden information and gain significantly more knowledge about the quantum system. This implies the importance of data science techniques for the development of quantum technology.”
The results of tests the team conducted suggest that to leverage the power of machine learning, it is first necessary to attain a good understanding of intrinsic quantum physics mechanisms. Huang and his colleagues showed that although directly applying standard machine-learning techniques can lead to satisfactory results, organically combining the mathematics behind machine learning and quantum physics results in far better quantum information processing performance.
“Given a rigorous ground for perceiving quantum systems with classical machines, my personal plan is to now take the next step toward creating a learning machine capable of manipulating and harnessing quantum-mechanical processes,” Huang said. “In particular, we want to provide a solid understanding of how machines could learn to solve quantum many-body problems, such as classifying quantum phases of matter or finding quantum many-body ground states.”
This new method for constructing classical representations of quantum systems could open up new possibilities for the use of machine learning to solve challenging problems involving quantum many-body systems. To tackle these problems more efficiently, however, machines would also need to be able to simulate a number of complex computations, which would require a further synthesis between the mathematics underlying machine learning and quantum physics. In their next studies, Huang and his colleagues plan to explore new techniques that could enable this synthesis.
“At the same time, we are also working on refining and developing new tools for inferring hidden information from the data collected by quantum experimentalists,” Huang said. “The physical limitation in the actual systems provides interesting challenges for developing more advanced techniques. This would further allow experimentalists to see what they originally could not and help advance the current state of quantum technology.”
Since the dawn of time, diseases have been with us. It doesn’t matter the kind: viral or bacterial, germs don’t discriminate.
Plague
In the history of the world, bacterial infections have ravaged humanity like wars or genocides couldn’t. The Plague of Athens of 430 BC is the first that comes to mind. In all, it is estimated that between 75,000 and 100,000 people perished, a huge number considering the population of Athens at the time was around 400,000, meaning roughly a quarter of the citizens succumbed to the pathogenic bacterium Salmonella enterica serovar Typhi.
More than a millennium and a half later, during the Middle Ages, the Black Death — carried by rat fleas living on black rats — caused utter chaos around the world. 100 million people are said to have perished from the devilish claws of the bacterium Yersinia pestis.
These bacterial diseases — though some scientists now say the Athens debacle was caused by Ebola, a viral disease — changed the way people lived for generations.
The current Covid-19 pandemic, though viral in nature, just goes to show — like the historical examples mentioned — how much devastation and economic meltdown diseases can cause. That is why governments, NGOs, corporations, and startups alike are doing their utmost to find a vaccine or cure for it.
It could be a long time. Maybe never. But that doesn’t mean we shouldn’t try.
In all this, technology has a big role to play. The hard sciences, represented by the shock troops of AI and ML, will need to get their act together if they are to find a solution.
Yet, they may not even be needed if quantum information sciences and nanotechnology have any say in it.
The very small kicking the very big problems aside.
Already we can see dozens of startups in the pharmaceutical industry, each with its own unique solution using the power of qubits.
Rahko from the UK. California-based ApexQubit. Pharmacelera from Spain. These are just three that employ the magic of quantum mechanics for drug discovery solutions.
FluoretiQ, a startup founded in 2016 in the coastal city of Bristol in the UK, believes it’s on the path to revolutionizing bacterial diagnostics.
FluoretiQ
The startup’s secret weapon is its diagnostics platform, NANOPLEX™, which uses receptor-mediated sensing to quickly identify bacterial infections. According to FluoretiQ’s website, it ‘address[es] the £1.1 Billion cost of using broad-spectrum antibiotics as a placeholder for accurate diagnosis and prescription.’
FluoretiQ promises its platform will do three things:
Saving Time:
Today’s tools for identifying bacterial infection can often leave patients waiting days, sometimes weeks, for an accurate diagnosis and treatment. Delays in the effective identification of infection lead to multiple GP visits, long-term antibiotic use, hospitalization, and in some cases death.
Saving Money:
New solutions for fast, accurate bacterial diagnosis are urgently required to prevent the continued rise of antimicrobial resistance and preserve the efficacy of existing antibiotics.
And the last, which is for those who don’t have a misanthropic bone in their skeletal frame:
Saving Lives:
Pioneering across the disciplines of Nanotechnology, Microbiology and Glycan Chemistry — FluoretiQ is designing simple and effective bacterial detection systems for rapid diagnosis of infection.
Responsible for all these revelations is the founding team, made up of CEO Neciah Dorh, CTO Josephine Dorh and Martin Cryan, FluoretiQ’s scientific advisor.
CEO Neciah Dorh is a former Enterprise Fellow at the Quantum Technology Enterprise Centre. With a Ph.D. in electrical and electronic engineering from the University of Bristol, he is currently a member of the Department for International Trade’s Global Entrepreneur Programme and was included on Insider Magazine’s 2019 ‘42 under 42’ list of up-and-coming business leaders. He has also been a TEDx Bristol keynote speaker.
Josephine Dorh is CTO and the co-inventor of FluoretiQ’s proprietary fluorometer system. Like the CEO, she’s a graduate in electrical and electronic engineering from the University of Bristol and an Insider Magazine 2019 ‘42 under 42’ alumna.
Martin Cryan, however, might hold the key to the startup’s continued momentum. A professor of applied electromagnetics at the University of Bristol, he has three decades of experience in electronic and photonic device research and development.
The south-west of England has garnered some headlines of late in the QC world.
The University of Bristol has been a hotbed of British quantum computing activity for quite a while now. PsiQuantum, which raised one of the largest private investments in QC in recent years, a $215M round involving nearly a dozen VCs, is a prime example.
But the story doesn’t end happily, unfortunately. Even before the investment was announced, the PsiQuantum team had quietly moved to Silicon Valley.
Is this a sign of things to come? Does the UK have enough money and spirit to hold onto its QC talent?
Let’s hope the team at FluoretiQ, with the final Brexit deal just a few precious months away, feels the UK can offer them the best chance to build their brand.