
Google Quantum AI-led Researchers Find ‘Sweet Spot’ to Use Current Quantum Computers to Make Practical Calculations


Insider Brief

  • A recent Google Quantum AI-led study shows that quantum computers may not need to reach full fault tolerance to outperform classical supercomputers, finding a phase where today’s Noisy Intermediate-Scale Quantum (NISQ) devices can excel.
  • Using a 67-qubit system, the team employed random circuit sampling (RCS) to pinpoint conditions where quantum systems maintain sufficient complexity to outpace classical simulations.
  • The researchers aim to use this stable computational phase to target real-world problems in finance, materials, and life sciences, suggesting that NISQ devices can provide value before fully fault-tolerant systems are developed.

Quantum computers may not need to reach the full fault-tolerant stage in order to perform useful, commercial calculations, a new study suggests.

An international team of researchers, led by Google Quantum AI, reports in a new research paper published in Nature that it has identified a complex computational phase, a “sweet spot,” where noisy quantum computers can outperform classical supercomputers.

The discovery could lead to today’s Noisy Intermediate-Scale Quantum (NISQ) computers outperforming classical supercomputers at certain commercial tasks, Sergio Boixo, Quantum Computing Principal Scientist at Google Quantum AI, told The Quantum Insider. Boixo helped lead the research with Alexis Morvan, a research scientist at Google Quantum AI.


“So the question we’re addressing is this: We’re in the NISQ era, that means we have noisy quantum computers, and there has been this question for a long time: ‘Can you find applications where you outperform supercomputers with noisy quantum computers?’” said Boixo. “There have been a lot of theoretical advances toward this question. So that’s the question we address in the paper, and we answered, ‘Yes, there is.’ We find a complex computational phase where noisy quantum computers can outperform supercomputers.”

The team conducted experiments on Google’s 67-qubit Sycamore chip that reveal a “low-noise phase,” a region between phase transitions where computation is sufficiently complex for the quantum computer to outperform classical devices. They also demonstrated beyond-classical performance with that chip.

This experimental evidence indicates that one day noisy quantum computers, when operated under specific conditions, could enter into what the scientists refer to as a “stable computationally complex phase” and best today’s supercomputers in certain tasks. This would be a significant step forward in the Noisy Intermediate-Scale Quantum (NISQ) era, according to the team.

The researchers relied on random circuit sampling, or RCS, a benchmarking technique designed to measure the performance of quantum processors by comparing their output distributions against classical supercomputer simulations. In the paper, they also used cross-entropy benchmarking, or XEB, to experimentally identify and characterize phase transitions in the behavior of their quantum system under random circuit sampling.

To offer some idea of how these techniques operate: RCS works by executing a sequence of randomly generated quantum gates on a set of qubits, producing a complex output distribution that is difficult for classical computers to simulate. In this study, it provides a direct test of the quantum system’s computational capabilities. XEB calculates the cross-entropy, a measure of the discrepancy between two probability distributions, comparing the experimentally obtained output distribution with the ideal distribution computed through classical simulation. Ultimately, scientists can use XEB to assess the accuracy and fidelity of the quantum processor’s performance.
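To make the benchmark concrete, the sketch below shows how a linear-XEB fidelity estimate can be computed once the ideal output probabilities are available from a classical simulation. This is a minimal illustration in Python, not the team’s code; the function name and the toy Porter-Thomas-style sampler are assumptions made for the example.

```python
import numpy as np

def linear_xeb_fidelity(samples, ideal_probs, num_qubits):
    """Estimate linear XEB fidelity from measured bitstrings.

    samples: measured outcomes encoded as integers in [0, 2**num_qubits).
    ideal_probs: ideal output probabilities from a classical simulation.
    Returns roughly 1.0 for a perfect device and 0.0 for pure noise.
    """
    dim = 2 ** num_qubits
    # Average ideal probability of the bitstrings actually observed.
    mean_p = np.mean(ideal_probs[np.asarray(samples)])
    # Linear cross-entropy benchmark: F = D * E[p(x)] - 1.
    return dim * mean_p - 1.0

# Toy check: sampling from the ideal (Porter-Thomas-like) distribution
# scores near 1; uniform sampling, as from a fully depolarized device,
# scores near 0.
rng = np.random.default_rng(0)
n = 5
probs = rng.exponential(size=2**n)
probs /= probs.sum()
good = rng.choice(2**n, size=50_000, p=probs)   # "ideal" device
bad = rng.integers(0, 2**n, size=50_000)        # noise-only device
print(linear_xeb_fidelity(good, probs, n))      # close to 1.0
print(linear_xeb_fidelity(bad, probs, n))       # close to 0.0
```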

By employing this method on Google’s 67-qubit Sycamore chip, the researchers identified a phase where the quantum system can maintain complex correlations, even in the presence of noise.

Experimental Details and Findings

According to the paper, the research investigates two key phase transitions in quantum systems as they scale up in complexity and face noise challenges. The first is a dynamical transition influenced by the number of cycles, or the depth, of quantum computations. As the depth increases, the system moves from a state where output distributions are concentrated in a small set of bitstrings to a broader distribution—known as anti-concentration. This shift represents the system’s growing computational complexity, but maintaining this state amid noise remains a significant challenge.
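Anti-concentration can be observed directly in small simulations. The numpy sketch below, a generic brickwork circuit written for illustration rather than the circuits used in the paper, tracks the output distribution as depth grows: the collision probability starts high for the concentrated initial state and falls toward the Porter-Thomas value of 2/2^n as the distribution spreads.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_su2():
    """Haar-random single-qubit unitary via QR of a Ginibre matrix."""
    z = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

def apply_1q(state, u, q, n):
    """Apply single-qubit gate u to qubit q of an n-qubit statevector."""
    state = state.reshape([2] * n)
    state = np.tensordot(u, state, axes=([1], [q]))
    return np.moveaxis(state, 0, q).reshape(-1)

def apply_cz(state, q1, q2, n):
    """Apply a controlled-Z gate between qubits q1 and q2."""
    state = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    state[tuple(idx)] *= -1.0
    return state.reshape(-1)

n = 8
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0  # concentrated initial state |0...0>
for depth in range(1, 17):
    for q in range(n):                    # layer of random 1-qubit gates
        state = apply_1q(state, random_su2(), q, n)
    for q in range(depth % 2, n - 1, 2):  # brickwork layer of CZ gates
        state = apply_cz(state, q, q + 1, n)
    p = np.abs(state) ** 2
    # Collision probability, rescaled so Porter-Thomas (2/2**n) reads 1.0.
    print(depth, (p**2).sum() * 2**n / 2)
```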

The second phase transition, which the researchers emphasize in the study, is controlled by the error rate per cycle: the noise that affects each gate operation and qubit interaction. The study introduced a statistical “weak link” model to analyze this transition, varying the noise levels to understand their impact on the system’s performance. The team found that if the noise rate per cycle is kept below a critical threshold, the quantum processor maintains global correlations across the entire system, allowing it to achieve beyond-classical performance.
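As a rough illustration of why the per-cycle error rate is the control knob, consider a simple digital error model in which each cycle’s errors compound multiplicatively, so the expected circuit fidelity decays exponentially with depth. The error rates and the cutoff below are hypothetical values chosen only to show a device crossing from the low-noise phase into the noise-dominated one; they are not figures from the paper.

```python
def expected_fidelity(error_per_cycle, num_cycles):
    """Expected circuit fidelity under a simple digital error model:
    each cycle succeeds with probability (1 - error_per_cycle), and
    errors compound multiplicatively across cycles."""
    return (1.0 - error_per_cycle) ** num_cycles

DEPTH = 32      # cycles, matching the depth used in the experiment
CUTOFF = 1e-2   # hypothetical boundary between the two phases

for eps in (0.02, 0.10, 0.30):  # illustrative per-cycle error rates
    f = expected_fidelity(eps, DEPTH)
    phase = "low-noise phase" if f > CUTOFF else "noise-dominated phase"
    print(f"error/cycle = {eps:.2f} -> fidelity ~ {f:.2e} ({phase})")
```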

In the low-noise phase, these correlations are strong enough to prevent classical algorithms from simplifying and “spoofing” the quantum system’s outputs. Finding this balance where the system maintains global correlations while minimizing noise is key to leveraging the computational power of quantum processors, Boixo said.

To validate their findings, the researchers relied on the aforementioned XEB to measure the system’s fidelity and determine the boundaries where quantum advantage could be achieved. The experiments demonstrated that in this stable, low-noise phase, the Sycamore chip could perform calculations that are currently intractable for classical supercomputers, highlighting a practical advantage even with existing hardware.

Testing the Quantum Versus Classical Limits

The study assessed the computational limits of classical supercomputers using advanced tensor network algorithms to simulate the RCS experiments. The results showed that simulating the 67-qubit experiment on current top-tier supercomputers, such as Frontier, would take tens of years, even under the best memory and bandwidth conditions. This computational burden further supports the notion that today’s quantum technology can achieve tasks beyond the reach of classical systems.
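The “tens of years” figure follows from simple arithmetic: assume a total contraction cost in floating-point operations, divide by a machine’s sustained throughput, and the wall-clock time falls out. Both inputs below are placeholders chosen to land in that range; they are not the paper’s estimates.

```python
# Back-of-the-envelope wall-clock estimate for a classical simulation.
# Both inputs are illustrative placeholders, not values from the paper.
contraction_cost_flops = 1e27    # assumed total cost of the contraction
sustained_flops = 1e18           # roughly exascale sustained throughput

seconds = contraction_cost_flops / sustained_flops
years = seconds / (3600 * 24 * 365)
print(f"~{years:,.0f} years of continuous computation")  # ~32 years
```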

The researchers emphasize that despite advancements in classical simulation algorithms, the complexity of quantum systems remains a significant hurdle for classical computation. Their experiments, which employed a 67-qubit system at 32 cycles, demonstrate that these quantum circuits achieve levels of complexity and depth that classical systems cannot efficiently replicate.

Applications Next?

This study offers crucial insights into the conditions necessary for achieving quantum advantage in the NISQ era. By pinpointing the noise thresholds and using benchmarking techniques like RCS and XEB, the researchers provide a framework for identifying and optimizing the conditions under which quantum processors can outperform classical computers.

The goal now is to tap that stable, computationally complex phase to use a NISQ device for a useful calculation. The team suggests many computations in finance, materials and life sciences make interesting targets for this next step.

“We’re showing that we’re winning convincingly on these benchmarks, so indeed, the next step is to move towards applications,” said Boixo.

The discovery could also open a transition period in which NISQ quantum computers provide real value as scientists progress toward fault-tolerant quantum computers.

“Fault tolerant quantum computers, as you know, are a number of years away, so we’re not just going to jump before we get to that fault-tolerant era,” said Boixo. “So it’s going to be a smooth process. And I think a lot of what we’re learning in terms of how to find applications for noisy quantum computers is also going to be very useful for early fault tolerance era.”

The work can also help improve understanding of how noise interacts with quantum dynamics to guide future efforts in error mitigation and pave the way for those fully fault-tolerant quantum systems.

While this research is a necessary step for a transition from early NISQ to useful NISQ, Boixo said there’s still a lot of work to do.

“The next step is, indeed, to move towards applications, which means transforming random circuit sampling or finding some other problem or algorithm where we can do both things at the same time,” said Boixo. “We want to keep enough of the difficulty of random circuit sampling, which is still hard for classical supercomputers, while we make it more useful.”

For a more detailed and technical explanation of the researchers’ work, please read the paper in Nature.

The institutions contributing to this study include Google Research, NASA’s Quantum Artificial Intelligence Laboratory at NASA Ames Research Center, KBR, the University of Connecticut, the National Institute of Standards and Technology (NIST), the University of Massachusetts Amherst, Auburn University, the University of Technology Sydney, the University of California Riverside, and Harvard University.

Matt Swayne

With a background in journalism and communications spanning several decades, Matt Swayne has worked as a science communicator for an R1 university for more than 12 years, specializing in translating high tech and deep tech for a general audience. He has served as a writer, editor and analyst at The Quantum Insider since its inception. Matt also develops and teaches courses to improve the media and communications skills of scientists.
