Quantum Myth Busters: Experts Debunk Common NISQ-Era Myths

Insider Brief:

  • The age of information amplifies the spread of both knowledge and misinformation, creating unique challenges for navigating complex fields like quantum technology. Claims range from overhyped promises to stringent no-go theorems.
  • A recent study, led by Algorithmiq and involving experts from top quantum organizations, critically evaluates myths surrounding quantum computing, addressing issues like error mitigation, algorithm scalability, and practical applications.
  • Key findings debunk misconceptions about the impracticality of quantum error mitigation and the irrelevance of variational quantum algorithms beyond the NISQ era, while also clarifying the complementary roles of quantum error correction and mitigation.
  • Experts emphasize that while exponential speedups for practical problems remain unproven, focused advancements in hardware and algorithm design may lead to meaningful applications for quantum computing in areas such as optimization and quantum simulations.

The age of information is both one of the most profound eras of our time and one of the most challenging to navigate. Information abounds, but it cuts both ways: it consumes vast amounts of our attention while fueling an exponential spread of falsehoods that are increasingly difficult to counteract. Quantum technology is not immune to these challenges, with claims ranging from unbridled quantum hype to the limitations imposed by quantum no-go theorems. Amid this sea of contradictory information, a recent study published on arXiv, led by Algorithmiq and joined by experts from Quantum Motion, the Technology Innovation Institute, IBM Quantum, Google Quantum AI, NVIDIA, AWS, Phasecraft, and others, critically examines and evaluates common myths surrounding quantum computing.

MYTH 1: Quantum Error Mitigation Cannot Be Useful Due to Exponential Scaling

Quantum error mitigation (QEM) is intended to reduce errors in quantum computations without the need for full fault-tolerant systems. Critics argue that the number of samples required for QEM grows exponentially with the size of the quantum circuit, effectively rendering it impractical. This criticism is based on no-go theorems that analyze the impact of noise on circuit performance. However, the paper highlights that this exponential scaling also depends on the error rate, which modern hardware continues to reduce. For instance, with sufficiently low gate error rates, QEM can handle circuits of practical sizes without overwhelming computational overhead. The authors also point out that as error rates decrease, QEM methods will remain relevant, even as we transition to early fault-tolerant systems.

VERDICT: Debunked

Although scaling challenges exist, the usefulness of QEM depends on continued improvements in hardware. Current and near-future error rates make QEM feasible for certain applications, providing a bridge to more advanced quantum computing systems.
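
To make the scaling argument concrete, the sketch below estimates the sampling overhead of one common QEM technique, probabilistic error cancellation. The per-gate cost factor and error rates are illustrative assumptions, not figures from the study; the point is only that the overhead is exponential in the product of gate count and error rate, so lower error rates keep practically sized circuits affordable.

```python
# Back-of-envelope estimate of QEM sampling overhead (illustrative only).
# Assumption: probabilistic error cancellation with a per-gate cost factor
# gamma ~ 1 + 2*eps for small gate error rate eps; the variance, and hence
# the number of samples, then grows roughly as gamma**(2 * n_gates).

def qem_sample_overhead(n_gates: int, gate_error: float) -> float:
    """Multiplicative sampling overhead for mitigating n_gates noisy gates."""
    gamma = 1.0 + 2.0 * gate_error   # assumed per-gate quasi-probability cost
    return gamma ** (2 * n_gates)    # variance blow-up ~ gamma^(2N)

for eps in (1e-2, 1e-3, 1e-4):
    for n in (100, 1_000, 10_000):
        print(f"error={eps:.0e}, gates={n:>6}: "
              f"overhead ~ {qem_sample_overhead(n, eps):.2e}")
```

Under these assumed numbers, a 1% gate error rate makes thousand-gate circuits hopeless, but at 0.1% the overhead shrinks to a manageable constant factor, which is the sense in which falling hardware error rates keep QEM viable.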

MYTH 2: Practical Problems Require Larger Circuits Than Possible Without Error Correction

It is often assumed that solving practical problems with quantum computers requires circuit sizes far beyond the capability of current NISQ devices. While it is true that many tasks demand larger circuits, recent advances have demonstrated that carefully optimized smaller circuits can achieve meaningful outcomes. For example, current devices can execute thousands of circuit operations per second, enabling medium-sized circuits to be run at practical sampling rates. Applications like quantum simulations and certain optimization problems can fit within these limits. However, the study notes these successes are primarily exploratory, and there is still no clear demonstration of practical quantum advantage for commercially relevant problems.

VERDICT: Open Question, Debunked

While limitations exist, the potential for practical applications with smaller circuits is real. Efforts should focus on minimizing circuit size and identifying tasks that can demonstrate clear quantum advantage within these constraints.
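
As a rough illustration of the sampling-rate point above, the sketch below works out how long it takes to collect shots from a medium-sized circuit. The per-layer gate time and readout overhead are assumed, illustrative values, not specifications from the study or any particular device.

```python
# Rough wall-clock arithmetic for sampling a medium-sized circuit.
# Assumptions (illustrative): ~1 microsecond per layer of gates and
# ~100 microseconds of readout/reset overhead per shot.

GATE_LAYER_TIME_US = 1.0    # assumed time per circuit layer
SHOT_OVERHEAD_US = 100.0    # assumed readout + reset time per shot

def sampling_time_seconds(depth: int, shots: int) -> float:
    """Total wall-clock time to collect `shots` samples of a circuit of the given depth."""
    time_per_shot_us = depth * GATE_LAYER_TIME_US + SHOT_OVERHEAD_US
    return shots * time_per_shot_us * 1e-6

depth, shots = 100, 100_000
t = sampling_time_seconds(depth, shots)
print(f"depth {depth}, {shots} shots: ~{t:.0f} s  (~{shots / t:.0f} shots/s)")
```

With these assumptions a depth-100 circuit yields on the order of thousands of shots per second, consistent with the sampling rates the study treats as practical.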

MYTH 3: Quantum Error Mitigation Will Become Obsolete in the Fault-Tolerant Era

A common belief is that once we achieve fault-tolerant quantum computers, quantum error mitigation will no longer serve a purpose. However, the authors argue that QEM and quantum error correction (QEC) serve complementary purposes. QEC focuses on reducing logical error rates by encoding information across multiple qubits, but it cannot address certain algorithmic errors or compilation issues. QEM, on the other hand, can target these challenges by applying techniques like zero-noise extrapolation and error-aware compilation. For instance, in early fault-tolerant systems, logical error rates may still be significant, and QEM can help refine overall performance. Additionally, QEM can address specific challenges like synthesizing precise quantum gate operations, which are difficult to achieve even with QEC.

VERDICT: Debunked

QEM will remain relevant, especially in early fault-tolerant systems and for tasks that QEC alone cannot solve. Its role will likely evolve alongside improvements in quantum hardware.
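
Zero-noise extrapolation, mentioned above, is simple enough to sketch. Assuming the same circuit can be run at several artificially amplified noise levels (for example, by gate folding), the mitigated value is obtained by extrapolating the measured expectation values back to the zero-noise limit; the data points below are made up for illustration.

```python
# Minimal sketch of zero-noise extrapolation (ZNE).
# Assumption: the circuit is re-run with its noise artificially scaled by
# known factors, and the expectation value <O> is measured at each level.

import numpy as np

def zero_noise_extrapolate(scale_factors, expectations, degree=2):
    """Fit <O>(lambda) with a polynomial and evaluate it at lambda = 0."""
    coeffs = np.polyfit(scale_factors, expectations, deg=degree)
    return np.polyval(coeffs, 0.0)

scales = [1.0, 2.0, 3.0]     # 1.0 = native hardware noise level
noisy = [0.82, 0.67, 0.55]   # hypothetical measured expectation values

print("mitigated estimate:", round(zero_noise_extrapolate(scales, noisy), 4))
```

The extrapolated value trades bias for variance: each amplified-noise run costs extra samples, which is exactly the overhead trade-off discussed under Myth 1.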

MYTH 4: Variational Quantum Algorithms Require Exponential Training Efforts and Are Unusable

Variational quantum algorithms (VQAs) are hybrid algorithms that combine classical optimization with quantum computations. They have been criticized for facing fundamental obstacles, such as barren plateaus, a phenomenon where the optimization landscape becomes so flat that it is nearly impossible to find good solutions. The paper explains that barren plateaus are not inevitable: whether they arise depends on the problem being solved and the choice of initial parameters. Additionally, problem-specific ansätze (customized quantum circuit designs) can help navigate these challenges; in quantum chemistry, for example, chemically motivated ansätze have proven effective. While challenges like scaling and training complexity remain, researchers are exploring hybrid strategies where quantum computers are used for specific parts of the computation, leaving classical computers to handle the rest.

VERDICT: Open Question

While unstructured VQAs undeniably face significant training challenges, tailored approaches and hybrid models may hold promise for practical use, especially in solving domain-specific problems.
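
The training loop at the heart of a VQA is easy to sketch. The toy below simulates a single-qubit ansatz Ry(theta)|0> classically and minimizes the energy <Z> with the parameter-shift rule used on real hardware; the ansatz, starting point, and step size are illustrative choices, not taken from the study.

```python
# Toy variational quantum algorithm loop, simulated classically.
# Assumption: a one-qubit ansatz Ry(theta)|0>, whose energy <Z> = cos(theta);
# the gradient uses the parameter-shift rule that VQAs apply on hardware.

import numpy as np

def energy(theta: float) -> float:
    """<Z> for the state Ry(theta)|0>, which equals cos(theta)."""
    return np.cos(theta)

def parameter_shift_grad(theta: float) -> float:
    """Exact gradient via the parameter-shift rule: (f(t + pi/2) - f(t - pi/2)) / 2."""
    return (energy(theta + np.pi / 2) - energy(theta - np.pi / 2)) / 2

theta, lr = 0.1, 0.4   # deliberately near-flat start, and the learning rate
for _ in range(50):
    theta -= lr * parameter_shift_grad(theta)

print(f"theta = {theta:.3f}, energy = {energy(theta):.4f}  (minimum: -1 at theta = pi)")
```

The near-flat starting point is a one-parameter miniature of the barren-plateau problem: the smaller the initial gradient, the more iterations, and on hardware the more shots, the optimizer needs before it makes progress.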

MYTH 5: Variational Quantum Algorithms Are Only Relevant in the NISQ Era

Variational quantum algorithms are often associated with the current NISQ era due to their adaptability to noisy hardware. This has led to the misconception that their usefulness will decline once fault-tolerant quantum computers become available. However, the paper argues that VQAs are not inherently tied to NISQ hardware. Their ability to optimize parameters and approximate solutions can be integrated into fault-tolerant algorithms. For example, they may serve as subroutines for more complex algorithms, such as quantum phase estimation. Moreover, fault-tolerant systems could enable improvements in VQA performance, such as better sampling rates and error mitigation strategies. This would extend their applicability to a wider range of problems, including machine learning and optimization tasks.

VERDICT: Debunked

VQAs will likely remain relevant beyond the NISQ era, evolving to leverage the capabilities of fault-tolerant quantum computers and continuing to play a role in hybrid computational strategies.

MYTH 6: No Proven Exponential Speedups for Practical Applications

While algorithms like Shor’s provide theoretical evidence for exponential speedups, many commercially relevant problems, such as those in quantum chemistry or optimization, lack similar guarantees. The paper explains that proving speedups for these problems is challenging because they often involve real-world constraints and complexities that deviate from theoretical idealizations. Instead of focusing solely on exponential speedups, researchers are exploring areas where quantum computers can offer practical advantages, such as simulating quantum systems or solving specialized optimization problems. For example, certain chemical simulations that are intractable for classical methods might become feasible on quantum devices with modest improvements in hardware.

VERDICT: Correct, Open Question

Proving exponential speedups remains a challenge, but there is strong potential for quantum computers to provide practical, if not exponential, advantages for specific applications.

The Vital Role of Expert Voices

Debunking quantum myths is more than just an exercise in setting the record straight—it’s a necessary act of stewardship in the age of both information and misinformation. As quantum technology continues to evolve, the voices of experts willing to dissect complexity are invaluable. Their work ensures that progress is guided by clarity, not confusion, and that the potential of quantum computing is neither overhyped nor unnecessarily constrained by misunderstanding.

Contributing authors on the study include Zoltán Zimborás, Bálint Koczor, Zoë Holmes, Elsi-Mari Borrelli, András Gilyén, Hsin-Yuan Huang, Zhenyu Cai, Antonio Acín, Leandro Aolita, Leonardo Banchi, Fernando G. S. L. Brandão, Daniel Cavalcanti, Toby Cubitt, Sergey N. Filippov, Guillermo García-Pérez, John Goold, Orsolya Kálmán, Elica Kyoseva, Matteo A.C. Rossi, Boris Sokolov, Ivano Tavernelli, and Sabrina Maniscalco.

Cierra Choucair
