Quantum AI May Need Only Minimal Data — Proof Takes Step Toward Quantum Advantage

Insider Brief

  • Experts worry that quantum artificial intelligence or machine learning programs will require an enormous amount of data, but scientists from Los Alamos National Laboratory suggest that assumption is wrong.
  • A mathematical proof indicates that, for certain relevant problems, training quantum AI models does not require large amounts of data.
  • The work could lead to advantages for quantum-based AI and ML compared to classical AI approaches.

RESEARCH NEWS — Training a quantum neural network requires only a small amount of data, according to a new proof that upends previous assumptions stemming from classical computing’s huge appetite for data in machine learning, a form of artificial intelligence. The theorem has several direct applications, including more efficient compiling for quantum computers and distinguishing phases of matter for materials discovery.

“Many people believe that quantum machine learning will require a lot of data. We have rigorously shown that for many relevant problems, this is not the case,” said Lukasz Cincio, a quantum theorist at Los Alamos National Laboratory and co-author of the paper containing the proof published in the journal Nature Communications. “This provides new hope for quantum machine learning. We’re closing the gap between what we have today and what’s needed for quantum advantage, when quantum computers outperform classical computers.”

“The need for large data sets could have been a roadblock to quantum AI, but our work removes this roadblock. While other issues for quantum AI could still exist, at least now we know that the size of the data set is not an issue,” said Patrick Coles, a quantum theorist at the Laboratory and co-author of the paper.

All AI systems need training data so that their neural networks can generalize to unseen data in real applications. It had been assumed that the amount of training data needed would be determined by the size of a mathematical construct called a Hilbert space, which becomes exponentially large as the number of qubits grows. That size rendered the approach nearly impossible computationally. A qubit, or quantum bit, is the basic computational unit of quantum computing and is analogous to a bit in classical computing.

“It is hard to imagine how vast the Hilbert space is: a space of a billion states even when you only have 30 qubits,” Coles said. “The training process for quantum AI happens inside this vast space. You might think that searching through this space would require a billion data points to guide you. But we showed you only need as many data points as the number of parameters in your model. That is often roughly equal to the number of qubits — so only about 30 data points.”
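
To put those numbers in perspective: the Hilbert space of n qubits has dimension 2^n, so 30 qubits already span roughly a billion basis states, while the proof ties the training-set size to the number of trainable parameters instead. The short Python sketch below contrasts the two scalings; the assumption of roughly one parameter per qubit is taken from Coles’ example above, not a general rule.

```python
# Contrast the old intuition (training data ~ Hilbert-space dimension)
# with the new bound (training data ~ number of trainable parameters).

def hilbert_space_dim(n_qubits: int) -> int:
    """Dimension of the joint state space of n qubits: 2**n."""
    return 2 ** n_qubits

def training_points_needed(n_params: int) -> int:
    """Per the proof, roughly one training point per model parameter."""
    return n_params

for n in (10, 20, 30):
    # Illustrative assumption from the article: ~one parameter per qubit.
    print(f"{n} qubits: ~{hilbert_space_dim(n):,} states in Hilbert space, "
          f"but only ~{training_points_needed(n)} training points")
```

For 30 qubits this prints about 1.07 billion basis states against roughly 30 training points, which is the gap the quote describes.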

One key aspect of the results, Cincio said, is that they yield efficiency guarantees even for classical algorithms that simulate quantum AI models, so the training data and compilation can often be handled on a classical computer, which simplifies the process. Then the machine-learned model runs on a quantum computer.

“That means we can lower the requirement for the performance quality that we need from the quantum computer, with respect to noise and errors, to perform meaningful quantum simulations, which pushes quantum advantage closer and closer to reality,” Cincio said.

Dramatic practical applications

The speed-up resulting from the new proof has dramatic practical applications. The team found they could guarantee that a quantum model can be compiled, or prepared for processing on a quantum computer, with far fewer computational gates relative to the amount of training data. Compiling, a crucial application for the quantum computing industry, can shrink a long sequence of operational gates or convert the quantum dynamics of a system into a gate sequence.

“Our theorem will lead to much better compilation tools for quantum computing,” Cincio said. “Especially with today’s noisy, intermediate-scale quantum computers where every gate counts, you want to use as few gates as possible so you don’t pick up too much noise, which causes errors.”
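
As a rough illustration of compiling as a learning problem, the NumPy sketch below fits a one-parameter single-qubit circuit to a fixed target gate using just two training states. The ansatz, target gate and grid search are all invented for this example; the paper’s setting covers general multi-qubit circuits and comes with formal guarantees this toy does not.

```python
import numpy as np

# Toy "compilation as learning": fit a one-parameter circuit RY(theta)
# to a fixed target gate using only two training states.
# (Illustrative only; the paper treats general multi-qubit circuits.)

def ry(theta: float) -> np.ndarray:
    """Single-qubit rotation about the Y axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

target = ry(0.8)                                     # gate to compile
train_states = [np.array([1.0, 0.0]),                # |0>
                np.array([1.0, 1.0]) / np.sqrt(2)]   # |+>

def loss(theta: float) -> float:
    """Average infidelity of the ansatz vs. the target on the training set."""
    u = ry(theta)
    infid = 0.0
    for psi in train_states:
        overlap = np.vdot(target @ psi, u @ psi)
        infid += 1.0 - abs(overlap) ** 2
    return infid / len(train_states)

# Coarse grid search over the single parameter.
thetas = np.linspace(0, 2 * np.pi, 2001)
best = min(thetas, key=loss)
print(f"learned theta = {best:.4f} (target 0.8), loss = {loss(best):.2e}")
```

Because there is only one parameter, a couple of training states pin it down, which mirrors the proof’s message that the data requirement tracks the parameter count rather than the Hilbert-space size.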

The team also showed that a quantum AI could classify quantum states across a phase transition after training on a very small data set.

“Classifying the phases of quantum matter is important to materials science and relevant to the mission of Los Alamos,” said Andrew Sornborger, director of the Quantum Science Center at the Laboratory and co-author of the paper. “These materials are complex, having multiple distinct phases like superconducting and magnetic phases.”

Creating materials with desired traits, such as superconductivity, involves understanding the phase diagram, Sornborger said, which the team proved could be discovered by a machine-learning system with minimal training.
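
As a hedged illustration of phase classification from minimal data, the sketch below invents a smooth order-parameter curve with a critical point at g_c = 1.0 (both the curve and g_c are made up for the example, not drawn from the paper), trains a one-feature threshold classifier on six labeled points, and checks that it generalizes across the transition.

```python
import numpy as np

# Toy phase classification from few training points.
# Synthetic "order parameter" m(g): near 1 in the ordered phase (g < g_c),
# near 0 in the disordered phase (g > g_c). g_c = 1.0 is assumed
# for illustration only.

rng = np.random.default_rng(0)
g_c = 1.0

def order_parameter(g: float) -> float:
    """Invented smooth magnetization-like curve with small noise."""
    m = 1.0 / (1.0 + np.exp(12.0 * (g - g_c)))  # ~1 below g_c, ~0 above
    return m + rng.normal(scale=0.02)

# Tiny training set: three labeled points per phase.
train_g = [0.2, 0.5, 0.8, 1.2, 1.5, 1.8]
train_x = np.array([order_parameter(g) for g in train_g])
train_y = np.array([1, 1, 1, 0, 0, 0])  # 1 = ordered, 0 = disordered

# One-feature threshold classifier: midpoint between the class means.
threshold = 0.5 * (train_x[train_y == 1].mean() + train_x[train_y == 0].mean())

# Generalization check on unseen couplings across the transition.
test_g = np.linspace(0.1, 1.9, 10)
pred = [int(order_parameter(g) > threshold) for g in test_g]
truth = [int(g < g_c) for g in test_g]
acc = np.mean([p == t for p, t in zip(pred, truth)])
print(f"threshold = {threshold:.3f}, test accuracy = {acc:.2f}")
```

Six labeled points suffice here because the feature cleanly separates the phases; the paper’s contribution is proving that comparably small training sets suffice for the quantum models themselves.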

Other potential applications of the new theorem include learning quantum error-correcting codes and quantum dynamical simulations.

Exceeding expectations

“The efficiency of the new method exceeded our expectations,” said Marco Cerezo, a Los Alamos expert in quantum machine learning. “We can compile certain very large quantum operations within minutes, with very few training points, something that was not previously possible.”

“For a long time we could not believe that the method would work so efficiently,” Cincio said. “With the compiler, our numerical analysis shows it’s even better than we can prove. We only have to train on a small number of states out of billions that are possible. We don’t have to check every option, but only a few. This tremendously simplifies the training.”

Paper: Matthias C. Caro, Hsin-Yuan Huang, M. Cerezo, Kunal Sharma, Andrew Sornborger, Lukasz Cincio and Patrick J. Coles, “Generalization in quantum machine learning from few training data,” Nature Communications 13, 4919 (2022).

Funding (Los Alamos co-authors only): ASC Beyond Moore’s Law project at Los Alamos National Laboratory; U.S. Department of Energy Office of Science, Office of Advanced Scientific Computing Research Accelerated Research in Quantum Computing program; Laboratory Directed Research and Development program at Los Alamos National Laboratory; DOE Office of Science, National Quantum Information Science Research Centers, Quantum Science Center; and Department of Defense.

Source: Los Alamos National Laboratory

Matt Swayne

With a background in journalism and communications spanning several decades, Matt Swayne has worked as a science communicator for an R1 university for more than 12 years, specializing in translating high tech and deep tech for a general audience. He has served as a writer, editor and analyst at The Quantum Insider since its inception. In addition to his work as a science communicator, Matt develops and teaches courses that improve the media and communications skills of scientists. [email protected]
