Insider Brief
- Quantinuum is advancing Generative Quantum AI (Gen QAI), leveraging quantum systems to overcome the scalability, efficiency, and cost limitations of classical AI.
- Key innovations include quantum word embeddings, quantum recurrent neural networks and tensor networks, offering competitive performance with reduced computational resources.
- The company highlights quantum computing’s potential to make AI more energy-efficient and sustainable, which could prompt a transformative shift in how AI models are developed and deployed.
Quantum computers will provide artificial intelligence with a transformative boost by addressing the inefficiencies and limitations of today’s classical systems, according to a blog post from Quantinuum. The company’s years of research and experimentation in this area, led by figures such as Professor Stephen Clark, Head of AI, who has worked in the field for decades, suggest that quantum systems could reduce energy costs, improve scalability, and enable fundamentally new approaches to natural language processing (NLP).
Clark and Quantinuum’s Head of Algorithms, Professor Harry Buhrman, are spearheading the effort to build what would be the world’s first Generative Quantum AI system (“Gen QAI”), which must run on a quantum computer that cannot be simulated classically.
Current large language models (LLMs), like ChatGPT, rely on immense computational resources to train and operate, the team writes in the post. Training GPT-3 alone consumed nearly 1,300 megawatt-hours of electricity – equivalent to the annual energy use of 130 average U.S. homes. These systems also often require thousands of specialized processors to handle datasets with billions of parameters.
“Despite these challenges, the push to develop ever-larger models shows no signs of slowing down,” the team writes.
That trend cannot continue unless today’s AI systems are retooled or reimagined.
“Enter quantum computing,” the team writes.
By leveraging principles of quantum mechanics, such as entanglement and interference, quantum systems can address these issues in ways classical computers cannot. Quantinuum argues that quantum approaches could reshape AI by dramatically lowering operational costs and enabling scalable growth.
Quantinuum researchers have increased the frequency and depth of disclosure about their work in AI over the past few months, and the latest blog post, published today, states: “Quantum technology offers a more sustainable, efficient, and high-performance solution—one that will fundamentally reshape AI, dramatically lowering costs and increasing scalability, while overcoming the limitations of today’s classical systems.”
Reimagining NLP With Quantum Systems
Quantinuum has focused on rethinking machine learning techniques for NLP. Instead of merely porting classical methods to quantum hardware, the team is reimagining these approaches to take full advantage of quantum properties. It has long been recognized that only methods designed natively for quantum systems in this way stand to genuinely benefit from quantum computing.
Quantum systems are fundamentally different from classical systems, according to the post, and the team’s work leverages quantum phenomena to map models directly onto quantum architectures, opening the possibility of unique computational advantages.
In addition to Clark, who previously worked at DeepMind, the company’s research team includes Dr. Konstantinos Meichanetzidis, a specialist in quantum physics and AI. Together they are developing quantum-specific innovations in NLP, such as quantum word embeddings and quantum recurrent neural networks (RNNs).
Innovations in Quantum NLP
Quantinuum has spurred advances in the development of quantum word embeddings, which use complex numbers instead of the real-valued vectors employed in classical models like Word2Vec. In quantum mechanics, the state of a system is represented in a complex vector space, known as a Hilbert space. According to the blog, this approach provides richer representations that better capture the probabilistic and hierarchical structure of natural language.
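As a rough illustration of the concept, not Quantinuum’s actual method, the sketch below treats word embeddings as unit-norm complex vectors and scores similarity by squared overlap, the Hilbert-space analogue of cosine similarity between real-valued Word2Vec vectors; the words and dimensions are purely illustrative.

```python
import numpy as np

# Hypothetical sketch: word embeddings as normalized complex vectors,
# i.e., quantum states in a small Hilbert space.
rng = np.random.default_rng(0)

def random_complex_embedding(dim: int) -> np.ndarray:
    """Return a random unit-norm complex vector, analogous to a quantum state."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

cat = random_complex_embedding(4)
dog = random_complex_embedding(4)

# Similarity as the squared overlap |<cat|dog>|^2, the quantum analogue
# of cosine similarity between real-valued vectors.
similarity = np.abs(np.vdot(cat, dog)) ** 2
print(f"|<cat|dog>|^2 = {similarity:.3f}")

# Unlike real vectors, complex amplitudes also carry relative phases,
# extra degrees of freedom that can encode additional structure.
print("phases of |cat>:", np.round(np.angle(cat), 3))
```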
Quantum RNNs Get Rave Reviews
Another innovation is the above-mentioned quantum RNNs. Classical RNNs are commonly used for tasks like text classification and language modeling. Quantinuum’s quantum version, developed using parameterized quantum circuits (PQCs), achieves competitive performance with far fewer computational resources. For instance, the team used a quantum RNN to classify movie reviews as positive or negative, achieving results comparable to classical models with just four qubits.
“This result is notable for two reasons: it shows that quantum models can achieve competitive performance using a much smaller vector space, and it demonstrates the potential for significant energy savings in the future of AI,” the team writes.
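The post does not detail the circuit, but the general technique of parameterized quantum circuits can be shown with a minimal two-qubit classifier, simulated here in plain numpy; every gate choice, feature, and parameter value below is an assumption for illustration, not Quantinuum’s quantum RNN.

```python
import numpy as np

def ry(theta: float) -> np.ndarray:
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT entangling gate on two qubits (control = qubit 0).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def pqc_expectation(features, params) -> float:
    """Encode two features as rotations, apply a trainable rotation layer
    plus a CNOT, and return <Z> on qubit 0 as an output in [-1, 1]."""
    state = np.zeros(4)
    state[0] = 1.0                                             # start in |00>
    state = np.kron(ry(features[0]), ry(features[1])) @ state  # data encoding
    state = np.kron(ry(params[0]), ry(params[1])) @ state      # trainable layer
    state = CNOT @ state                                       # entangle the qubits
    z0 = np.kron(np.diag([1.0, -1.0]), np.eye(2))              # Z observable on qubit 0
    return float(state @ z0 @ state)

# Hypothetical usage: the sign of <Z> serves as a positive/negative label;
# in training, params would be tuned to minimize classification error.
out = pqc_expectation(features=[0.3, 1.2], params=[0.5, -0.7])
print("prediction:", "positive" if out > 0 else "negative", f"(<Z> = {out:.3f})")
```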
The company also collaborated with Amgen to apply quantum techniques to peptide classification, a critical task in designing therapeutic proteins. Using its System Model H1 quantum processor, Quantinuum achieved performance comparable to classical systems, marking a significant step toward practical applications in computational biology.
Quantum Transformers and Tensor Networks
Quantinuum is also exploring quantum versions of the transformer, the model architecture that has revolutionized classical NLP.
Transformers are a type of machine learning model designed to process and understand large amounts of text by focusing on the relationships between words in a sentence, enabling applications like translation and text generation. They are the architecture that powers large language models (LLMs) like ChatGPT.
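For reference, the core of that classical architecture, scaled dot-product self-attention, fits in a few lines of numpy; the dimensions and random (untrained) weights below are toy values for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

def self_attention(X: np.ndarray) -> np.ndarray:
    """Scaled dot-product self-attention over word vectors X of shape
    (sequence_length, d_model), with random untrained weights."""
    d = X.shape[1]
    Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)                  # pairwise word-word relevance
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over each row
    return weights @ V                             # each word becomes a weighted mix

X = rng.normal(size=(5, 8))      # e.g., a 5-word sentence with 8-dim embeddings
print(self_attention(X).shape)   # (5, 8): one updated vector per word
```

A quantum transformer such as Quixer must realize this kind of computation natively on quantum hardware rather than through GPU matrix arithmetic.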
While classical transformers rely on parallelism provided by GPUs, the quantum version, named “Quixer,” is optimized for quantum hardware. In initial studies, Quixer demonstrated competitive results on realistic language modeling tasks.
The company is also investigating quantum tensor networks for NLP. Tensor networks efficiently represent high-dimensional data and are well-suited to the structure of quantum systems.
One way to think of tensor networks is as a filing system for complex information. Instead of storing a massive pile of scattered papers (the data points) in one place, a tensor network breaks the information into smaller, manageable folders, called tensors, and connects them through labeled links.
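To make the filing-cabinet picture concrete, here is a hypothetical numpy sketch of the standard tensor-network construction: splitting one six-index tensor into a chain of small linked tensors (a matrix product state) via successive singular value decompositions. It illustrates the data structure only, not the team’s NLP pipeline.

```python
import numpy as np

# A toy "filing system": decompose one big tensor into a chain of small
# linked tensors (a matrix product state) using successive SVDs.
rng = np.random.default_rng(2)

def to_mps(tensor: np.ndarray, phys_dim: int = 2) -> list:
    """Split an n-index tensor into n linked 3-index tensors."""
    n = tensor.ndim
    mps, rest, left_bond = [], tensor.reshape(1, -1), 1
    for _ in range(n - 1):
        rest = rest.reshape(left_bond * phys_dim, -1)
        U, S, Vt = np.linalg.svd(rest, full_matrices=False)  # exact, no truncation
        mps.append(U.reshape(left_bond, phys_dim, -1))       # one small "folder"
        left_bond = U.shape[1]
        rest = np.diag(S) @ Vt                               # pass the remainder along
    mps.append(rest.reshape(left_bond, phys_dim, 1))
    return mps

big = rng.normal(size=(2,) * 6)   # 2^6 = 64 numbers in one block
mps = to_mps(big)

# Contract the chain back together to confirm nothing was lost.
out = mps[0]
for t in mps[1:]:
    out = np.tensordot(out, t, axes=1)           # join neighboring links
print(np.allclose(out.reshape(big.shape), big))  # True
print([t.shape for t in mps])                    # the small connected tensors
```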
The team’s experiments using these networks achieved comparable performance to classical baselines, making them a scalable option for NLP tasks.
They write: “Since quantum theory is inherently described by tensor networks, this is another example of how fundamentally different quantum machine learning approaches can look – again, there is a sort of ‘intuitive’ mapping of the tensor networks used to describe the NLP problem onto the tensor networks used to describe the operation of our quantum processors.”
Progress in these areas will be essential for quantum computing to have its “ChatGPT” moment.
Energy Efficiency and Future Potential
According to the blog, one of the most promising aspects of quantum AI is its potential for energy efficiency. Quantinuum recently published results showing that its quantum system consumed 30,000 times less energy than a classical supercomputer when performing a random circuit sampling task. The team anticipates similar energy savings as quantum AI models scale up.
Another advantage lies in the reduced number of parameters required for quantum models. Classical systems often rely on billions of parameters, driving up computational costs. Quantum models, by contrast, achieve similar performance with far fewer parameters, thanks to their ability to leverage quantum mechanics.
The team acknowledges that these are early days for quantum AI, but says it is accumulating evidence not only of the technology’s advantages but also of how to tap into them, such as by making AI more energy-efficient and more sustainable.
The team reports to Founder and Chief Product Officer Ilyas Khan, who said: “We have no doubt that we are still at the early stages of rolling out complete systems, but the evidence as it relates to energy efficiency alone is worth getting excited about. As we have previously mentioned, we are running very fast towards being able to harness quantum computers with AI systems and classical hardware, specifically HPC, to deliver material value in the short term. In order to do this we need quantum computers that cannot be simulated classically, and we are the only firm who possesses such a machine.”
Looking Ahead
Quantinuum views its work as the beginning of a transformative era for AI. As quantum hardware improves, the company expects quantum AI models to complement or even replace classical systems. By combining quantum properties like superposition and entanglement with machine learning, these models could tackle complex problems more efficiently and sustainably.
Because progress in quantum AI depends on better hardware, access to Quantinuum’s continually improving machines gives the company a direct advantage, they write.
“The work being done by Quantinuum reflects the start of the next chapter in AI,” the post concludes. Additionally, Khan stated: “We have been quietly signalling the dramatic shift in the impact of quantum computing on AI, and the advent of Gen QAI will be exciting and impactful.”
With advances in quantum NLP and energy-efficient computing, tomorrow’s AI systems will likely rely on quantum technology, according to the team, which adds: “The future of AI now looks very much to be quantum.”
For a deeper look, read the company blog post.