Haiqu And HSBC Research Team Encodes ‘Largest Financial Distributions to Date’ on Quantum Computers 

Insider Brief

  • Researchers from Haiqu and HSBC report on a method to encode large-scale financial data into quantum circuits using shallow and efficient designs.
  • The technique leverages matrix product states and Tensor Cross Interpolation to efficiently handle heavy-tailed financial distributions like Lévy distributions.
  • Their approach achieved encoding on IBM quantum processors with up to 64 qubits, demonstrating potential for applications in finance and other industries.

A team of Haiqu-led researchers has developed a new approach to encoding complex financial data into quantum circuits, pushing the limits of IBM’s quantum processors, according to a paper published on the pre-print server arXiv. The team reports that the method, which focuses on shallow and efficient circuit designs, could advance the use of quantum computing in finance and other industries.

In a recent LinkedIn post on the paper, the team writes: “Quantum computing cannot achieve wide utility in the near term until we can efficiently load classical data onto quantum hardware. Now, we can.”

Key Findings

The study, conducted in collaboration with HSBC, demonstrates how quantum processors can encode large-scale financial distributions like Lévy distributions, which model heavy-tailed and skewed data common in financial markets. In other words, financial data can include extreme events or outliers, such as sudden market crashes or large price jumps, that occur more often and less evenly than a normal (Gaussian) model would predict. These distributions are vital for risk assessment and portfolio optimization, where conventional models like Gaussian distributions often fail.

The researchers employed matrix product states (MPS), a mathematical framework that simplifies the representation of complex data. By improving on existing MPS techniques, they developed a scalable algorithm that produces circuits with a linear number of quantum gates, making them executable on near-term quantum devices.
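To illustrate the underlying idea (this is not the authors’ code; the grid size, truncation tolerance and Gaussian test function below are illustrative assumptions), a smooth probability distribution discretized on 2^n grid points can be compressed into an MPS by successive truncated singular value decompositions. The small bond dimensions that result from smooth inputs are what keep the corresponding circuits shallow, with a gate count that grows only linearly in the number of qubits:

```python
# Minimal sketch: compress a discretized, Gaussian-shaped function into an
# MPS with truncated SVDs and inspect the bond dimensions. Uses only numpy.
import numpy as np

n = 10                                   # qubits -> 2**n grid points
x = np.linspace(-5, 5, 2**n)
amps = np.exp(-x**2 / 2)                 # smooth, Gaussian-shaped test amplitudes
amps /= np.linalg.norm(amps)

def mps_from_vector(vec, n_sites, tol=1e-10):
    """Compress a length-2**n vector into an MPS via successive SVDs."""
    tensors, bond_dims = [], []
    rest = vec.reshape(1, -1)
    for _ in range(n_sites - 1):
        rows, cols = rest.shape
        rest = rest.reshape(rows * 2, cols // 2)
        u, s, vh = np.linalg.svd(rest, full_matrices=False)
        keep = max(1, int(np.sum(s > tol)))   # truncate tiny singular values
        tensors.append(u[:, :keep].reshape(rows, 2, keep))
        bond_dims.append(keep)
        rest = np.diag(s[:keep]) @ vh[:keep]
    tensors.append(rest.reshape(rest.shape[0], 2, 1))
    return tensors, bond_dims

_, bonds = mps_from_vector(amps, n)
print("bond dimensions:", bonds)   # small bond dimensions -> shallow circuits
```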

Their work represents the largest financial data encoding performed on IBM quantum processors to date, achieving up to 64 qubits.

Opening the Door to Solving Difficult Financial Analysis Problems With Quantum Computers

The ability to efficiently encode financial distributions into quantum circuits opens the door to quantum-enabled risk management, high-frequency trading and derivatives pricing. Lévy distributions, known for capturing extreme market events, can now be modeled more precisely, improving the accuracy of financial predictions and decision-making.

Beyond finance, the methodology could apply to other industries requiring high-dimensional data processing, such as climate modeling and healthcare analytics. Broadly, the researchers also highlighted the scalability of their approach, which could facilitate quantum computation of utility-scale problems across various sectors.

They write: “We emphasize that our algorithm does not require the explicit storage of the input, i.e. the potentially exponentially many discretized function values, in memory, but is able to construct the circuit directly by locally sampling from the probability density function (PDF) through the use of Tensor Cross Interpolation (TCI). Thus our method is both classical compute- and memory-efficient and scalable to very large qubit numbers.”
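As a rough sketch of that storage-free idea (using scipy’s levy_stable distribution and illustrative grid and parameter choices, not the authors’ implementation), the probability density can be exposed as a function of a bit string and evaluated only at the points a TCI solver requests, rather than stored as a full table of 2^n values:

```python
# Hedged sketch of the no-explicit-storage idea behind TCI: evaluate the
# discretized PDF only at requested indices instead of materializing the
# whole 2**n grid. Parameters and grid bounds are illustrative assumptions.
from scipy.stats import levy_stable

n = 40                        # 2**40 grid points -- far too many to store
x_min, x_max = -10.0, 10.0
alpha, beta = 1.7, 0.0        # example stability and skewness parameters

def pdf_at_bits(bits):
    """Map a tuple of n bits to a grid point and evaluate the PDF there."""
    idx = int("".join(map(str, bits)), 2)
    x = x_min + (x_max - x_min) * idx / (2**n - 1)
    return levy_stable.pdf(x, alpha, beta)

# A TCI routine would call pdf_at_bits on a modest number of carefully chosen
# bit strings to build the MPS; here we simply probe one sample point.
print(pdf_at_bits((0, 1) * (n // 2)))
```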

MPS Techniques

The study builds on MPS techniques, which approximate input functions with shallow quantum circuits. According to the researchers, MPS organizes data into smaller, manageable components, reducing the computational load. The team analyzed how the smoothness and localization of input functions affect the complexity of MPS representations, which helped them to design circuits with reduced depth.

A key innovation was the use of Tensor Cross Interpolation (TCI), a method that avoids storing large datasets in memory. Instead, TCI samples data locally, making the process memory-efficient and scalable. The researchers tested their circuits on IBM quantum devices, including the ibm_torino and ibm_marrakesh processors, and validated their results using statistical tests such as the Kolmogorov-Smirnov test.
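A minimal sketch of that last validation step, assuming a normal target distribution and synthetic samples standing in for values decoded from hardware measurements, might use scipy’s Kolmogorov-Smirnov test like this:

```python
# Hedged sketch of a Kolmogorov-Smirnov check: synthetic samples stand in
# for grid values decoded from measured bitstrings; the target is a
# standard normal distribution chosen for illustration.
import numpy as np
from scipy.stats import kstest, norm

rng = np.random.default_rng(0)
samples = rng.normal(loc=0.0, scale=1.0, size=4096)   # stand-in "measurements"

stat, p_value = kstest(samples, norm.cdf)
print(f"KS statistic = {stat:.4f}, p-value = {p_value:.3f}")
# A large p-value means we cannot reject that the sampled distribution
# matches the target, i.e. the encoded state reproduces the intended PDF.
```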

Real-World Results

The generated circuits performed well under real-world conditions. On devices with up to 25 qubits, the circuits accurately represented the target distributions, passing statistical benchmarks. For larger circuits with up to 64 qubits, the researchers observed qualitative agreement with theoretical expectations, despite the limitations of current quantum hardware.

The circuits were particularly effective in modeling Lévy distributions, which are commonly used to capture market phenomena like volatility clustering and extreme price movements. By leveraging their shallow circuit design, the team demonstrated the feasibility of encoding complex financial data on noisy intermediate-scale quantum devices.

Limitations

While promising, there’s still work to do, the researchers report. Current quantum processors are limited by noise, which can degrade the accuracy of results. The study found that deeper circuits, while theoretically more accurate, were susceptible to noise, emphasizing the need for shallow designs on today’s hardware.

Additionally, the method’s reliance on MPS and TCI techniques requires careful tuning to balance accuracy and computational resources. The researchers acknowledged that further work is needed to optimize these trade-offs and extend their methods to multivariate functions, which are common in real-world applications.

Future Directions

Based on these challenges, the researchers identified several avenues for future work. Extending their approach to multivariate functions could enable the modeling of more complex systems, such as multi-asset portfolios or climate simulations. They also highlighted the potential for new tensor network geometries to improve circuit efficiency and accuracy.

Another key area of focus is the impact of hardware noise. As quantum processors continue to evolve, understanding and mitigating noise effects will be critical for the practical implementation of these circuits.

Finally, the researchers aim to refine their methods for larger-scale problems, paving the way for commercial applications in finance and beyond. With continued development, the team suggests that this approach could become a cornerstone of quantum-enhanced data processing.

The research team included: Vladyslav Bohun, Illia Lukin, Mykola Luhanko, Mykola Maksymenko, and Maciej Koch-Janusz, all affiliated with Haiqu, Inc. Georgios Korpas represents HSBC Lab in Singapore, the Czech Technical University in Prague and the Athena Research Center in Greece. Philippe J.S. De Brouwer contributes from HSBC in Krakow, Poland, and Maciej Koch-Janusz also holds a position at the University of Zürich.

ArXiv is a pre-print server, which means the study has yet to be officially peer-reviewed. Because progress in the field moves swiftly, researchers often use arXiv to share results ahead of formal peer review. For a more technical investigation of the research, please read the paper on arXiv.

Matt Swayne

With a several-decades-long background in journalism and communications, Matt Swayne has worked as a science communicator for an R1 university for more than 12 years, specializing in translating high tech and deep tech for the general audience. He has served as a writer, editor and analyst at The Quantum Insider since its inception. In addition to his work as a science communicator, Matt develops and teaches courses to improve the media and communications skills of scientists. [email protected]
