HP Chief Architect Recalibrates Timeline for Practical Quantum Computing From Generations to Within a Decade


Insider Brief

  • Hewlett Packard Labs is re-engaging in quantum computing to tackle real-world industrial problems using a holistic hybrid approach with partners, according to the company’s chief architect.
  • They have reduced the computational requirements for complex chemical modeling, such as benzyne, from 100 million qubits running for 5,000 years to 1 million qubits running for one year.
  • Hewlett Packard Labs’ advances could make practical solutions feasible within the next 10 years instead of generations.
  • Significant challenges remain in scaling qubits and managing error rates, but there’s potential for quantum computing to intersect with AI, possibly offering novel acceleration methods for machine learning tasks.
  • Image: HP

Hewlett Packard Labs is re-engaging with quantum computing to tackle practical industrial problems, focusing on simulating quantum systems to address challenges in chemistry and physics, according to the company’s chief architect.

After putting an earlier quantum project on the back-burner over a decade ago, the company is leveraging its supercomputing capabilities to make quantum computing more accessible and applicable.

“We’re not going to dust off our old nitrogen-qubit project—although that work is still ongoing and is one of about six potential quantum modalities,” Kirk Bresniker, Chief Architect at Hewlett Packard Labs, told Frontier Enterprise. “Instead, we’re focusing on a new opportunity.”

Approximately 18 years ago, Hewlett Packard Labs began exploring a low-level qubit using a nitrogen vacancy in a diamond lattice, Bresniker told the technology news site. The process involved implanting a nitrogen atom into a lab-grade diamond chip to create an artificial atom that could function as a qubit. However, the project was shelved 12 years ago due to unclear paths to enterprise value.

“Again, Hewlett Packard Labs has always focused on linking research with business outcomes,” Bresniker said in the interview with Frontier Enterprise. The team shifted its focus to photonics, recognizing the critical role of efficient data movement in enterprise applications.

The company’s acquisition of SGI and Cray four years ago expanded its role in supercomputing, leading to renewed interest in quantum computing from customers.

“As we designed and delivered exascale supercomputers to customers from those acquisitions, they expressed interest in what we were doing and asked, ‘For the next system, where do we integrate the qubits?’” Bresniker said.

Hewlett Packard Labs is now adopting a holistic co-design approach, partnering with other organizations developing various qubits and quantum software. The aim is to simulate quantum systems to solve real-world problems in solid-state physics, exotic condensed matter physics, quantum chemistry, and industrial applications.

“What is it like to actually deliver the optimization we’ve been promised with quantum for quite some time, and achieve that on an industrial scale?” Bresniker posed. “That’s really what we’ve been devoting ourselves to—beginning to answer those questions of where and when quantum can make a real impact.”

One of the initial challenges the team tackled was modeling benzyne, an exotic chemical derived from the benzene ring. “When we initially tackled this problem with our co-design partners, the solution required 100 million qubits for 5,000 years—that’s a lot of time and qubits,” Bresniker told Frontier Enterprise. Given that current quantum machines offer only tens or hundreds of qubits, this was an impractical solution.

By employing error correction codes and simulation methodologies, the team significantly reduced the computational requirements.

“We reduced that requirement to 1 million qubits for one year—a 500,000x reduction,” he said. “Now, instead of talking about a solution that might take generations, we’re looking at something that could happen in the next 10 years.”
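As a quick sanity check of that figure, the reduction can be read as a qubit-time product, an interpretation assumed here purely for illustration rather than anything Bresniker spelled out:

```python
# Rough check of the quoted reduction, treating cost as a qubit-time product
# (an illustrative assumption, not a stated methodology).
before = 100_000_000 * 5_000   # 100 million qubits for 5,000 years -> 5e11 qubit-years
after = 1_000_000 * 1          # 1 million qubits for one year      -> 1e6 qubit-years
print(before / after)          # 500000.0, matching the 500,000x figure Bresniker cites
```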

The approach involves breaking down problems and assigning different parts to the most suitable computational resources.

“Some parts might be perfect for a GPU, some for a superconducting qubit, and others for a trapped ion qubit,” Bresniker explained. “By combining quantum processing units as accelerators with traditional classical supercomputing, we create a hybrid environment—that’s the essence of holistic co-design.”
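A minimal sketch of what that kind of routing could look like. The backend names and the dispatch rule below are hypothetical stand-ins for whatever Labs and its co-design partners actually use:

```python
# Illustrative sketch of the "holistic co-design" idea: route each sub-problem to
# whichever backend suits it best. Backend names and the dispatch rule are
# hypothetical; real hybrid schedulers will differ.

def dispatch(subproblem):
    """Pick a backend for one piece of a decomposed workload (illustrative only)."""
    if subproblem["kind"] == "dense_linear_algebra":
        return "gpu"                    # classical accelerator
    if subproblem["kind"] == "fast_gate_fragment":
        return "superconducting_qpu"    # fast gates, shorter coherence
    if subproblem["kind"] == "long_coherence_fragment":
        return "trapped_ion_qpu"        # slower gates, longer coherence
    return "cpu"                        # default to the classical host

workload = [
    {"id": 0, "kind": "dense_linear_algebra"},
    {"id": 1, "kind": "fast_gate_fragment"},
    {"id": 2, "kind": "long_coherence_fragment"},
]

for part in workload:
    print(part["id"], "->", dispatch(part))
```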

Despite the progress, several roadblocks hinder the commercialization of quantum computing.

“The real question with quantum is coherence time—how long can a qubit remain functional before it goes, ‘poof!’ Then, there are the error rates,” Bresniker said in his discussion with Frontier Enterprise. Managing the probabilistic nature of qubits and their tendency to interfere with each other due to entanglement adds to the complexity.
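A toy model makes the stakes concrete: if per-gate errors are independent and decoherence is exponential (simplifying assumptions, not a claim about any particular hardware), the chance of an uncorrected qubit surviving a deep circuit collapses quickly, which is why coherence time and error correction dominate the discussion.

```python
import math

# Toy model: probability that a single qubit completes a circuit without error,
# assuming independent per-gate errors and exponential decoherence.
# All parameter values are illustrative assumptions.
gate_error = 1e-3      # per-gate error rate
gate_time_us = 0.1     # time per gate, microseconds
t2_us = 100.0          # coherence (T2) time, microseconds

for depth in (10, 1_000, 100_000):
    p_gates = (1 - gate_error) ** depth                     # survives every gate
    p_coherence = math.exp(-depth * gate_time_us / t2_us)   # has not decohered
    print(depth, p_gates * p_coherence)
```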

Scaling the number of qubits is another significant challenge.

“You might have 32 qubits inside a vacuum chamber the size of a phone, but we don’t need 32; we need a million,” he said.

Technologies like superconducting qubits require cooling to four millikelvin in a dilution refrigerator, presenting substantial engineering hurdles.

“Each technology presents significant engineering challenges, and that’s even before we consider the integration of control systems over the course of a calculation, which could still take weeks, months, or even years,” Bresniker told Frontier Enterprise.

According to Bresniker, one area of particular interest, and one under active research, is the potential impact of quantum computing on artificial intelligence.

“You can do linear algebra on a quantum system, and there are HHL algorithms that support this. Now, is it better? That’s still an open research question,” he said.
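For reference, the HHL algorithm targets the linear system Ax = b, preparing a quantum state proportional to the solution rather than returning the vector itself. The classical solve below, using an arbitrary example matrix, is only the problem statement, not the quantum algorithm:

```python
import numpy as np

# The problem HHL targets: given Hermitian A and vector b, obtain properties of
# x = A^{-1} b. HHL prepares a quantum state proportional to x; this classical
# solve is just the reference point. A and b here are arbitrary examples.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])          # Hermitian, well-conditioned
b = np.array([1.0, 0.0])

x = np.linalg.solve(A, b)
x_state = x / np.linalg.norm(x)      # the normalized vector HHL would encode as amplitudes
print(x, x_state)
```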

He also pointed to the escalating costs and resources required to train large AI models.

“If you draw that curve, showing the linear increase in parameter count causing an exponential rise in resources needed to train a model, you can see that in about three or four years, the cost to train a single model could surpass what we currently spend on global IT,” Bresniker said. “That’s unlikely to happen—we’ll hit a hard ceiling.”
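The shape of that argument can be sketched with made-up numbers; the starting cost, growth rate, and ceiling below are illustrative assumptions, not Bresniker's figures. Exponential growth in training cost crosses any fixed spending ceiling within a handful of doublings.

```python
# Illustrative extrapolation of Bresniker's point: if resources grow exponentially
# with model size, cost crosses any fixed ceiling quickly. Every number below is an
# assumption chosen for the shape of the argument, not real data.
cost = 1.0e8            # assumed training cost today, dollars
growth_per_year = 10.0  # assumed yearly multiplier on training cost
ceiling = 5.0e12        # assumed rough ceiling, on the order of global IT spend

year = 0
while cost < ceiling:
    year += 1
    cost *= growth_per_year
print("ceiling crossed after ~", year, "years")   # ~5 years with these assumptions
```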

The future may involve making GPUs more efficient, developing application-specific accelerators, or possibly utilizing quantum computing.

Another intriguing area of exploration is the intersection of quantum computing and machine learning.

“What’s interesting for us is something we might call quantum machine learning, but it’s not about using quantum processors to run today’s conventional machine learning algorithms,” Bresniker said in the interview. “It’s more about asking, ‘Can we train a machine learning algorithm to model quantum systems—systems that obey the laws of quantum mechanics—without actually needing to create a qubit?’”
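One way to read that idea in miniature: fit a classical model to data from an exactly solvable quantum system, then use the fit in place of the simulation. The single-qubit Hamiltonian and polynomial fit below are stand-ins chosen for illustration, not anything Labs has described:

```python
import numpy as np

# Toy version of "learn a quantum system without a qubit": fit a classical surrogate
# to the ground-state energy of H(h) = -Z - h*X, whose exact answer is -sqrt(1 + h^2).
# The system and the surrogate are illustrative choices.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def ground_energy(h):
    return np.linalg.eigvalsh(-Z - h * X)[0]   # exact diagonalization of the 2x2 system

h_train = np.linspace(0.0, 2.0, 21)
e_train = np.array([ground_energy(h) for h in h_train])

surrogate = np.polyfit(h_train, e_train, deg=4)   # the "machine-learned" stand-in model

h_test = 1.3
print(np.polyval(surrogate, h_test), ground_energy(h_test))  # surrogate vs exact
```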

For Hewlett Packard Labs, this could represent one of the most promising intersections between quantum computing and AI.

“For us, that could be one of the more interesting intersections,” he said.

Matt Swayne

With a background in journalism and communications spanning several decades, Matt Swayne has worked as a science communicator for an R1 university for more than 12 years, specializing in translating high tech and deep tech for a general audience. He has served as a writer, editor and analyst at The Quantum Insider since its inception. In addition to his work as a science communicator, Matt develops and teaches courses to improve the media and communication skills of scientists. [email protected]
