
Why selling quantum computing is easier than selling a quantum computer

With hundreds of start-ups in the emerging field of quantum computing, alongside giants like IBM, Microsoft, Amazon and Google, the prospect of generating revenue from what is essentially a research and development endeavor is daunting. Publicly traded quantum start-ups such as IonQ and D-Wave Systems, which do not have non-quantum lines of business to subsidize their investments, likely feel this pressure more acutely.

Monetizing quantum technologies is difficult as current systems have limited useful capabilities. Significant advancements have been made in extending coherence times, decreasing error rates and increasing qubit counts; these developments influence product road maps for future systems, but the promise of tomorrow has little impact on what is available now.

Selling quantum computing on a large scale is possible, even when quantum computers are not yet advanced enough to be used in production. Doing so requires a significantly different approach from hardware and software purveyors, as well as industry-wide coordination — and restraint — in communicating the value of quantum technologies.

For quantum to succeed, the rhetoric around it must be demystified

The general public’s understanding of quantum computers’ utility and purpose remains abysmal. Prospective buyers are only moderately better off, despite the marketing efforts of manufacturers. Press releases touting the achievement of “quantum supremacy” are as much a rite of passage as they are unhelpful to the cause; setting aside the unfortunate implications of the word supremacy, the underlying claim has been so frequently repeated and debunked that the premise itself is a thought-terminating cliché.

Explaining the value of quantum computers to prospective customers requires first demystifying the science behind the technology — soundbites such as Einstein’s description of quantum entanglement as “spooky action at a distance” are neither relevant nor helpful in explaining the value of an error-corrected quantum computer. Likewise, inflating the practical ability of near-term quantum computers with subjective milestones undermines the impact that higher qubit counts, robust error correction, higher qubit fidelity and longer coherence times will ultimately deliver.

Quantum hardware manufacturers should publish as many common benchmarks as possible. Although individual metrics provide interesting data points, synthetic benchmarking allows for progress to be measured and tracked over time. Quantum and classical synthetic benchmarks alike may not show the full value or ability of a given system, but this should not be used to dismiss existing standards. Introducing novel, company-specific benchmarks while ignoring the standards that competitors use would make comparison all but impossible.
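
To make the idea of a common, reproducible benchmark concrete, the sketch below estimates the heavy-output probability at the core of IBM’s Quantum Volume protocol, running against a noiseless simulator. This is a minimal sketch assuming Qiskit and qiskit-aer are installed, not a rigorous benchmark: a real Quantum Volume measurement averages over many random circuits on real hardware and applies statistical confidence bounds.

```python
# Minimal sketch of the heavy-output test behind Quantum Volume.
# Assumes qiskit and qiskit-aer are installed; a rigorous QV run
# averages over many random circuits and applies confidence bounds.
import numpy as np
from qiskit import transpile
from qiskit.circuit.library import QuantumVolume
from qiskit.quantum_info import Statevector
from qiskit_aer import AerSimulator

def heavy_output_probability(num_qubits: int, shots: int = 2000, seed: int = 7) -> float:
    circuit = QuantumVolume(num_qubits, depth=num_qubits, seed=seed)

    # Ideal output distribution from a noiseless statevector simulation;
    # "heavy" outputs are those above the median ideal probability.
    probs = Statevector(circuit).probabilities_dict()
    median = np.median(list(probs.values()))
    heavy = {bits for bits, p in probs.items() if p > median}

    # Sampled counts from the backend under test (here, a noiseless simulator).
    measured = circuit.copy()
    measured.measure_all()
    backend = AerSimulator()
    counts = backend.run(transpile(measured, backend), shots=shots).result().get_counts()

    # Fraction of shots that landed in the heavy-output set.
    return sum(n for bits, n in counts.items() if bits in heavy) / shots

# The QV protocol requires this value to exceed 2/3 with confidence.
print(heavy_output_probability(4))
```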

For quantum computing to be understood, it must be contextualized

Comparing successive quantum computers from one manufacturer, and models from competing companies, is important for characterizing progress. But this alone does not convey to prospective buyers what quantum computers can do, or how they differ from classical computers. Oft-repeated examples — the difference between bits and qubits, the utility of entanglement and so on — explain how quantum computing differs from classical computing, though these explanations are often more conceptual than concrete.
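
To turn one of those conceptual examples into something concrete: the short sketch below (again assuming Qiskit and qiskit-aer) prepares two qubits in a Bell state. Each qubit measured alone looks like a fair coin flip, yet the two results always agree — a correlation with no two-bit classical counterpart.

```python
# A two-qubit Bell state: the canonical concrete example of entanglement.
# Assumes qiskit and qiskit-aer are installed.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

circuit = QuantumCircuit(2)
circuit.h(0)      # put qubit 0 into an equal superposition of 0 and 1
circuit.cx(0, 1)  # entangle qubit 1 with qubit 0
circuit.measure_all()

backend = AerSimulator()
counts = backend.run(transpile(circuit, backend), shots=1000).result().get_counts()

# Only '00' and '11' appear: each qubit is individually random,
# yet the two measurement outcomes always agree.
print(counts)
```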

Historically, advances in classical computing have made computation cost-effective for new areas of mathematics. Each new modality expands the set of uses to which a computer can practically be applied, leading to new products and capabilities. These advances illustrate the difference — and therefore, the value — that quantum computers can provide.

Classical processors, like the CPUs in consumer and enterprise systems today, are effectively the descendants of adding machines. Since the introduction of the Intel 8086 microprocessor in 1978, various improvements have been added, including wider buses, faster clock speeds and floating-point arithmetic. But, independent of architecture, any given application running on a CPU is by volume mostly the same six instructions: add, subtract, load, store, compare and branch. Traditional CPUs are intentionally general purpose; they can perform almost any calculation accurately, but not necessarily quickly.

But CPUs are not particularly efficient at graphics processing, which requires greater parallelism and relies extensively on geometric calculations that CPUs are not tailored for. Demand for 3D graphics processing in business and entertainment led to the mass-market commercialization of GPUs in the mid-1990s. “Nice to have” features have been added over time, such as video encoding and decoding, texture mapping and ray tracing. Nvidia’s CUDA software made its GPUs popular for general-purpose workloads, opening the same underlying hardware to new markets; comparatively, AMD’s software stack is less versatile, and adoption of AMD GPUs beyond graphics processing is less prevalent.

Artificial intelligence (AI) and machine learning workloads were the primary beneficiaries of non-graphical computing on GPUs, though this has not been a perfect fit. Although these workloads can use the parallelism of GPUs, they typically rely on matrix and tensor calculus and depend more heavily on data locality than graphics processing does. Likewise, the texture-mapping extensions found in GPUs are of no use to AI or machine learning. In the mid-2010s, various approaches — such as Google’s Tensor Processing Unit and Graphcore’s Intelligence Processing Unit — emerged as hardware accelerators to rectify this.

Quantum computing is the next step in this progression of computational ability, making it the fourth pillar of computing. Quantum processors will handle classes of problems that are impractical to calculate on classical computers. The prospect of using Shor’s algorithm to factor large integers in pursuit of cracking encryption is often touted in security circles, but the impact of quantum computing extends beyond that. Applications such as solving linear systems of equations, mathematical optimization and boson sampling are thought to have implications for economics, engineering and pharmacology, as well as for the development of AI and machine learning generally.
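
To sketch why Shor’s algorithm threatens encryption: its quantum subroutine finds the period r of a^x mod N, and the factors of N are then recovered classically from that period. The toy below substitutes a brute-force classical search for the quantum step, so it illustrates the structure of the algorithm rather than its speed-up.

```python
# Toy illustration of the structure of Shor's algorithm for factoring N.
# The quantum speed-up comes entirely from the period-finding step; here
# it is replaced by brute-force classical search, so only the structure
# (not the advantage) is shown.
from math import gcd
from random import randrange

def find_period(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n); the quantum subroutine in Shor."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_factor(n: int) -> int:
    """Return a nontrivial factor of an odd composite n."""
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:
            return d            # lucky guess already shares a factor with n
        r = find_period(a, n)
        if r % 2:
            continue            # need an even period; retry with a new a
        x = pow(a, r // 2, n)
        if x == n - 1:
            continue            # trivial square root of 1; retry
        return gcd(x - 1, n)    # x^2 = 1 (mod n) with x != ±1 yields a factor

print(shor_factor(15))  # prints 3 or 5
```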

For quantum computers to be useful, quantum algorithms must be developed in parallel

The capability of any computer platform is determined by the quality and quantity of its software. For quantum hardware to be useful, quantum algorithms must be developed in parallel. This requires an investment of time and money, alongside a general idea of the problems that could be addressed with a quantum computer. Crucially, what it doesn’t require is expertise in quantum science; many manufacturer-specific and vendor-neutral tools are available to ease the adoption process.

Usefully managing these tools to ensure that success is a possible outcome requires overcoming institutional obstacles such as internal opposition, culture clashes and budget tightening. Taking a wait-and-see approach by definition cedes any potential first-mover advantage. In a wider view of the potential impact of quantum computing, there is no guarantee that a problem relevant to an industry will be solved without a company putting in some effort to solve it. Or, to indulge in a truism — you miss 100% of the shots you don’t take.

Although there is no panacea for institutional inertia, reassuring businesses that quantum technologies are worth developing today is a vital first step.

Fortunately, developing quantum software does not require hand-stitching circuits for a specific computer. Although it is still early days, initiatives such as the QIR Alliance are developing cross-platform solutions that aim to fully use the capabilities of quantum processors from different hardware manufacturers.
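
As a rough illustration of that write-once idea: QIR itself is an LLVM-based intermediate representation, but the same portability goal can be shown with OpenQASM 3, a separate text-based interchange format that several vendor toolchains accept. The sketch below (assuming Qiskit with its qasm3 module) defines a circuit once and serializes it into a program any compatible backend can consume.

```python
# Rough illustration of hardware-agnostic circuit interchange.
# QIR itself is an LLVM-based representation; OpenQASM 3 (used here via
# qiskit.qasm3) is a separate text-based format with similar portability aims.
from qiskit import QuantumCircuit, qasm3

circuit = QuantumCircuit(3, 3, name="ghz")
circuit.h(0)
circuit.cx(0, 1)
circuit.cx(1, 2)
circuit.measure(range(3), range(3))

# The serialized program can be handed to any backend that accepts
# OpenQASM 3, rather than being hand-stitched for one machine.
print(qasm3.dumps(circuit))
```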

Likewise, organizations looking to explore quantum computing can partner with external firms to gain quantum competency or obtain guidance in developing quantum skills internally.

 


James Sanders is principal analyst for quantum computing at CCS Insight.
