NTT Focuses on Light For Cleaner, Scalable Path to Quantum Computing


Insider Brief

  • NTT is advancing a light-based approach to quantum computing as a cleaner, more scalable alternative to today’s energy-intensive architectures.
  • The company is developing photonics-driven systems that use optical circuits to reduce power consumption and improve stability compared with superconducting designs.
  • NTT’s strategy positions photonic quantum technologies as a pathway toward commercially viable quantum platforms that can scale without the heavy infrastructure requirements of current systems.

Japan is advancing a room-temperature quantum computing strategy built on light rather than electricity, a move that researchers say could reshape the race for faster, cleaner and more scalable machines.

Japanese technology conglomerate NTT, working with quantum developer OptQC, is promoting optical quantum computing as a lower-power alternative to the ultra-cold, energy-hungry systems pursued in the U.S. and China, according to reporting by Fast Company. The companies argue that photons — the basic particles of light — can provide the stability, speed and manufacturability needed for computers that may one day outperform today’s most advanced artificial intelligence.

Japan is positioning this photonic-first approach as part of a broader national strategy. While the United States and China build increasingly complex hardware that depends on deep-cryogenic refrigeration and exotic materials, Japan is framing its model as a simpler, more energy-efficient route. The effort aligns with growing public pressure to reduce the power consumption of data centers, AI infrastructure, and next-generation telecom networks.


NTT has spent the past several years developing an optical architecture under its Innovative Optical and Wireless Network program, a long-term initiative to overhaul how computing and communication infrastructure is built. The central idea is straightforward: use light instead of electrical signals to perform calculations. Because photons generate little heat and travel without resistive losses, NTT contends that optical systems can operate at room temperature without the massive cooling equipment found in other quantum machines.

NTT is now moving this work into a coordinated five-year push with OptQC. According to OptQC, the plan begins with joint technical studies, early hardware-software codesign, and identifying initial use cases with outside partners. The companies expect to build full development environments in year two before moving into enterprise testing in year three.

The final phase, scheduled for the end of the decade, focuses on scaling machines into the million-qubit range and preparing them for commercial deployment, according to Shingo Kinoshita, SVP and head of R&D planning at NTT.

“The 2030 vision of 1 million qubits is not just about performance, it’s about redefining how we align advanced computing with planetary limits,” Kinoshita said, as reported by Fast Company. “In the near term, as we aim for 10,000 qubits by 2027, the first impact will be within NTT’s own communications infrastructure.”

The roadmap rests on a series of recent scientific advances within Japan’s research ecosystem. Over the past year, NTT and a coalition including RIKEN, Fixstars Amplify, the University of Tokyo and the National Institute of Information and Communications Technology demonstrated what they describe as the world’s first general-purpose optical quantum computing platform that operates without external cooling. The system fits inside a single room, a milestone that even some of the world’s most advanced superconducting and atomic quantum systems have not yet reached.

Japan’s progress arrives as demand for computational power continues to surge. Modern AI models require enormous processing capacity for simulation, pattern recognition, optimization, and training. Classical semiconductor-based systems rely on electrons moving through circuits, a process that generates heat and demands large power budgets. NTT and OptQC argue that photonic quantum machines can act as accelerators for both AI and future telecom networks such as 6G by performing certain high-dimensional calculations more efficiently.

The core technical challenge remains scale. Quantum computing depends on qubits, the quantum analog of bits, which can exist in multiple states at once. This property allows quantum machines to explore many possible answers simultaneously, a capability that grows exponentially as more qubits are added. Researchers generally agree that machines with thousands of high-quality qubits are required to surpass today’s best supercomputers in meaningful ways. Reaching a commercially practical level likely requires around one million physical qubits, a figure that accounts for the redundancy needed to correct the constant errors individual qubits make.
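The scaling arithmetic above can be sketched in a few lines of Python. The roughly 1,000-to-1 physical-to-logical overhead used here is a commonly cited ballpark for error-correcting codes, not a figure from NTT or OptQC:

```python
def state_space_dimension(n_qubits: int) -> int:
    """An n-qubit register spans 2**n basis states, so the space a
    quantum machine can explore grows exponentially with qubit count."""
    return 2 ** n_qubits


def physical_qubits_needed(logical_qubits: int, overhead: int = 1000) -> int:
    """Error correction encodes each logical qubit redundantly across many
    physical qubits. An overhead of ~1,000 physical qubits per logical
    qubit is an illustrative assumption, not a vendor specification."""
    return logical_qubits * overhead


# Adding one qubit doubles the state space:
assert state_space_dimension(51) == 2 * state_space_dimension(50)

# About a thousand useful logical qubits, at this assumed overhead,
# lands near the million-qubit figure in NTT's 2030 roadmap:
print(physical_qubits_needed(1000))  # 1000000
```

The takeaway is that the million-qubit target is driven less by raw computing demand than by the redundancy that error correction imposes.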

NTT claims that photons change the equation by avoiding some of the thermal and material constraints faced by matter-based systems such as superconducting qubits, trapped ions, and neutral atoms. Photons do not require refrigeration to near-absolute zero. They are less sensitive to environmental noise. And NTT argues that optical components can be manufactured at scale using existing photonics supply chains. Company researchers told Fast Company that developing reliable quantum light sources and improving precision fabrication yields will be critical to hitting their targets.

Still, experts caution that the challenges of photonic quantum computing are significant. Fast Company reports that industry researchers point to the need for near-perfect photon sources and detectors, as well as mechanisms that allow photons to interact with one another. Photons do not interact naturally, so effective interactions must be engineered through complex optical elements. Building fault-tolerant machines with thousands of qubits would demand engineering precision well beyond what has been demonstrated so far.

Japan’s strategy stands out partly because it avoids the power-intensive infrastructure common in U.S. and Chinese systems. Superconducting quantum computers, such as those developed by IBM and Google, must operate inside dilution refrigerators that cool hardware to colder temperatures than outer space. Neutral-atom platforms and trapped-ion approaches require ultra-high vacuum chambers, laser arrays, and electromagnetic traps. All of these systems face difficult manufacturing and scaling challenges, even as they demonstrate increasing performance on small-scale problems.

NTT and OptQC argue that optical systems simplify many of these constraints. By using photons as qubits and performing logic through controlled interference, the hardware can operate in more familiar settings. The companies also point to Japan’s long-standing expertise in optical components, fiber networks, and precision manufacturing as strategic advantages.

If successful, the photonic roadmap may help Japan carve out a distinctive role in the global quantum race. Rather than competing head-to-head with cryogenic and atomic systems, Japan is promoting an approach centered on energy efficiency, manufacturability, and integration into existing telecom infrastructure. The companies see potential applications in drug discovery, materials development, financial optimization, and climate modeling — all areas where quantum speedup could deliver value long before general-purpose machines arrive.

The plan also reflects a growing debate over how quantum computing should evolve. Some researchers advocate for specialized, domain-specific machines that target optimization or simulation. Others push for universal quantum computers capable of performing any quantum algorithm. NTT’s roadmap suggests a middle path: build room-temperature optical machines that can scale quickly, then apply them to enterprise problems that require large numbers of qubits but may not need full universality.

For now, the company must demonstrate that its claims about manufacturability, stability, and scale can survive real-world demands. The timeline is ambitious. No quantum computing platform — photonic or otherwise — has yet reached fault-tolerant operation, let alone the million-qubit threshold needed for commercial use. And experts say optical systems face unresolved physics and engineering questions that could slow momentum.

But Fast Company reports that NTT believes the energy costs of AI and telecom networks will force the industry to rethink its dependence on matter-based architectures. The company is betting that light, not electrons, will offer a sustainable path forward.

“Today, the energy footprint of AI is emerging as a global challenge. Optical quantum computing processes information with light, enabling dramatically lower power consumption and scalable qubit growth through optical multiplexing,” Kinoshita told Fast Company.


Greg Bock

Greg Bock is an award-winning investigative journalist with more than 25 years of experience in print, digital, and broadcast news. His reporting has spanned crime, politics, business and technology, earning multiple Keystone Awards and Pennsylvania Association of Broadcasters honors. Through the Associated Press and Nexstar Media Group, his coverage has reached audiences across the United States.
