Guest Post by Mark Haas, CEO and Co-Founder, Association for Enterprise Growth (AEG)
Years ago, I facilitated a global science and technology R&D gathering for a branch of the US military. Scientists and technologists from across the world each shared emerging research and applications in fields ranging from advanced materials to new forms of energy.
Some participants described how signals would move, how sensors would interpret them, how data would cascade through the system. As explanations unfolded, I realized that no one person in the room, whether experienced officers, seasoned engineers, or technology researchers, could fully grasp the whole picture. Each understood their piece, but the system itself felt bigger than anyone’s comprehension.
That memory returns as I watch the rise of quantum technologies. They represent the next stage of human ingenuity, astonishing in potential, unsettling in scope. And like that R&D gathering, they raise a profound question: how do we make strategic choices when the tools themselves stretch beyond human comprehension?

That gathering gave me a glimpse of what happens when technology starts to slip outside the bounds of ordinary human understanding: we could still operate the system, but we were no longer fully in control.
From the lever to the wheel, from fire to the steam engine, every great leap in technology expanded human capacity while leaving intact our sense of agency. We pulled the lever, turned the wheel, fed the fire, and shoveled coal. Cause and effect were visible and intuitive. We acted, and the world responded.
Quantum technologies, especially when combined with artificial intelligence, change that relationship. They promise astonishing power: encryption broken in moments, molecules or weather simulated with breathtaking accuracy, logistics problems solved as fast as they can be defined. But they also introduce something unsettling: outcomes produced by processes that not even experts can fully explain, much less ordinary citizens.
This is not just complexity. It is a potential loss of agency, the creeping sense that we are no longer directing the tool, but merely watching it work and hoping the outcome serves us well.
When Technology Outpaces Trust
History shows what happens when ethical foresight lags behind technical capability.
- Stem cell research in the 1990s ignited public outcry because ethical questions were addressed late, not early.
- AI deployment today faces adoption hurdles under the weight of bias, opacity, and algorithmic distrust.
- Nuclear power, for all its advances in cost and safety, remains haunted by public distrust rooted in both ethical and safety concerns.
And sometimes, a technology that seemed universally promising turned out to be dangerous or unwanted. Consider Australia in the 1930s. To solve a pest problem in sugarcane fields, scientists introduced cane toads from South America. The toads multiplied uncontrollably, poisoned native predators, and devastated ecosystems. What was intended as a clever biological fix became an ecological nightmare. The lesson is clear: interventions without foresight can spiral beyond human control.
Quantum runs the same risk if ethics is sidelined. What looks like progress in a lab may have cascading consequences in society.
Regulation: Trust or Acceleration?
In theory, regulation exists to build trust: to ensure new technologies are safe, accountable, and transparent. The FDA’s clinical trial system is designed not only to protect patients but also to reassure the public that drugs have been tested before release. Regulation as trust-building makes adoption durable.
But regulation can also be designed primarily to accelerate adoption. Nuclear power standards, for example, were often framed to support rapid rollout rather than to engage with public fears about waste or safety. Similarly, in AI today, some regulatory proposals arguably aim more to reassure investors than to create meaningful accountability. The result is that compliance boxes are ticked, but skepticism remains and, in some cases, increases.
For quantum, regulation must be more than a bureaucratic green light. It must address the core unease people feel about losing agency. Otherwise, it risks being perceived as rubber-stamping complexity the public doesn’t understand.
Ethical Principles for Emerging Technology
Based on past cycles, four ethical principles stand out for quantum and similar frontiers:
- Transparency with Limits. No one expects every user to understand quantum entanglement or error correction. But stakeholders need explanations that are accessible and honest. Partial transparency beats mystical opacity.
- Equitable Access. Technologies that widen divides provoke backlash. Where possible, creating open research pipelines, inclusive education, and shared infrastructure builds legitimacy.
- Security First. If quantum breaks encryption before defenses are ready, trust collapses. Post-quantum cryptography must be treated as urgent infrastructure, not a niche concern (a brief sketch of the hybrid approach follows this list).
- Adaptive Oversight. Governance must evolve with the technology. Static regulation will lag, as recent experience with AI and other fast-moving technologies has shown. Flexible oversight, built with input from scientists, policymakers, and ethicists, creates a living trust framework.
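To make the Security First point concrete, here is a minimal sketch of the hybrid key-derivation pattern commonly recommended during the migration period: derive the session key from a classical shared secret and a post-quantum shared secret together, so the key stays safe as long as either component holds. The sketch uses only the Python standard library; the random byte strings stand in for real key-exchange outputs (for example ECDH and ML-KEM), and the labels are illustrative assumptions, not a production recipe.

```python
import hashlib
import hmac
import os

def hkdf_extract(salt: bytes, input_key_material: bytes) -> bytes:
    """HKDF-Extract (RFC 5869): condense input keying material into a pseudorandom key."""
    return hmac.new(salt, input_key_material, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF-Expand (RFC 5869): stretch the pseudorandom key into the output key."""
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]), hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

# Placeholder secrets: in practice these would come from a classical key exchange
# (e.g. ECDH) and a post-quantum KEM (e.g. ML-KEM), respectively.
classical_secret = os.urandom(32)
post_quantum_secret = os.urandom(32)

# Hybrid derivation: an attacker must break BOTH components to recover the session key.
prk = hkdf_extract(salt=b"hybrid-kex-demo",
                   input_key_material=classical_secret + post_quantum_secret)
session_key = hkdf_expand(prk, info=b"session-key", length=32)
print(session_key.hex())
```

Making this kind of layered defense visible, rather than burying it, is what turns a security principle into a trust signal.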
These principles are not idealistic but pragmatic; they prevent the cane toad problem, where something meant to help instead becomes uncontrollable, and remind us that the real public conversation is not only “Can we build this?” but also “What happens when we do?”
Quantum Communication: Beyond Human Intuition
Among the most unsettling areas of quantum is communication. Entangled particles share correlations in ways that defy ordinary intuition, and protocols built on quantum states allow keys to be exchanged with security guaranteed by physics rather than by computational difficulty. Yet to the human mind the process feels like gibberish: signals flickering in a universe that refuses to explain itself.
The GibberLink protocol, for example, lets AI agents exchange information in a machine-optimized form that humans cannot follow. It unsettles people the way overhearing a conversation in a foreign language does: you know meaning is being exchanged, but you don’t know what is being said.
Quantum communication may create the same unease. When humans can no longer trace how information is sent or how decisions are made, trust will need to be earned differently. Many people won’t accept “just trust us” from technologists. They’ll demand frameworks that demonstrate responsibility even when comprehension is limited.
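One reason such frameworks are plausible is that quantum key distribution, for all its strangeness, is checkable: an eavesdropper leaves statistical fingerprints that the communicating parties can measure. The toy simulation below sketches the BB84 idea in ordinary Python; it is a purely classical caricature with illustrative numbers, not real quantum mechanics, but it shows why measuring in the wrong basis turns interception into a visible error rate (entanglement-based protocols rely on the same detectability logic).

```python
import secrets

def random_bits(n: int) -> list:
    return [secrets.randbelow(2) for _ in range(n)]

def bb84_sift(n: int = 2000, eavesdrop: bool = False):
    # Alice encodes random bits in randomly chosen bases (0 = rectilinear, 1 = diagonal).
    alice_bits = random_bits(n)
    alice_bases = random_bits(n)

    # An eavesdropper measuring in the wrong basis reads (and re-sends) a randomized bit,
    # and her guessed basis replaces the original one on the channel.
    channel = []
    for bit, basis in zip(alice_bits, alice_bases):
        if eavesdrop:
            eve_basis = secrets.randbelow(2)
            if eve_basis != basis:
                bit = secrets.randbelow(2)
            basis = eve_basis
        channel.append((bit, basis))

    # Bob measures in random bases; a wrong-basis measurement yields a random result.
    bob_bases = random_bits(n)
    bob_bits = [
        bit if bob_basis == basis else secrets.randbelow(2)
        for (bit, basis), bob_basis in zip(channel, bob_bases)
    ]

    # Sifting: keep only positions where Alice's and Bob's bases matched,
    # then compare to estimate the error rate.
    kept = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    errors = sum(alice_bits[i] != bob_bits[i] for i in kept)
    return len(kept), errors

for eavesdrop in (False, True):
    kept, errors = bb84_sift(eavesdrop=eavesdrop)
    print(f"eavesdropper={eavesdrop}: kept {kept} of 2000 bits, error rate {errors / kept:.1%}")
```

Without the eavesdropper, the sifted key agrees almost perfectly; with her on the channel, roughly a quarter of the sifted bits disagree. That is the kind of demonstrable, auditable evidence that can stand in for intuitive understanding.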
Ethics as Strategy
Ethics in quantum is not about slowing innovation. It is about building the conditions for innovation to thrive and be accepted. Organizations that integrate foresight early will move faster, not slower, because they won’t be paralyzed by backlash or blindsided by regulation.
That means:
- Making post-quantum cryptography a visible commitment, not a hidden fix.
- Designing for equitable access, through public-private partnerships and inclusive workforce pipelines.
- Embedding transparency frameworks that make processes explainable to stakeholders, even if not intuitive to users.
- Balancing national security with scientific collaboration so quantum is not reduced to an arms race narrative.
The lesson from past technologies is clear. Those who ignored ethics lost legitimacy. Those who anticipated it built trust — and trust accelerated adoption.
A Human-Centered Future
The quantum community has a choice. It can see itself as merely building machines — faster computers, tighter cryptography, more precise sensors. Or it can see itself as building a relationship with society. The latter requires humility: acknowledging that as tools grow more powerful, they can also grow less transparent. People will not accept what they cannot trust, no matter how miraculous the output.
The lever, the wheel, the fire, and the steam engine all amplified human power while leaving agency intact. Quantum communication and computation, especially when fused with AI, challenge that lineage. They move us toward tools that deliver answers without explanations, outcomes without visible levers. That is not just a scientific leap; it is a cultural one.
This is why strategy must expand beyond physics and engineering. It must include ethics, transparency, and foresight. Because quantum may be the most powerful set of tools humanity has ever built. But tools alone don’t build futures; trust does. Organizations that weave ethics into their quantum strategies now will own not just the technology, but the legitimacy to use it.
About Mark Haas
Mark Haas is the Founder of Haas Strategy Solutions, a management consulting firm that helps science-driven and midmarket organizations turn technical breakthroughs into strategic clarity. He has advised NIH, US Navy R&D, biomedical and clinical research groups, energy ventures, and international development institutions. With degrees in biology and public policy and more than 30 years advising CEOs and boards, he specializes in guiding organizations through inflection points where technology outpaces strategy. With twenty years as an Ethics Officer, he brings an independent, ethical, and cross-sector perspective to complex challenges.