Quantum computing is an area of computer science that explores the possibility of developing computer technologies based on the principles of quantum mechanics. It is still in its early stages but has already shown promise for significantly faster computation than is possible with classical computers. In this article, we’ll take a look at the history of quantum computing, from its earliest beginnings to the present day.
Why Bother Learning About Quantum Computing History?
History provides important context on how humanity has previously made scientific discoveries. There are many theories on precisely how scientific understanding has progressed over time but, however you view it, looking backwards gives us a fresh perspective on the present.
Everything is theoretically impossible, until it is done. One could write a history of science in reverse by assembling the solemn pronouncements of highest authority about what could not be done and could never happen.
— Robert Heinlein, American Science Fiction Author
By placing the development of quantum computers in the wider context of the history of computing, I hope to give the reader a sense of the enormity of the engineering challenges we have overcome in the past, and to offer a moderated but positive view on the outlook for our collective ability to develop useful quantum computers. This is deliberately a rapid and non-technical article.
You may also like:
- Quantum Journey From the Search Engine to Google Sycamore
- 7 Quantum Computing Books to Read in 2022 [Ranked & Reviewed]
Short history of classical computing
If you really want to go back in time, you need to go back to the abacus. However, the computer in the form we broadly recognise today stems from the work of Charles Babbage (1791-1871) and Ada Lovelace (1815-1852). Babbage designed the Analytical Engine, and whilst he never finished building the device, it has since been shown that it would have worked. Indeed, its logical structure is broadly the same as that used by modern computers. Lovelace was the first to recognise that the machine had applications beyond pure calculation, and she published the first algorithm intended to be carried out by such a machine. She is consequently widely regarded as the first person to recognise the full potential of computers and as one of the first computer programmers.
In 1890 Herman Hollerith designed a punch card system to tabulate the census, accomplishing the task in record time and saving the government $5 million. He founded the company that would ultimately become IBM.
In 1936 Alan Turing presented the idea of a universal machine, later called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.
What follows is a high-level summary of the development of the core technology of computers, intended to provide a wider contextual lens on the development of quantum computing, which many see as the next generation of computing. There are many ways of dividing up the eras of computing and the history of quantum computing, but we think this one is most instructive.
Classical computers use bits (zeros and ones) to represent information. These bits were first represented with physical switches and relay logic in the early electro-mechanical computers. These were enormous, loud feats of engineering, and it was clear that a better way of representing bits was needed.
Vacuum tubes (1940s – 1950s)
Vacuum tubes were used in the 1940s as a way of controlling electric current to represent bits. These were unwieldy, unreliable devices that overheated and required frequent replacing. Construction of the first general-purpose digital computer, the Electronic Numerical Integrator and Computer (ENIAC), began in 1943. The machine contained around 18,000 vacuum tubes. Computers of this generation could only perform a single task, and they had no operating system.
Transistors (1950s onwards)
As anyone who still owns a valve guitar amp knows, vacuum tubes are rather temperamental beasts. They use a lot of energy, they glow and they regularly blow up. In the late 1940s the search was on for a better way of representing bits, which led to the development of transistors. In 1951 the first computer for commercial use was introduced to the public: the Universal Automatic Computer (UNIVAC 1). In 1953 the International Business Machines (IBM) 650 and 700 series computers were released and became (relatively) widely used.
Integrated circuits (1960s to 1970s)
The invention of integrated circuits enabled smaller and more powerful computers. An integrated circuit is a set of electronic circuits on one small flat piece of semiconductor material (typically silicon). These were first developed and improved in the late 1950s and through the 1960s. Although Jack Kilby's integrated circuit was revolutionary, it was fashioned out of germanium, not silicon. About six months after Kilby's integrated circuit was first patented, Robert Noyce, who worked at Fairchild Semiconductor, recognized the limitations of germanium and created his own chip fashioned from silicon. Fairchild's chips went on to be used in the Apollo missions, and indeed the moon landing.
Microprocessors and beyond (1970s onwards)
In the 1970s the entire central processing unit started to be included on a single integrated circuit, or chip; these chips became known as microprocessors. They were a big step forward in the miniaturization of computers and led to the development of personal computers, or "PCs".
Whilst processing power has continued to advance at a rapid pace since the 1970s, much of the development of core processors has been iteration on top of that original technology. Today's computers are a story of further abstraction, including the development of software and middleware, and of further miniaturization, with the development of smaller and smaller microprocessors.
Further miniaturization of components has forced engineers to consider quantum mechanical effects. As chipmakers have added more transistors onto a chip, transistors have become smaller, and the distances between them have decreased. Today, electronic barriers that were once thick enough to block current are now so thin that electrons can tunnel through them (a phenomenon known as quantum tunnelling). Though there are ways to increase computing power without further miniaturization, scientists have also looked at whether they can harness quantum mechanical effects to create a different kind of computer altogether.
Quantum Computing Timeline
The quantum computing timeline is constantly evolving, and we welcome edits and additions (please email [email protected]).
Quantum mechanics as a branch of physics began with a set of scientific discoveries in the late 19th century and has been in active development ever since. Most people point to the 1980s as the point when physicists began actively looking at computing with quantum systems.
1982: The history of quantum computing starts with Richard Feynman's lectures on the potential advantages of computing with quantum systems.
1985: David Deutsch publishes the idea of a "universal quantum computer".
1994: Peter Shor presents an algorithm that can efficiently find the factors of large numbers, significantly outperforming the best classical algorithm and theoretically putting the underpinning of modern encryption at risk (referred to now as Shor’s algorithm).
1996: Lov Grover presents an algorithm for quantum computers that would be more efficient for searching unstructured databases (referred to now as Grover's search algorithm).
1996: Seth Lloyd proposes a quantum algorithm that can simulate quantum-mechanical systems.
1999: D-Wave Systems is founded by Geordie Rose.
2000: Edward Farhi at MIT develops the idea of adiabatic quantum computing.
2001: IBM and Stanford University publish the first implementation of Shor’s algorithm, factoring 15 into its prime factors on a 7-qubit processor.
2010: D-Wave releases the D-Wave One, the first commercial quantum computer (an annealer).
2016: IBM makes quantum computing available on the IBM Cloud.
2019: Google claims the achievement of quantum supremacy. The term was coined by John Preskill in 2012 to describe the point at which quantum systems can perform tasks beyond the reach of classical computers.
Today's quantum computers are described by some as sitting around the vacuum tube era (or before) of the classical timeline outlined in the previous section. This is because we are only just at the cusp of making useful and scalable quantum computers. At this stage of development it's not straightforward to decide how to invest in the technology, and we cover this in the following article.
Steampunk chandeliers (IBM Quantum Computer)
When I was researching and writing this piece, two things struck me. Firstly, the history of computers has not been a clear, linear progression over the last 100 or so years. Rather, it has relied on big leaps that were hard to envision a priori. Secondly, whilst we try to steer away from outright hype at TQD, we recognise that the large engineering feats of the 1940s, driven forward by individuals with far smaller budgets than we benefit from today, are a far cry from the computers we have now. It leaves us quietly confident that, though the development of quantum computers will be far from plain sailing, there is a future in which the steampunk chandeliers (thanks Bob Sutor) we see today will be only the first steps in a fascinating story of scientific development.