Written by Raiyan Rizwan, Alexa Ramirez, and Elias Lehman
In Partnership with Quantum Computing at Berkeley
“The key distinction between classical and quantum computing, beyond the physics governing their units of information, is how they process information”
Headlines about the fascinating problems quantum computers can solve hint at the fundamental difference between these machines and their classical counterparts. Quantum and classical computing are two different approaches to processing and manipulating information; both are meant for the same kinds of tasks, such as simulation, arithmetic, and analysis. The difference lies in the unit of information each computer processes. Classical computing refers to the traditional approach, which relies on classical physics and the manipulation of bits. Quantum computing, on the other hand, is a newer field that uses quantum-mechanical phenomena, such as superposition and entanglement, to perform computation.
The key distinction between classical and quantum computing, beyond the physics governing their units of information, is how they process information.
In classical computing, each bit holds one of two values: 0 or 1. Bits are manipulated with logical operations, such as AND, OR, and NOT, to perform various tasks. Quantum computing instead uses quantum bits, or qubits, which can represent both 0 and 1 simultaneously, each with some probability, through the principle of superposition, as we will explore in section 2.2. But the probabilistic character of qubits alone is not at the root of quantum computing's applications; it is the ability to create dependencies between quantum states, formally known as entanglement, that allows possible outcomes to be eliminated faster. This elimination of a state's probability is called interference. The advantage of quantum computers is greatest when these principles of quantum mechanics are working their hardest.
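The three ideas above (superposition, interference, and entanglement) can be sketched numerically with nothing more than vectors and matrices. The following is an illustrative NumPy sketch, not a real quantum SDK: a qubit is modeled as a 2-component complex vector of amplitudes, gates as matrices, and measurement probabilities as squared amplitude magnitudes.

```python
import numpy as np

# A qubit's state is a complex vector of amplitudes for |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)  # the definite state |0>

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0
probs = np.abs(superposed) ** 2       # measurement probabilities: 50/50

# Interference: applying H again makes the two computational paths
# cancel, eliminating the probability of |1> and restoring |0>.
back = H @ superposed
probs_back = np.abs(back) ** 2        # certainty of measuring 0

# Entanglement: a CNOT gate after H on the first qubit creates a
# dependency between two qubits (a Bell state) -- only the correlated
# outcomes |00> and |11> remain possible.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(superposed, ket0)
bell_probs = np.abs(bell) ** 2        # probabilities over |00>,|01>,|10>,|11>
```

Here `probs` comes out as [0.5, 0.5], `probs_back` as [1, 0], and `bell_probs` as [0.5, 0, 0, 0.5]: the mixed outcomes |01> and |10> have been eliminated, which is exactly the kind of dependency between states that entanglement describes.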
Another difference between classical and quantum computing is how they handle errors.
“I would not call [entanglement] one but rather the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought.”
– Erwin Schrödinger