The word "quantum" can conjure up quite a few meanings. It's a reference to quantity. It's SPECTRE reborn for the modern James Bond films. And it just sounds science-y in general thanks to quantum dot displays, quantum mechanics and quantum entanglement. Quantum computing may be the biggest buzzword of them all: it's an exciting and extremely complex technology, and the science community's Top Men have barely scratched the surface of it.
Quantum computers perform operations with qubits (quantum bits) rather than the binary bits of transistor-based computers. Qubits open up the potential for quantum computers to use sophisticated algorithms and perform calculations much faster than existing systems. There's just one catch: it will be years, or even decades, before quantum computers can operate in place of or alongside transistor-based computers.
Quantum bits are the heart of what makes quantum computing seriously cool and more than a little difficult to understand. Let's start with regular bits. They're binary--meaning they represent 1 or 0, on or off--and are used to perform calculations and represent information in computers. Qubits are not binary. Thanks to the principle of quantum superposition, quantum bits are both 0 and 1 simultaneously:
In quantum computers information may also be represented as simultaneous quantum superposition of both 1 AND 0 (quantum bits, or "qubits")! While in superposition, qubits interact/compute with other qubits via nonlocal quantum entanglement. Eventually each qubit "collapses" from its quantum schizophrenia and chooses either 1 or 0 as its classical form.
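The "collapse" described above can be sketched with a toy simulation. This is not a real quantum simulator--just a classical model (my own illustration, not from the article's sources) where a qubit in an equal superposition picks 0 or 1 with probability equal to its amplitude squared when measured:

```python
import random

# Toy model: a qubit in equal superposition of |0> and |1> is a pair of
# probability amplitudes. Measuring it "collapses" the superposition.
amp0 = 2 ** -0.5  # amplitude for |0>
amp1 = 2 ** -0.5  # amplitude for |1>

def measure(a0, a1):
    """Return 0 or 1, with probability |amplitude|^2 for each outcome."""
    return 0 if random.random() < abs(a0) ** 2 else 1

# Before measurement the qubit is "both" 0 and 1; measuring forces a choice.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(amp0, amp1)] += 1
print(counts)  # roughly a 50/50 split between 0 and 1
```

Run repeatedly, the measurements land near 50/50--but any single measurement yields exactly one classical bit, which is the key point.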
Superposition doesn't last, though: the moment a qubit is measured, it returns a result in only one state. Here a second principle comes into play: quantum entanglement. Entanglement correlates the states of particles--electrons, molecules, photons--so that measuring one tells us something about its partner:
". . .quantum entanglement. . .correlates the states of different qubits with one another. When we perform operations and measurements on a qubit that is entangled with another qubit, we automatically learn about and modify the state of its partner. This provides a sort of quasi-parallelism that allows a quantum system to perform some calculations faster than a classical computer."
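That correlation can also be sketched in a few lines. The model below (again, an illustration of mine, not a real quantum library) treats a maximally entangled pair--a Bell state, where the two qubits always collapse to the same value--as a single coin flip shared by both qubits:

```python
import random

# Toy model of a Bell pair, (|00> + |11>)/sqrt(2): the two qubits are
# perfectly correlated, so measuring one determines the other.
def measure_bell_pair():
    outcome = random.choice([0, 1])  # 50/50 chance of |00> vs |11>
    return outcome, outcome          # both qubits collapse to the same bit

a, b = measure_bell_pair()
print(a, b)  # always equal: learning qubit A's value reveals qubit B's
```

Each qubit on its own looks random, but the pair never disagrees--that shared information is what the Ars Technica quote calls "quasi-parallelism."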
Confused yet? If so, that's okay--the point is that qubits can run algorithms, such as Simon's algorithm, faster than anything a transistor-based computer can manage. Wikipedia has a whole section on the potential of quantum computing and the things quantum algorithms could, in theory, do that our current computers never will.
Sounds great, huh? So here's the tricky part: actually making a quantum computer that can take advantage of all that potential qubits exhibit. A recent project from the University of California, Santa Barbara actually built a rudimentary microprocessor using qubits, but it's a far cry from an Intel or AMD chip. Ars Technica explains:
The computer is rather simple: a two-qubit register made from SQUIDs (superconducting quantum interference devices), two additional SQUIDs that can be used to zero the register (and act as readout), and microwave resonator striplines, which act as memory. The most significant part, however, is a bus that couples the two register qubits together. This bus enables the researchers to program the register to perform different logic operations. . . .
This all works through the magic of magnetic fields. . . .The microwave frequency that a SQUID likes to operate at depends on the magnetic field it is exposed to. The resonators have a fixed geometry that will only resonate at one microwave frequency. So a memory can be read or written by changing the magnetic field so that it is the same as that of the resonator. The same is true of the zeroing registers.
As a result, operations are really just a case of ramping magnetic fields up and down. Operations between qubits are performed by applying microwave pulses on the bus between them.
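Abstracted away from the hardware, each of those microwave pulses implements a logic gate: a unitary matrix applied to the register's state vector. Here's a minimal sketch (my illustration--the specific gate isn't taken from the Santa Barbara paper) applying a CNOT gate, which flips the second qubit whenever the first is 1:

```python
# Two-qubit state vectors use the basis order |00>, |01>, |10>, |11>.
# CNOT flips the target (second) qubit when the control (first) qubit is 1.
CNOT = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
]

def apply(gate, state):
    """Matrix-vector product: apply a 4x4 gate to a two-qubit state."""
    return [sum(gate[r][c] * state[c] for c in range(4)) for r in range(4)]

state = [0, 0, 1, 0]       # the state |10>: control qubit is 1
print(apply(CNOT, state))  # [0, 0, 0, 1] -> the state |11>, target flipped
```

On the Santa Barbara chip, "programming" the register amounts to choosing which sequence of pulses--which sequence of matrices like this one--to apply.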
The real kicker is that the quantum computer's qubits only stay in an entangled state for 400 nanoseconds. That's 400 nanoseconds of operation--not exactly long enough to substitute for the computers we use day in and day out. Even though there are ways to lengthen that operating time, quantum computing has a long, long way to go.

But it is progressing: the Santa Barbara project showed up in a New York Times article last year, and the things the team talked about then have been accomplished in the past 11 months. IBM has embarked upon a five-year project to try its hand at quantum computing. We won't be able to buy quantum computers off store shelves half a decade from now, but the efforts of IBM and UC Santa Barbara will inch us closer.
If someone makes a breakthrough in developing a fault-tolerant system with quantum error correction, it might be time to get excited about quantum computing--that will be a Godzilla-sized step along the path that ends in a genuinely usable machine.
Lead image via D-Wave Systems