The days of traditional computers may be numbered. Moore’s law states that the number of transistors (the tiny switches that process information) on a commercially available chip doubles roughly every two years. This seems great at first, but as companies squeeze more and more transistors onto each chip, problems start to emerge. The transistors in Intel’s latest Pentium chips have been reduced to a mere 20 atoms across, and when that number drops to five there will be serious problems. As theoretical physicist Michio Kaku puts it, “Computer power simply cannot maintain its rapid exponential rise using standard silicon technology.” Something new has to step into its place, and that new thing may well be the quantum computer.
“Are you going to have a purely quantum computer in five years? No - what you'll have is elements of these things coming out, you always do with technology. In the same way you have a graphics processor card along with a main processor board in a modern computer, you'll see things added on; people will find a means of using quantum computing and the quantum techniques, and that's how I think it'll move forward. And those I can definitely see in the five-year period.”
Professor Alan Woodward, University of Surrey
Entanglement and superpositions
As you get down to the atomic level, a whole new set of laws starts to come into play, and these are already presenting problems for computer manufacturers. To grasp the basics of quantum computers, you have to understand a couple of unusual phenomena from the world of quantum mechanics. This is the atomic world, one in which you can hardly say where anything is and an electron can spin in two directions simultaneously. Think this sounds like nonsense? The computer you’re using to read this article is built on the rules of quantum mechanics. In other words, the applications are very real.
The two most important ideas you need to understand are entanglement and superpositions. Electrons, which orbit the nuclei of atoms, have a property called spin, and they may spin either up or down. They also come in pairs, and these pairs spin in opposite directions. Without looking very closely, it’s impossible to know which way one of a pair of electrons is spinning. The standard interpretation of quantum mechanics says that the electron is in a superposition of both up and down spin (and every state in between) until we look at it, which causes it to collapse into one state.
The pairs themselves are entangled, meaning that if one is spinning up, the other is down and vice-versa. Simply put, if you have two electrons in a pair, both exist in a superposition of both up and down spin until you look at one of them. At this point, the observed electron reverts to a state (up, for example) and its entangled partner takes on the opposite state (down). Einstein famously called this “spooky action at a distance.” Superpositions and the effect of our observations aren’t easy to get your head around, but they’re essential for quantum computing.
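The entangled pair described above can be illustrated with a toy Python sketch. Here the two electrons are represented as a list of amplitudes over the four possible joint outcomes; only the opposite-spin combinations get non-zero weight, so whenever we “measure” (collapse the superposition by sampling), the two spins always come out opposite. This is a pedagogical simulation, not how a real quantum device works internally.

```python
import random

# A two-electron entangled state, written as amplitudes over the
# four joint outcomes (first spin, second spin). Only the
# opposite-spin outcomes "01" and "10" carry any amplitude.
amplitudes = {"00": 0.0, "01": 2 ** -0.5, "10": -(2 ** -0.5), "11": 0.0}

def measure(amps):
    """Collapse the superposition: pick one joint outcome with
    probability equal to the squared amplitude."""
    outcomes = list(amps)
    probs = [abs(a) ** 2 for a in amps.values()]
    return random.choices(outcomes, weights=probs)[0]

# Every measurement yields opposite spins for the two electrons:
# observe one, and the state of its partner is instantly fixed.
for _ in range(5):
    outcome = measure(amplitudes)
    print(outcome)
```

Each run prints a random mix of `01` and `10`, but never `00` or `11`: looking at either electron is enough to know both.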
Bits and qubits
Ordinary computers work with “bits,” which are “on” and “off” states represented by ones and zeros (binary numbers). Because of superpositions, the bits of a quantum computer, made from quantum objects like electrons, photons and atoms, can be one and zero simultaneously. These new bits are called “qubits,” or quantum bits. With three bits, a classical computer can represent any one number from zero to seven at a time. Three qubits in superposition, on the other hand, can represent all eight numbers at the same time.
Superpositions essentially mean that qubits can perform numerous calculations simultaneously, rather than one at a time like conventional computers. A computer with 30 qubits would be roughly equivalent to an ordinary computer working at 10 teraflops, a measure of processing speed. To grasp the vast increase in processing power that quantum computers could bring, you need only know that our current desktop computers are measured in gigaflops, a thousand times slower.
There are some pretty prohibitive practical issues with building quantum computers, but larger and larger ones are still being produced. The problems arise not only from handling quantum objects, but also from the fact that they behave differently when we’re looking. It’s a confusing idea, but a superposition collapses into one ordinary state the moment you observe it, so a qubit would essentially become a normal bit. This presents obvious problems for workable quantum computers, because the mere act of observation could turn them back into classical computers.
However, entanglement provides an answer. Because the particles are paired, observing one tells you the state of the other without observing it directly. By creating entangled pairs, scientists can effectively transfer information from one particle to the other, allowing the exchange of information necessary for computation without looking directly at the system and thereby collapsing it into an ordinary digital computer. Although entanglement might offer a solution, it’s far from being put into practice on a significant scale. Scott Aaronson of the Massachusetts Institute of Technology isn’t in a rush, though: “It was more than 100 years between Charles Babbage and the invention of the transistor, so I feel like if we can beat that, then we're doing well.”
What’s the point?
So quantum computers are pretty quick on the processing front and they make use of some cool physics, but what would they actually be used for? The most direct use is modelling the behaviour of atoms, which is beyond the reach of modern machines. This application was first suggested by the late physicist Richard Feynman, and it could lead to huge breakthroughs in areas like superconductors. The sheer number of interacting variables makes modelling atomic behaviour accurately essentially impossible for classical computers, but well within the remit of qubit processing.
The simultaneous processing power of quantum computers also means that certain calculations become dramatically easier. Most online banking transactions are protected by a type of encryption called RSA, whose security rests on the fact that multiplying two large prime numbers together is easy, but recovering those primes from their product is essentially impossible for a classical computer. A quantum computer running Shor’s algorithm, however, could find the factors in mere seconds.
There aren’t many clearly defined uses for quantum computers yet, but Scott Aaronson makes an important point: “It's hard for me to envision why you'd want a quantum computer for checking your email or for playing Angry Birds. But to be fair, people in the 1950s said 'I don't see why anyone would want a computer in their home', so maybe this is just limited imagination.”