Breakthroughs in the Quantum Realm: Understanding Quantum Computing
Early last week, a group of physicists from Purdue University in Indiana and the University of New South Wales in Australia managed to construct a transistor (the bare bones of any working computational device: a simple voltage-controlled on/off switch) consisting of a single phosphorus atom placed inside a silicon crystal.
Following on the heels of this breakthrough, yesterday a group of IBM scientists released news of further advances that will put us “on the cusp of building systems that will take computing to a whole new level.” The two most glaring challenges they seem to have met were reducing the errors that inevitably crop up in quantum computation and corrupt subsequent results, and preserving the quantum mechanical properties of so-called 'qubits', the quantum-informational analog of the familiar bit. A successful quantum computer would enable calculations far beyond what is possible even with the biggest, most powerful classical computer; for a better sense of just how powerful, a short history of processing power is in order.
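To make the bit/qubit analogy concrete, here is a minimal textbook-style sketch (not IBM's method, and no real quantum hardware involved): a qubit can be modeled as a pair of amplitudes, and measuring it gives 0 or 1 with probabilities set by those amplitudes. The function and variable names are illustrative assumptions.

```python
import math
import random

# Toy model of a single qubit: its state is a pair of amplitudes
# (alpha, beta) with |alpha|^2 + |beta|^2 = 1. Measurement yields
# 0 with probability |alpha|^2 and 1 with probability |beta|^2.

def measure(alpha, beta):
    """Simulate one measurement of the state alpha|0> + beta|1>."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# An equal superposition: alpha = beta = 1/sqrt(2).
alpha = beta = 1 / math.sqrt(2)

# Over many measurements, roughly half the outcomes are 0, half 1 --
# unlike a classical bit, which is definitely one or the other.
counts = [0, 0]
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1
print(counts)
```

A classical bit would always print the same outcome; the spread here is the superposition that error-prone quantum hardware has to preserve.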
One of the first general-purpose electronic computers, the ENIAC was reported to occupy much of the floor space of the United States Army's Ballistic Research Laboratory, and weighed as much as a tank. A magnificent breakthrough in computational power in post-war 1945, the ENIAC was capable of 5,000 calculations per second at peak performance. Today it is easily overshadowed by your modern desktop, which can perform about 3 billion operations per second. That incredible increase took half a century to reach; as we move steadily into this new century, quantum computing appears poised to surpass the abilities of today's computers by a similar factor.
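A quick back-of-the-envelope check on those two figures (5,000 calculations per second versus roughly 3 billion, both as cited above; the numbers are order-of-magnitude only) shows the scale of the jump:

```python
import math

# Figures from the article, used for rough illustration only.
eniac_ops = 5_000            # ENIAC, 1945, peak calculations/second
modern_ops = 3_000_000_000   # a modern desktop, operations/second

speedup = modern_ops / eniac_ops       # overall speedup factor
doublings = math.log2(speedup)         # how many times performance doubled

print(f"speedup: {speedup:,.0f}x")     # 600,000x
print(f"doublings: {doublings:.1f}")   # ~19.2
```

So the half-century from ENIAC to the desktop amounts to roughly nineteen doublings of raw speed, which is the scale of improvement the article suggests quantum computing could repeat.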
Everyone has noticed that despite the monstrous increases in nearly every parameter that characterizes a computer's performance, one parameter has consistently decreased: computer size. Since the transition from vacuum tubes to transistors, scientists and engineers have made the latter smaller and smaller even as demand for processing power has grown. Obviously, this can't go on forever; Moore's Law is the observation that processing power doubles roughly every two years, a trend that must eventually run into a physical (atomic) limit. Beyond that limit we essentially enter the atomic realm, where few things behave linearly. The tablet PC is the most current consumer-relevant example of this trend, which was arguably the driving force behind the mobile revolution.
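The doubling described by Moore's Law compounds quickly, which is why the atomic limit arrives sooner than intuition suggests. A tiny sketch, assuming the idealized "double every two years" rule with purely illustrative starting numbers:

```python
# Idealized Moore's Law: relative processing power doubling every
# two years over two decades (illustrative, not measured data).
power = 1.0
for year in range(0, 21, 2):
    print(f"year {year:2d}: {power:>6.0f}x")
    power *= 2
```

After just twenty years of biennial doubling, relative power has grown more than a thousandfold, so each generation of transistors must shrink dramatically to keep pace.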