Quantum Computing: The Hype and Reality
Introduction
With quantum computing, we are witnessing an exciting and very promising merging of three of the deepest and most successful scientific and technological developments of the modern era: quantum physics, computer science, and nanotechnology. Although experimental developments are in their infancy, theoretical work has already produced a variety of concepts, models, methods, and results that clearly demonstrate the field's lasting value and future potential. Quantum theory is not new; since its birth in 1900, quantum mechanics has had an unreal, too-strange-to-be-true quality to it.
Computer design still follows the traditional approach of struggling to squeeze ever more devices onto a chip. Suppose, however, that we wanted to make extremely small computers, say the size of just a few atoms. Such machines would no longer be governed by classical laws but by the laws of quantum mechanics.
A vivid and dramatic illustration of quantum computing's promise comes from a comparison with today's conventional architectures. Consider a program that must analyze every possible combination of 100 flipped coins, of which there are 2^100 (roughly 1.3 x 10^30) configurations. A traditional PC with a 32-bit architecture and a 2 gigahertz clock speed would need well over a trillion years of continuous computation to complete the task (the age of the known universe is roughly 13.8 billion years). A quantum computer with 100 quantum bits could, in principle, hold all of those configurations in superposition at once and accomplish the same task in less than a second. Although this is a simple example, it illustrates the potential of this computing paradigm for compute-intensive problems in science, astronomy, physics, and other related fields.
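The back-of-the-envelope arithmetic behind the classical estimate is easy to check. The following Python sketch assumes, optimistically and by our own convention, that the machine tests one configuration per clock cycle; real instruction counts would only lengthen the estimate:

    # Rough estimate of brute-force enumeration time for n flipped coins,
    # assuming (optimistically) one configuration checked per clock cycle.
    CLOCK_HZ = 2e9                        # the 2 GHz clock from the example
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    def brute_force_years(n_coins: int) -> float:
        """Years needed to enumerate all 2**n_coins configurations."""
        configurations = 2 ** n_coins
        seconds = configurations / CLOCK_HZ
        return seconds / SECONDS_PER_YEAR

    for n in (40, 60, 80, 100):
        print(f"{n} coins: ~{brute_force_years(n):.1e} years")

    # 100 coins comes out to about 2e13 years, some 20 trillion years:
    # comfortably "in excess of 1 trillion" and more than a thousand
    # times the ~1.4e10-year age of the universe.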
All we actually need are "bits" that we can address and manipulate. An atom or a nucleus will do just as well as a conventional device for representing binary values, because they are natural "spin systems": they have measurable physical parameters that we can use to store information by associating logical values with different states.
What gives this approach its potential is that quantum mechanics imposes no further limitations on size of the kind classical computers face. To explain what makes quantum computers so different from their classical counterparts, we can begin with a closer look at a basic chunk of information: one bit. From a physical point of view, a bit is a physical system that can be prepared in one of two different states representing two logical values: no or yes, false or true, or simply 0 and 1. (More on this in the next section.) Its quantum analogue, the quantum bit or qubit, is the basis of quantum computing and the source of the excitement among those who research and develop it.
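To make the distinction concrete, here is a minimal sketch (our own illustration, not a standard library API) that models a qubit's state as a unit vector of two complex amplitudes, where the squared magnitude of each amplitude gives the probability of measuring the corresponding logical value:

    import numpy as np

    # A classical bit is always exactly one of two states.
    bit = 0   # or 1, but never anything in between

    # A qubit's state is |psi> = a|0> + b|1> with |a|^2 + |b|^2 = 1.
    ket0 = np.array([1, 0], dtype=complex)   # |0>
    ket1 = np.array([0, 1], dtype=complex)   # |1>

    # An equal superposition of |0> and |1>.
    psi = (ket0 + ket1) / np.sqrt(2)

    # Measurement yields 0 or 1 with probabilities given by the
    # squared magnitudes of the amplitudes.
    p0, p1 = np.abs(psi) ** 2
    print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")   # P(0) = 0.50, P(1) = 0.50

A bit stores one definite value, while a register of n qubits can carry a superposition over all 2^n configurations at once, which is the source of the 100-coin comparison above.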
The first generation of quantum computers will have components that behave according to quantum mechanics, but the algorithms they run will probably not exploit quantum mechanics. No matter how fast conventional computers become, there will always be some calculations too large for them to complete in a reasonable time. Hoping to circumvent these limitations, physicists have begun to seriously entertain the possibility that a radically different type of computing could solve certain kinds of problems that a conventional computer could not solve in the lifetime of the universe (as the 100-flipped-coins example above illustrates).