To me quantum computation is a new and deeper and better way to understand the laws of physics, and hence understanding physical reality as a whole. We are really only scratching the surface of what it is telling us about the nature of the laws of physics.
David Deutsch
The evolution of the modern computer has involved a series of changes from one type of physical realization to another: from gears to relays to transistors to integrated circuits… Just 75 years ago, during the Second World War, codebreakers at Bletchley Park, where Alan Turing worked, decoded encrypted radio messages using what is considered the first programmable, fully electronic digital computer: the Colossus, a machine the size of a room that weighed about a ton, used thousands of vacuum tubes to process information, and took its input from punched paper tape. Since then, computers have evolved to become faster, smaller and more powerful, an evolution made possible by new discoveries, new techniques and new physical and engineering models.
According to Moore’s law, processor speeds, or the overall processing power of computers, should double every 18 months; but, like all exponential growth, this trend is reaching its limits.

Moore’s law: the number of transistors in a dense integrated circuit doubles approximately every 18 months.
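To get a feel for how quickly that doubling rule compounds, here is a minimal sketch of the growth formula N(t) = N₀ · 2^(t/T) with T = 18 months. The starting value of 2,300 transistors (the Intel 4004) is used purely as an illustrative baseline, not as part of the law itself.

```python
# A minimal sketch of Moore's law as exponential growth, assuming a
# doubling period of 18 months. The 2,300-transistor starting point
# (Intel 4004, 1971) is only an illustrative baseline.

def transistors(years_elapsed, initial_count=2300, doubling_period_years=1.5):
    """Projected transistor count after a given number of years."""
    return initial_count * 2 ** (years_elapsed / doubling_period_years)

if __name__ == "__main__":
    for years in (0, 15, 30, 45):
        print(f"after {years:2d} years: ~{transistors(years):,.0f} transistors")
```

Even from such a tiny starting count, the rule predicts counts in the billions within a few decades, which is exactly why the physical limits discussed next become unavoidable.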
As the size of the computer’s processor shrinks toward the microscopic scale, the electrons in its circuits begin to reveal their quantum nature, and classical physics is no longer an adequate description. This leaves us with a choice: either develop new semiconductor chips that allow us to evade the electrons’ quantum behaviour, or embrace the principles of quantum mechanics to build new computers and new ways of processing information.