To me quantum computation is a new and deeper and better way to understand the laws of physics, and hence understanding physical reality as a whole. We are really only scratching the surface of what it is telling us about the nature of the laws of physics.
David Deutsch
The evolution of the modern computer has involved a series of changes from one type of physical realization to another: from gears to relays to transistors to integrated circuits… Just 75 years ago, during the Second World War, the codebreakers at Bletchley Park, where Alan Turing worked, decoded encrypted radio messages using what is considered the first fully digital computer: the Colossus, a machine the size of a room that weighed about a ton, used well over a thousand vacuum tubes to process information, and received its input on punched tape. Since then, computers have evolved to become faster, smaller and more powerful, an evolution made possible by new discoveries, new techniques and new physical and engineering models.
According to Moore’s law, processor speeds, or overall processing power for computers, should double every 18 months; but, like all exponential growth, this evolution is reaching its limits.

Moore’s law: the number of transistors in a dense integrated circuit doubles approximately every 18 months.
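To get a feel for what this doubling rate implies, here is a toy projection (my own sketch, not part of the post; the Intel 4004 baseline of roughly 2,300 transistors in 1971 is used purely as an illustrative starting point):

```python
# Toy projection of Moore's law: transistor count doubles every 18 months.
# Baseline: the Intel 4004 (1971), with roughly 2,300 transistors, chosen
# here only as an illustrative reference point.

def projected_transistors(years_elapsed, start=2300, doubling_years=1.5):
    """Transistor count projected years_elapsed years after the baseline."""
    return start * 2 ** (years_elapsed / doubling_years)

for year in (1971, 1981, 1991, 2001):
    print(year, f"{projected_transistors(year - 1971):,.0f}")
```

Note that real transistor counts have historically tracked a doubling closer to every two years (Moore’s revised 1975 figure), so the popular 18-month version overshoots.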
As the size of the computer’s processor is reduced to the microscopic scale, electrons in the electrical circuits begin to reveal their quantum nature, and classical physics no longer applies. This confronts us with a choice: either develop new semiconductor chips that allow us to evade the electron’s quantum nature, or use the principles of Quantum Mechanics to develop new computers and new ways of processing information.
Quantum Mechanics (QM) is certainly one of the most important physical theories of the twentieth century. Up until the late nineteenth century, Newtonian Mechanics was used to understand physical phenomena linked to the motion of matter, and Maxwell’s Electromagnetism to describe those related to radiation. However, there were many phenomena that these theories could not describe without producing results in complete contradiction with experiments and measurements. Examples include Rutherford scattering, the photoelectric effect, and the experimental results on black-body radiation.
Thus, a new theory was created to solve these problems and to describe these and other microscopic phenomena: Quantum Mechanics. While there is no exact date, many authors suggest that this theory was born in 1900, when the German theoretical physicist Max Planck introduced the concept of quantized energy to explain the spectral distribution of black-body radiation. However, it was not until around 1928 that physicists such as Bohr and Dirac established the theoretical basis and conceptual framework of Quantum Mechanics.
This revolutionary theory led physicists and scientists to rethink and reevaluate physics as it was known to date, introducing a whole new way of describing physical systems. For instance, they discovered that, in general, the state of a quantum system cannot be described in terms of the states of its constituents, due to the existence of quantum correlations between them. In 1935, Einstein, Podolsky and Rosen were the first to put the spotlight on quantum correlated states, and it was Erwin Schrödinger who coined the term “entanglement” to name those quantum correlations.
Entanglement is one of the most unique and surprising quantum phenomena and, like the Heisenberg uncertainty principle, it has no classical analog.
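A standard textbook example (my addition, not from the original text) makes this concrete. Two qubits in the Bell state

\[
|\Phi^+\rangle = \frac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right)
\]

cannot be assigned individual states: writing \(|\Phi^+\rangle\) as a product \((a|0\rangle + b|1\rangle)\otimes(c|0\rangle + d|1\rangle)\) would require \(ac = bd = 1/\sqrt{2}\) while \(ad = bc = 0\), which no choice of amplitudes can satisfy. The pair as a whole has a definite state, yet neither qubit does on its own.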
One of the most important consequences of the existence of entanglement concerns the amount of information necessary to describe the state of a system. In a classical system, the number of parameters needed to characterize its state (the number of degrees of freedom) is equal to the sum of the degrees of freedom of each of its components. For a quantum system, however, the number of parameters needed to describe its state grows exponentially with the number of constituents. This makes the task of simulating a quantum system a very difficult one (even for systems with few constituents), since the computational resources needed for the simulation quickly outgrow those available on classical computers. However, in 1982 Richard Feynman conjectured that these limitations could be overcome using computers based on quantum systems.
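To put rough numbers on this exponential growth (a back-of-the-envelope sketch of my own, not from the original text): describing n classical bits takes n values, while a general pure state of n qubits takes 2^n complex amplitudes.

```python
# Description size: n classical bits need n values, while a general pure
# state of n qubits needs 2**n complex amplitudes.

for n in (1, 10, 20, 30, 40):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2**30  # 16 bytes per complex amplitude (2 x float64)
    print(f"n = {n:2d}: {n} classical values vs. "
          f"{amplitudes:,} amplitudes (~{gib:.2e} GiB)")
```

At n = 40 the amplitudes alone occupy about 16 TiB, and every additional qubit doubles that, which is why even modest quantum systems overwhelm classical simulation.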
Based on this conjecture, physicists started using the potential of QM to develop new computers capable not only of simulating quantum systems but also capable of solving other problems that were (until that moment) intractable on classical computers. This gave birth to a new interdisciplinary field of study called Quantum Information.
Thus, in 1985 David Deutsch proposed the first quantum algorithm, and proved that it could perform a task with greater efficiency than its classical counterpart. Deutsch also proposed one of the first universal quantum computers. Since then, several quantum algorithms have been proposed, one of the most important being the algorithm Shor presented in 1994. With his algorithm, Shor showed that quantum computers can factor large numbers exponentially faster than the best known classical algorithms. Another quantum algorithm of great interest is Grover’s search algorithm, discovered in 1996. This algorithm achieves a quadratic reduction in the number of steps required to find an item in an unstructured database. Furthermore, the quantum teleportation protocol demonstrated that Quantum Mechanics can also be used to create new ways of transmitting information.
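To see what Grover’s quadratic reduction buys in practice, here is a simple query-count comparison (my own illustration, assuming the standard estimates of about N/2 expected classical queries and roughly (π/4)√N Grover iterations for N items):

```python
import math

# Unstructured search over N items: expected classical queries (~N/2)
# versus Grover iterations (~(pi/4) * sqrt(N)).

for N in (10**4, 10**6, 10**8):
    classical = N // 2
    grover = math.ceil(math.pi / 4 * math.sqrt(N))
    print(f"N = {N:>11,}: classical ~ {classical:>10,} | Grover ~ {grover:>6,}")
```

For a million items the classical search expects around 500,000 queries while Grover’s algorithm needs fewer than 800 iterations, and the gap widens as N grows.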
As I already mentioned, quantum entanglement is one of the main features of QM that make these advances possible. It is a crucial resource for solving computational problems with an efficiency far beyond what can be achieved with classical computers. It is for these reasons that in the last decade the study of quantum entanglement has aroused great interest in the fields of quantum information, quantum computation, condensed matter and statistical mechanics, providing a new perspective for the analysis of correlations, entropy and phase transitions.
I believe that every now and then we can actually see and predict the future of science and technology, and I am sure that quantum information and quantum computation will play a crucial role in defining where the course of our technology is headed.
This blog is dedicated to anyone who is interested in studying Quantum Information and Quantum Computation. Here you will be able to take a behind-the-scenes tour of the fascinating research that gets done in these fields of study.
I will try to make this blog as inclusive as possible: if you haven’t had much contact with Quantum Information and Quantum Computation, you will find here all the definitions, demonstrations and concepts you will need (such as entanglement measures, quantum discord, quantum algorithms, linear algebra (yeah, you’ll need it!), and so on). In order to take full advantage of this blog I recommend taking an introductory course in Quantum Mechanics, and if you already have, I bid you welcome.
Marco Cerezo
All text copyright © Marco Vinicio Sebastian Cerezo de la Roca.
This Blog’s Motivation: A Brief Introduction to Quantum Information and Quantum Computation by Marco Vinicio Sebastian Cerezo de la Roca is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.