History of Quantum Computing: Simplified

Quantum computing, with its dizzying ties to both quantum mechanics and computer science, can be a daunting subject. The questions of “What is quantum computing?” and “How does quantum computing work?” cannot be fully answered, however, without a brief investigation of its history. Recognizing the people and organizations who, over several decades, cumulatively laid the foundations of quantum computing is key to understanding the field’s wider potential. This article is meant to be a condensed summary of nearly a century and a half of quantum physics and quantum computing.

Grab your detective hat and magnifying glass -- let’s investigate the history of quantum computing.

Starting at the Beginning: Quantum Physics and the Battle of the Minds

Throughout the 19th century, scientists’ understanding of the atomic model was beginning to shift toward the idea of subatomic particles -- the electrons, protons, and neutrons that would be formally identified between 1897 and 1932 -- and their corresponding characteristics. John Dalton’s atomic theory, built on concepts such as matter being composed of atoms and atoms combining in simple integer ratios, had taken hold in the earlier part of the century.

Yet numerous experimental phenomena, ranging from English scientist Michael Faraday’s 1838 observations of electrical discharges in evacuated tubes to German physicist Heinrich Hertz’s 1887 discovery of the photoelectric effect, hinted at an entire world beyond mere atoms. The cathode-ray experiments that grew out of Faraday’s work, for instance, eventually demonstrated the existence of charged subatomic particles. Quantum physicists took these chemical and physical concepts and expanded upon them to construct the foundations that would lead to quantum computing.

A primary breakthrough occurred in 1900, when Max Planck proposed his now-famous quantum hypothesis (and, with it, Planck’s law). Much like the curiosities in Faraday’s and Hertz’s experiments, the experimental data on blackbody radiation that Planck studied differed from what classical physics predicted.

To make his theory work, energy had to come in discrete little packets, which came to be called “quanta”. Partly for this reason, Planck is considered one of the three “Founding Fathers of Quantum Theory”.

Another individual bestowed with this title was Niels Bohr, a Danish physicist. In 1927, Bohr famously went head-to-head against Albert Einstein (the third “Founding Father”, honored for his work on the photoelectric effect) over the nature of reality at the Solvay Conference in Brussels. Bohr, along with several others, believed that particles such as electrons exist merely as probabilities spread across their possible states until they are observed.

Such a point of view was reflected in Schrödinger’s cat -- or, rather, physicist Erwin Schrödinger’s fictional thought experiment involving a feline. Simply put, the poor cat is placed in an isolated box. While the box remains closed, there is some probability of the cat being either dead or alive -- its true state is not known until the box is opened and the cat observed.

The quantum concept underlying this is known as superposition: until it is measured, a quantum system exists in a combination of its possible states rather than any single one of them. Einstein, on the other hand, disagreed strongly with this probabilistic viewpoint, famously stating, “God does not play dice.”
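In modern notation (a standard textbook convention, not something available in 1927), a quantum bit in superposition is written

$$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

where measuring the qubit yields 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$ -- before the measurement, neither outcome is “the” state.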

Bohr also supported the idea of quantum entanglement, which Einstein derided as “spooky action at a distance”: measurements on two paired particles remain correlated -- for example, always yielding opposite spins -- regardless of how far apart the particles are. The apparent strangeness of this behavior is captured in the Einstein-Podolsky-Rosen paradox (EPR paradox).
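The “opposite spins” behavior corresponds to what is now called the singlet Bell state (again, modern notation):

$$|\psi^-\rangle = \frac{1}{\sqrt{2}}\big(|01\rangle - |10\rangle\big),$$

in which measuring the first particle as 0 forces the second to read 1, and vice versa, no matter the distance separating them.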

These properties, from entanglement to superposition, would nevertheless become key principles of quantum computing and part of the foundations of quantum physics. The collective study of quantum physics between 1900 and 1925, which largely consisted of adjusting classical physics to fit a slew of theoretical and experimental advances, came to be referred to as the “old quantum theory”.

Entrance of Quantum Computing: The Lockpicker’s Conjecture

Niels Bohr was also known to correspond with a certain “curious” personality. In the same year (1918) that Max Planck received the Nobel Prize for his work on blackbody radiation, Richard Feynman -- esteemed quantum physicist, lockpicker, and amateur bongo drummer -- was born. Bohr reportedly sought Feynman out for discussions because the younger physicist was not “in awe” of Bohr and would argue with him freely. Feynman’s other adventures, including picking the locks of filing cabinets while working on the atomic bomb at Los Alamos (see his book Surely You’re Joking, Mr. Feynman! for more exploits), made him quite an entertaining personality overall.

In 1981, Feynman, along with several others, began to consider the possibility of simulating quantum systems with other quantum systems assembled specifically for that purpose. In other words, he hypothesized stepping away from two-state classical systems (classical computers operate on binary digits) and toward machines capable of directly computing the behavior of quantum systems with more than two states. This came to be known as Feynman’s conjecture, and it would not be settled until 1996, by Seth Lloyd.

At the same conference where Feynman discussed his conjecture (the First Conference on the Physics of Computation at MIT), physicist Paul Benioff also showed that computers could operate under the established laws of quantum mechanics, suggesting that a quantum computer was feasible.

Within a few years, David Deutsch published a paper describing a theoretical machine that did just that. Perhaps this could be seen as part one of two in pinpointing precisely when quantum computing was invented. In 1985, he proposed what became known as the Quantum Turing Machine (QTM), a theoretical model of quantum computation built from quantum “gates” -- operations that transform quantum states much as logic gates transform classical bits, permitting a computer that works on superpositions rather than single states.
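To make the idea of a quantum gate concrete, here is a minimal statevector sketch in Python (an illustration in modern terms, not Deutsch’s original formulation) that applies a Hadamard gate -- one of the most common quantum gates -- to a single qubit:

```python
import numpy as np

# A qubit is a length-2 complex vector; |0> is [1, 0].
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate maps |0> to an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                    # apply the gate
probabilities = np.abs(state) ** 2  # Born rule: measurement probabilities

print(state)          # [0.707+0j, 0.707+0j]
print(probabilities)  # [0.5, 0.5] -- a 50/50 superposition
```

Chaining such gates across many qubits is, in essence, what a quantum program is.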

A mere nine years after David Deutsch’s paper, in 1994, Peter Shor set into motion an explosion of research into quantum computing through his development of Shor’s algorithm. At the time, the MIT alumnus (he would later return as a professor) was employed by Bell Labs, then under the control of AT&T.

Shor’s algorithm harnessed both entanglement and superposition to determine the prime factors of an integer -- a problem that had bugged cryptographers and computer scientists alike for decades (and, to be entirely honest, still hangs in the back of people’s minds today). His algorithm was the Usain Bolt to every preschooler on the playground: its efficiency was brilliant, and it could in principle break the RSA public-key cryptosystem, around which much of the security of the Internet is built.
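To see why Shor’s approach cracks factoring, here is a small Python sketch of the classical reduction at the algorithm’s heart. The quantum part of Shor’s algorithm performs the order-finding step efficiently; the brute-force loop below stands in for it, so this illustrates only the logic, not the speedup:

```python
import math
import random

def find_order(a, n):
    """Brute-force the order r of a modulo n (smallest r with a^r = 1 mod n).
    In Shor's algorithm, this is the step the quantum computer accelerates."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    """Classical skeleton of Shor's reduction (n must be an odd composite,
    not a prime power, for this sketch to find a factor)."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g, n // g       # lucky guess already shares a factor
        r = find_order(a, n)
        if r % 2 == 1:
            continue               # need an even order; pick a new a
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue               # trivial square root; pick a new a
        p = math.gcd(y - 1, n)     # guaranteed nontrivial factor here
        return p, n // p

print(shor_factor(15))  # e.g. (3, 5)
```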

Shortly afterward, in 1996, Lov Grover, also at Bell Labs, devised Grover’s algorithm, a quantum algorithm for searching an unstructured database that can also be applied on a significantly wider scale.
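A toy statevector simulation (a sketch of the mathematics, not how a real quantum computer executes it) shows the core of Grover’s algorithm: repeatedly flip the sign of the marked item’s amplitude, then reflect all amplitudes about their mean, and the marked item’s probability grows toward 1:

```python
import numpy as np

n_items = 16   # searching among 16 items (4 qubits' worth)
marked = 11    # index of the item we want to find

# Start in a uniform superposition over all items.
state = np.full(n_items, 1 / np.sqrt(n_items))

# About (pi/4) * sqrt(N) Grover iterations are optimal.
iterations = int(np.pi / 4 * np.sqrt(n_items))

for _ in range(iterations):
    state[marked] *= -1               # oracle: mark the target
    state = 2 * state.mean() - state  # diffusion: reflect about the mean

probabilities = state ** 2
print(np.argmax(probabilities) == marked)  # True
print(round(probabilities[marked], 3))     # ~0.96 after 3 iterations
```

A classical search needs about N/2 lookups on average; Grover’s algorithm needs only about √N, a quadratic speedup.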

To put it simply, the fanfare surrounding quantum computing exploded. Shor’s and Grover’s algorithms were the keys to the first door: they let loose a flood of intellectual curiosity in the area and, eventually, the general public began to home in on the possibilities such machines offered.

The excitement surrounding quantum computers was not unsupported. In 1993 and 1994, Ethan Bernstein, Umesh Vazirani, and Dan Simon showed that quantum computers could solve certain problems significantly faster than classical computers. Simon’s work, in particular, exhibited a problem on which a quantum computer is exponentially faster.

From Theory to Reality

Amid all the enthusiasm surrounding the subject, the National Institute of Standards and Technology (NIST) and the California Institute of Technology (Caltech) experimented with methods of “shielding” highly sensitive quantum systems from external, environmental interference. They succeeded in 1995, but the methodology supported quantum systems of only a few bits that lasted for a short period of time. Nevertheless, the possibility was out there. Between 1995 and 1996, Peter Shor and Andrew Steane attacked this environmental interference (known as decoherence) by putting forth quantum error-correcting codes.
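To give a flavor of what such a code does (this is the simple three-qubit bit-flip code, a textbook example rather than Shor’s or Steane’s full constructions), a logical qubit is stored redundantly across three physical qubits:

$$\alpha|0\rangle + \beta|1\rangle \;\longmapsto\; \alpha|000\rangle + \beta|111\rangle,$$

so that a bit flip on any single physical qubit can be detected by comparing neighboring qubits -- without measuring, and thus destroying, the encoded state -- and then undone, much like a majority vote.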

Finally, in 1998, researchers from MIT, Los Alamos National Laboratory, and UC Berkeley (Neil Gershenfeld, Isaac Chuang, and Mark Kubinec, respectively) used a technique similar to the one developed in 1995 to create what was arguably the world’s first functioning quantum computer. The machine was a 2-qubit quantum computer that used nuclear magnetic resonance to manipulate the “spin” of particles in the atomic nuclei of a liquid sample (to put it simply -- the technique runs far deeper than a single sentence can do it justice).

Scientists continued to work on making the concept practical and, just two years later, in 2000, a 7-qubit quantum computer was created. The announcement came shortly after NIST reported an alternative method of building quantum computers (entangling beryllium ions), although that particular approach has had difficulty being scaled up. In short, quantum computers do indeed exist.

However, physicists and scientists in general have often been skeptical that quantum computers could expand beyond ten to fifteen qubits, as the reliability and effectiveness of the methods used to suppress decoherence fall off beyond that range.

Here, your investigation ends -- not as a hard stop to the growth of quantum computing in 2000 because roadblocks were hit but, rather, as the start of something much greater.

Industries are turning to quantum computing as a potential path for business ventures, and researchers have begun attacking some of computer science’s most difficult problems with quantum computers. With the development of new quantum algorithms, methods of analysis, applications, and machines themselves, the sky has been marked as the limit for quantum computing.
