Quantum Supremacy (in Computing)
In 2012, John Preskill, a professor of theoretical physics at the California Institute of Technology in Pasadena, California, proposed the term “quantum supremacy” to describe the point at which quantum computers can do things that classical computers cannot, whether or not those tasks are useful.
Quantum mechanics emerged as a branch of physics in the early 1900s to explain nature on the scale of atoms and led to advances such as transistors, lasers, and magnetic resonance imaging. The idea to merge quantum mechanics and information theory arose in the 1970s but garnered little attention until 1982, when physicist Richard Feynman gave a talk in which he reasoned that computing based on classical logic could not tractably process calculations describing quantum phenomena. Computing based on quantum phenomena configured to simulate other quantum phenomena, however, would not be subject to the same bottlenecks. Although this application eventually became the field of quantum simulation, it didn’t spark much research activity at the time.
In 1994, however, interest in quantum computing rose dramatically when mathematician Peter Shor developed a quantum algorithm that could find the prime factors of large numbers efficiently. Here, “efficiently” means in a time of practical relevance, which is beyond the capability of state-of-the-art classical algorithms. Although this may seem like a mere curiosity, it is hard to overstate the importance of Shor’s insight: the security of nearly every online transaction today relies on the RSA cryptosystem, which hinges on the factoring problem being intractable for classical algorithms.
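To make the connection to factoring concrete, here is a toy, purely classical Python sketch of the number theory Shor’s algorithm exploits. The brute-force find_period helper and the choices N = 15 and a = 7 are illustrative assumptions only; the quantum part of Shor’s algorithm is what finds the period efficiently for numbers far too large for brute force.

```python
from math import gcd

# Toy illustration of the number theory behind Shor's algorithm.
# The quantum part of the algorithm finds the period r of
# f(x) = a^x mod N efficiently; here we find it by brute force,
# which is only feasible because N is tiny.

def find_period(a, N):
    """Smallest r > 0 with a**r % N == 1 (brute force, classical)."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

N = 15   # number to factor (illustrative)
a = 7    # base coprime to N (gcd(7, 15) == 1)
r = find_period(a, N)   # the step a quantum computer would do quickly
print("period r =", r)  # -> 4

# If r is even and a**(r//2) is not -1 mod N, the factors follow from gcds:
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print("factors:", p, q)  # -> 3 5
```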
What is Quantum Computing?
Quantum and classical computers both try to solve problems, but the way they manipulate data to get answers is fundamentally different. What makes quantum computers unique are two principles of quantum mechanics that are crucial for their operation: superposition and entanglement.
Superposition is the ability of a quantum object, like an electron, to exist in multiple “states” simultaneously. With an electron, one of these states may be the lowest energy level in an atom while another may be the first excited level. If an electron is prepared in a superposition of these two states, it has some probability of being in the lower state and some probability of being in the upper. A measurement will destroy this superposition, and only then can it be said that the electron is in the lower or upper state.
Understanding superposition makes it possible to understand the basic unit of information in quantum computing: the qubit. In classical computing, bits are implemented by transistors that can be off or on, corresponding to the states 0 and 1. In qubits such as electrons, 0 and 1 simply correspond to states like the lower and upper energy levels discussed above. Qubits are distinguished from classical bits by their ability to be in superpositions, with varying probabilities that can be manipulated by quantum operations during computations.
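As a minimal sketch, not tied to any particular hardware platform or quantum library, here is how a single qubit in an equal superposition, and what measurement does to it, can be simulated with plain numpy. The equal amplitudes and the single simulated measurement are illustrative choices:

```python
import numpy as np

# A minimal sketch of one qubit as a 2-component state vector.
# |0> and |1> stand in for, e.g., the lower and upper energy levels above.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# An equal superposition: amplitude 1/sqrt(2) for each basis state.
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2          # -> [0.5, 0.5]

# Simulate a measurement: the superposition "collapses" to 0 or 1.
rng = np.random.default_rng()
outcome = rng.choice([0, 1], p=probs)
psi_after = ket0 if outcome == 0 else ket1   # post-measurement state
print(outcome, probs)
```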
Entanglement is a phenomenon in which quantum entities are created and/or manipulated such that none of them can be described without referencing the others; their individual identities are lost. Entanglement is exceedingly difficult to grasp, especially when one considers that it can persist over long distances. A measurement on one member of an entangled pair will immediately determine the result of the corresponding measurement on its partner, making it appear as if information can travel faster than the speed of light. The effect was so disturbing that Einstein famously dubbed it “spooky action at a distance”.
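A small numpy sketch can make the correlation concrete. The two-qubit Bell state below is a standard textbook example (the choice of state and the ten sampled measurements are illustrative only): each individual outcome is random, yet the two qubits always agree.

```python
import numpy as np

# A sketch of the two-qubit Bell state (|00> + |11>) / sqrt(2),
# written as a 4-component vector over the basis |00>, |01>, |10>, |11>.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

probs = np.abs(bell) ** 2   # -> [0.5, 0, 0, 0.5]

# Sample repeated joint measurements: each result is random,
# but the two qubits always agree with each other.
rng = np.random.default_rng()
outcomes = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(outcomes)   # only "00" and "11" ever appear
```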
Building quantum computers is incredibly difficult. Many candidate qubit systems exist on the scale of single atoms, and the physicists, engineers, and materials scientists trying to execute quantum operations on these systems constantly deal with two competing requirements. First, qubits need to be protected from the environment, because the environment can destroy the delicate quantum states needed for computation. The longer a qubit survives in its desired state, the longer its “coherence time.” From this perspective, isolation is prized. Second, however, to execute algorithms, qubits need to be entangled, shuffled around physical architectures, and controllable on demand. The better these operations can be carried out, the higher their “fidelity.”
Superconducting systems, trapped atomic ions, and semiconductors are some of the leading platforms for building a quantum computer. Each has advantages and disadvantages related to coherence, fidelity, and ultimate scalability to large systems. It is clear, however, that all of these platforms will need some form of error-correction protocol to be robust enough to carry out meaningful calculations, and how to design and implement such protocols is itself a large area of research.
A different framework, being pursued by Microsoft, is topological quantum computation, in which qubits and operations are based on quasiparticles and their braiding operations. Even nascent implementations of the components of a topological quantum computer have yet to be demonstrated, but the approach is attractive because these systems are theoretically protected against the noise that destroys the coherence of other qubits.
In October 2019, Google officially announced that it had achieved quantum supremacy, in a paper published in the scientific journal Nature.
Google reported that its 54-qubit Sycamore processor was able to perform a calculation in 200 seconds that would have taken the world’s most powerful supercomputer 10,000 years. That would make the calculation, which involved sampling the outputs of a random quantum circuit (in effect, generating random numbers), essentially impossible on a traditional, non-quantum computer.
With time, the technology will get democratised and trickle down to the consumer. An industry around quantum-computing software and algorithms will then have truly arrived.
As the number of qubits in quantum computers increases, we will first see optimisation and data-access problems being solved. For example, with enough qubits, we could use quantum computers to assemble and sort through all possible gene variants in parallel, find all the pairs of nucleotides (the building blocks of DNA), and sequence the genome in a very short period of time; a toy search sketch follows below.
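As a loose illustration of the kind of search speedup alluded to above, here is a toy numpy simulation of Grover’s search over eight items. The list size, the marked index, and the classical simulation itself are illustrative assumptions; a real genomics workload would look nothing like this, but the sketch shows how the algorithm concentrates probability on the item being searched for in roughly the square root of the number of lookups a classical search would need.

```python
import numpy as np

# Toy state-vector simulation of Grover's search over N = 8 items,
# looking for one "marked" index. This is the textbook algorithm,
# simulated classically; it only illustrates the idea of a quadratic
# search speedup, not a real genomics workload.
N = 8
marked = 5                     # arbitrary item we are searching for

# Start in a uniform superposition over all N indices.
state = np.ones(N) / np.sqrt(N)

# Oracle: flip the sign of the amplitude of the marked item.
oracle = np.eye(N)
oracle[marked, marked] = -1

# Diffusion operator: reflect all amplitudes about their mean.
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

# Roughly (pi/4) * sqrt(N) iterations are optimal; here that is 2.
for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):
    state = diffusion @ (oracle @ state)

probs = state ** 2
print(probs.round(3))                      # probability piles up on index 5
print("most likely index:", int(np.argmax(probs)))
```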
This would revolutionise the health industry, as sequencing DNA at scale would allow us to understand our genetic makeup at a much deeper level. The results of access to that kind of knowledge are unfathomable.
Next, through significant improvements in our quantum capacity, we will be able to use quantum computers for simulating complex systems and behaviours in near real-time and with high fidelity.
Imagine simulating the earth’s winds and waves with such accuracy that we could predict storms days before they arrive. Imagine simulating how the winds on a particular day would interact with a flight on a particular route: it would allow us to anticipate turbulence, optimise flight paths, and prepare better in advance.
One promising candidate is PsiQuantum, whose photon-based approach is still years away, but the company’s big claim is that its technology will be able to string together 1 million qubits and distill out 100 to 300 error-corrected or “useful” qubits from that total. PsiQuantum has raised $215 million to build a computer with 1 million qubits, or quantum bits, within “a handful of years”. Terry Rudolph, the company’s chief architect, happens to be the grandson of famed quantum theorist and Nobel laureate Erwin Schrödinger.
Regardless of the path it takes, quantum computing is here to stay. It’s a key piece in the puzzle that is human growth. Ten years, 100 years, or maybe even 1,000 years down the line, we will wonder how we ever lived without quantum computers.