Quantum Supremacy (in Computing)

In 2012, Prof. John Preskill, a professor of theoretical physics at the California Institute of Technology in Pasadena, California, proposed the term “quantum supremacy” to describe the point at which quantum computers can do things that classical computers cannot, whether or not those tasks are useful.

Quantum mechanics emerged as a branch of physics in the early 1900s to explain nature on the scale of atoms and led to advances such as transistors, lasers, and magnetic resonance imaging. The idea to merge quantum mechanics and information theory arose in the 1970s but garnered little attention until 1982, when physicist Richard Feynman gave a talk in which he reasoned that computing based on classical logic could not tractably process calculations describing quantum phenomena. Computing based on quantum phenomena configured to simulate other quantum phenomena, however, would not be subject to the same bottlenecks. Although this application eventually became the field of quantum simulation, it didn’t spark much research activity at the time.

In 1994, however, interest in quantum computing rose dramatically when mathematician Peter Shor developed a quantum algorithm that could find the prime factors of large numbers efficiently. Here, “efficiently” means in a time of practical relevance, something beyond the capability of state-of-the-art classical algorithms. Although this may seem like a mere curiosity, it is hard to overstate the importance of Shor’s insight: the security of nearly every online transaction today relies on the RSA cryptosystem, which hinges on the intractability of the factoring problem for classical algorithms.
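The classical half of Shor’s insight can be shown in a few lines: factoring N reduces to finding the *order* r of a random base a modulo N, and the quantum speed-up lies entirely in finding r. The sketch below (a toy illustration, not Shor’s quantum algorithm itself) finds the order by brute force, which is exactly the step that becomes intractable classically for large N:

```python
from math import gcd

def factor_via_order_finding(N, a):
    """Classically mimic Shor's reduction: factor N via the order of a mod N."""
    if gcd(a, N) != 1:
        return gcd(a, N)          # lucky: a already shares a factor with N
    r = 1
    while pow(a, r, N) != 1:      # brute-force order finding (the hard step)
        r += 1
    if r % 2 != 0:
        return None               # need an even order; retry with another a
    x = pow(a, r // 2, N)
    if x == N - 1:
        return None               # trivial square root; retry with another a
    return gcd(x - 1, N)          # a nontrivial factor of N

print(factor_via_order_finding(15, 7))  # → 3
```

For N = 15 with base a = 7, the order is r = 4, giving the factor gcd(7² − 1, 15) = 3. A quantum computer finds r in polynomial time; the brute-force loop above takes time exponential in the number of digits of N.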

What is Quantum Computing?

Quantum and classical computers both try to solve problems, but the way they manipulate data to get answers is fundamentally different. What makes quantum computers unique are two principles of quantum mechanics crucial to their operation: superposition and entanglement.

Superposition is the ability of a quantum object, like an electron, to exist in multiple “states” simultaneously. With an electron, one of these states may be the lowest energy level in an atom while another may be the first excited level. If an electron is prepared in a superposition of these two states, it has some probability of being in the lower state and some probability of being in the upper. A measurement destroys this superposition, and only then can the electron be said to be in the lower or upper state.

Understanding superposition makes it possible to understand the basic component of information in quantum computing: the qubit. In classical computing, bits are transistors that can be off or on, corresponding to the states 0 and 1. In qubits such as electrons, 0 and 1 simply correspond to states like the lower and upper energy levels discussed above. Qubits are distinguished from classical bits by their ability to be in superpositions with varying probabilities that can be manipulated by quantum operations during computations.
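A qubit can be written down concretely as a two-component complex vector of amplitudes. The following minimal sketch (plain numpy, not any quantum SDK’s API) shows an equal superposition and the Born rule, under which measurement probabilities are the squared magnitudes of the amplitudes:

```python
import numpy as np

# Basis states |0> and |1> as 2-component complex vectors.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# An equal superposition: the qubit is "in both states at once".
psi = (ket0 + ket1) / np.sqrt(2)

# Born rule: measurement probabilities are |amplitude|^2.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]

# Measurement collapses the superposition to a single definite outcome.
outcome = np.random.choice([0, 1], p=probs)
```

Changing the amplitudes (for instance to √0.8 and √0.2) changes the measurement probabilities, which is exactly what quantum operations do during a computation.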

Entanglement is a phenomenon in which quantum entities are created or manipulated such that none of them can be described without referencing the others; their individual identities are lost. This phenomenon is exceedingly difficult to visualize when one considers how entanglement can persist over long distances. A measurement on one member of an entangled pair will immediately determine measurements on its partner, making it appear as if information can travel faster than the speed of light (in fact, no usable information does). This apparent action at a distance was so disturbing that Einstein famously dubbed it “spooky action at a distance”.

Building quantum computers is incredibly difficult. Many candidate qubit systems exist on the scale of single atoms, and the physicists, engineers, and materials scientists who are trying to execute quantum operations on these systems constantly deal with two competing requirements. First, qubits need to be protected from the environment, because interactions with it can destroy the delicate quantum states needed for computation. The longer a qubit survives in its desired state, the longer its “coherence time.” From this perspective, isolation is prized. Second, however, for algorithm execution, qubits need to be entangled, shuffled around physical architectures, and controllable on demand. The better these operations can be carried out, the higher their “fidelity.”
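The tension between coherence time and circuit depth can be made concrete with a back-of-the-envelope model. This is an illustrative sketch only — the exponential-decay model is a common simplification, and the T2 and gate-time values below are assumptions, not figures for any specific hardware:

```python
import numpy as np

# Coherence is often modeled as decaying exponentially with a
# characteristic time T2. A circuit of total duration t then retains
# roughly exp(-t / T2) of the original coherence, which is one reason
# longer coherence times permit deeper circuits.
T2 = 100e-6          # assumed coherence time: 100 microseconds
gate_time = 50e-9    # assumed duration of one gate: 50 nanoseconds

def remaining_coherence(n_gates):
    return np.exp(-n_gates * gate_time / T2)

print(remaining_coherence(100))     # ~0.95 after 100 gates
print(remaining_coherence(10_000))  # ~0.007 after 10,000 gates
```

Under these assumed numbers, a hundred-gate circuit is comfortable but a ten-thousand-gate circuit is hopeless — hence the drive for both longer coherence times and faster, higher-fidelity gates.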

Superconducting systems, trapped atomic ions, and semiconductors are some of the leading platforms for building a quantum computer. Each has advantages and disadvantages related to coherence, fidelity, and ultimate scalability to large systems. It is clear, however, that all of these platforms will need some type of error correction protocols to be robust enough to carry out meaningful calculations, and how to design and implement these protocols is itself a large area of research.
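The core idea behind error correction — redundancy plus majority voting — can be shown with a toy classical analogue of the simplest code, the repetition code. (Real quantum codes such as the surface code must also handle phase errors and cannot copy unknown states outright, so this is an analogy, not a quantum error-correction implementation.)

```python
import numpy as np

rng = np.random.default_rng(1)

def encode(bit):
    # Encode one logical bit into three physical bits.
    return [bit, bit, bit]

def noisy_channel(bits, flip_prob):
    # Flip each physical bit independently with probability flip_prob.
    return [b ^ (rng.random() < flip_prob) for b in bits]

def decode(bits):
    # Recover the logical bit by majority vote.
    return int(sum(bits) >= 2)

# With a 10% flip rate per bit, the logical error rate (two or three
# flips) drops to 3 * 0.1**2 * 0.9 + 0.1**3 = 0.028.
trials = 100_000
errors = sum(decode(noisy_channel(encode(0), 0.1)) for _ in range(trials))
print(errors / trials)  # close to 0.028
```

Trading three physical bits for one logical bit cuts the error rate from 10% to under 3%; quantum error-correction protocols make an analogous trade, typically at a much steeper ratio of physical to logical qubits.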

A different framework, being pursued by Microsoft, is topological computation, in which qubits and operations are based on quasiparticles and their braiding operations. While even basic components of topological quantum computers have yet to be demonstrated, the approach is attractive because these systems are theoretically protected against the noise that destroys the coherence of other qubits.

In October 2019, Google officially announced that it had achieved quantum supremacy, in a paper published in the scientific journal Nature.

Google says that its 54-qubit Sycamore processor was able to perform a calculation in 200 seconds that would have taken the world’s most powerful supercomputer 10,000 years. That would mean the calculation, which involved sampling the outputs of randomly generated quantum circuits, is essentially impossible on a traditional, non-quantum computer.
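A heavily simplified sketch of random circuit sampling shows why the classical cost explodes with qubit count. This is not Google’s actual experiment — here a Haar-random unitary stands in for a layer of random gates — but the bottleneck is the same: a classical simulator must track all 2ⁿ amplitudes, so 53 qubits would already require on the order of 2⁵³ complex numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_circuit_sample(n_qubits, depth, shots):
    dim = 2 ** n_qubits                 # classical memory grows as 2**n
    state = np.zeros(dim, dtype=complex)
    state[0] = 1.0                      # start in |00...0>
    for _ in range(depth):
        # A Haar-random unitary stands in for a layer of random gates.
        z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
        q, _ = np.linalg.qr(z)
        state = q @ state
    probs = np.abs(state) ** 2
    probs /= probs.sum()                # guard against float rounding
    return rng.choice(dim, size=shots, p=probs)

# 4 qubits is trivial; each added qubit doubles the state vector.
samples = random_circuit_sample(n_qubits=4, depth=3, shots=10)
print(samples)
```

A quantum processor produces such samples natively by running the circuit and measuring, which is what made this task a natural supremacy benchmark.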

With time, the technology will be democratised and trickle down to the consumer. An industry around quantum-computing software and algorithms will then have truly arrived.

As the number of qubits in quantum computers increases, optimisation and data-access problems will likely be among the first to be solved. For example, with enough qubits, we could use quantum computers to assemble and sort through all possible gene variants in parallel, find matching pairs of nucleotides (the building blocks of DNA), and sequence a genome in a very short period of time.

This would revolutionise the health industry, as sequencing DNA at scale would allow us to understand our genetic makeup at a deeper level. The results of access to that kind of knowledge are unfathomable.

Next, through significant improvements in our quantum capacity, we will be able to use quantum computers for simulating complex systems and behaviours in near real-time and with high fidelity.

Imagine simulating the earth’s winds and waves accurately enough to predict storms days before they arrive. Imagine simulating how the winds on a particular day would interact with a flight on a particular route: we could measure turbulence, optimise flight paths, and plan better in advance.

One promising candidate is PsiQuantum, which has raised $215 million to build a photon-based computer with 1 million qubits, or quantum bits, within “a handful of years”. Though the machine is still years away, the company’s big claim is that its technology will be able to string together 1 million qubits and distill 100 to 300 error-corrected, or “useful”, qubits from that total. Rudolph, the company’s chief architect, happens to be a grandson of the famed quantum theorist and Nobel laureate Erwin Schrödinger.

Regardless of the path it takes, quantum computing is here to stay. It’s a key piece in the puzzle of human progress. Ten years, 100 years, or maybe even 1,000 years from now, we will wonder how we ever lived without quantum computers.

————————————-

About mmpant

Prof. M. M. Pant has a Ph.D. in computational physics along with a professional law degree, and has been a practitioner in the fields of law, IT-enabled education, and IT implementation. Drawing upon his experience in world-class international institutions, and having taught face-to-face, through distance learning, and via technology-enhanced training, Prof. Pant is now exploring the nature of institutions that will be successors to the IITs (which represented the 1960s), the IIMs (the 1970s), and the Open Universities (the rage of the 1980s and 90s). He believes that the convergence between various media and technologies will fundamentally alter the way learning is created, packaged, and delivered to learners. His current activities are all directed toward the actual implementation of these new-age educational initiatives that transform education in the post-Internet, post-WTO era.

Prof. Pant is a former Pro-Vice Chancellor of the Indira Gandhi National Open University (IGNOU) and has been on the faculty of IIT Kanpur (the premier engineering institution in India), MLNR Engineering College, and, as a visiting professor, the University of Western Ontario, Canada. He has been a visiting scientist at research centres in Italy, England, Germany, and Sweden, and has delivered international lectures, with about 80 papers published. During his association of almost 15 years with IGNOU, Prof. Pant served as Director of Computing and has been a member of all its bodies (school boards, academic council, planning board, finance committee, and the board of management). With his interest in law, backed by practice in a High Court, and his basic training in science and IT, Prof. Pant has been particularly interested in cyber law, patent and trademark issues, and intellectual property rights (IPR), and has been involved with many activities and conferences on law and IT.

Prof. Pant is presently:
• Advisor to Media Lab Asia: chairman of the working group on ICT for education, and chairman of the PRSG handling projects on ICT for education
• Lead consultant for an ADB-funded project for ICT in basic education in Uzbekistan
• Member of the drafting group for India’s National Policy on ICT in Education
• Chairman of the group creating books for class 11 and 12 students on ‘Computers and Communication Technology’, appointed by the NCERT
• Preparing a theme paper for the NCTE in the area of ICT and teacher training
• Advisor and mentor to several leading Indian and multinational companies in the area of education

Prof. Pant has in the recent past been:
• Member of the Board of Management, IIT Delhi, for 6 years (two consecutive terms)
• One-man committee to create the project report and legislation for the Delhi IT-enabled Open University
• Advisor to the Delhi Government on the Asian Network of Major Cities (ANMC-21) distance-learning project, in association with the Tokyo Metropolitan Government
• Chairman of the Board of Studies, All India Management Association

With his mission to create and implement new business opportunities in the area of e-learning and learning facilitation, Prof. Pant has promoted Planet EDU Pvt. Ltd. as its Founder & Chairman, along with a team of highly experienced and skilled professionals from education and training, operations, IT, and finance.
