Tuesday, April 7, 2020

Let’s Break Down Quantum Computing

A brief introduction to the most disruptive technology of our time


A look inside the D-Wave 2000Q™ System. Photos courtesy of D-Wave. For more information, go to dwavesys.com.


By Tanisha Bassan | The past decade has seen the rapid and continuous growth of computational power, a phenomenon consistent with the predictions outlined in Moore’s Law*. This increase has led to numerous technological innovations, such as graphics processing units powerful enough to run machine learning algorithms and virtual reality simulations, and to host whole blockchain ledgers.


However, as our transistors decrease in size, the plateau in computational growth that Moore himself anticipated comes into view. This limitation occurs because, as chip features approach atomic scale, the laws of quantum mechanics start interfering with the reliable transfer of energy in the circuits.


To continue pushing the limits of what is possible with computation, renowned theoretical physicist Richard Feynman pioneered the field of quantum computing, which utilizes laws of quantum mechanics to enable computational power like never before.


Superposition and entanglement


A lattice of 2000 qubits, chilled close to absolute zero to harness quantum effects.


Quantum mechanics describes two fundamental properties of how subatomic systems behave: superposition and entanglement. Superposition allows an electron to be in many states at the same time through wave-particle duality; simply put, electrons can act like a wave.


Imagine if I draw an X on a page of a book in a library filled with millions of other books and ask someone to find the X. This would classically be done by going through every page of every book until they find the X, which is a daunting task, to say the least.


However, if that same person were in superposition, they would have the ability to look at a large number of pages and books at the same time, drastically reducing the time needed to find the X, which is the essential premise of quantum computers.
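This library analogy is essentially the premise of Grover’s search algorithm. As a hypothetical back-of-the-envelope sketch (the numbers below are illustrative, not from the article): finding one marked item among N requires on the order of N classical checks, but only about (π/4)·√N quantum iterations.

```python
import math

def classical_checks(n_items: int) -> int:
    # Worst case: examine every item until the marked one is found.
    return n_items

def grover_iterations(n_items: int) -> int:
    # Grover's algorithm needs roughly (pi/4) * sqrt(N) oracle queries.
    return math.ceil((math.pi / 4) * math.sqrt(n_items))

for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"N={n:>13,}: classical ~{classical_checks(n):>13,} checks, "
          f"quantum ~{grover_iterations(n):,} iterations")
```

For a billion pages, that is roughly a billion classical checks versus about 25,000 quantum iterations: a quadratic, not exponential, speedup, but a dramatic one nonetheless.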


Entanglement is a unique connection between two subatomic particles that persists no matter how far apart they are. This property is special because measuring one particle can reveal information about its partner.


For example, if an electron measured spinning up on Earth is entangled with an electron on Mars, then we automatically know that the electron on Mars is spinning down. Subatomic particle behavior can be strange but is intricately woven into the fabric of the universe.
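The anticorrelated pair described above can be sketched in a few lines. This is a toy simulation (the four-amplitude representation and the function names are my own, not a real quantum SDK) of the Bell state (|01⟩ + |10⟩)/√2, where one qubit is always up and the other down:

```python
import math
import random

# Two-qubit state as amplitudes for |00>, |01>, |10>, |11>.
# The Bell state (|01> + |10>)/sqrt(2): the two qubits always disagree.
bell = [0.0, 1 / math.sqrt(2), 1 / math.sqrt(2), 0.0]

def measure(state, rng):
    """Sample a basis state with probability |amplitude|^2."""
    probs = [abs(amp) ** 2 for amp in state]
    outcome = rng.choices(range(4), weights=probs)[0]
    return outcome >> 1, outcome & 1  # (first qubit, second qubit)

rng = random.Random(0)
for _ in range(5):
    earth, mars = measure(bell, rng)
    assert earth != mars  # the outcomes are always opposite
    print(f"Earth qubit: {earth}, Mars qubit: {mars}")
```

Each individual outcome is random, yet the pair is perfectly anticorrelated, which is exactly the correlation the Earth–Mars example describes.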




Superposition and entanglement form the basis of how quantum computing works, and qubits are the quantum bits that use these properties to carry information. Examples include superconducting, photonic, topological, and trapped-ion qubits. The scientific community has yet to reach a consensus on which type of qubit is best for quantum computers.


Qubits are placed on a quantum circuit along with gates, which are operations performed on each qubit to put it into a superposition state. The most common circuit design uses superconductors, in which current can flow one way, the other way, or in a superposition of both directions.
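The effect of such a gate can be illustrated with a minimal pure-Python state vector (a sketch, not a real device API). Applying a Hadamard gate to a qubit starting in |0⟩ puts it into an equal superposition of |0⟩ and |1⟩:

```python
import math

# A single qubit as two amplitudes: [amp(|0>), amp(|1>)].
ket0 = [1.0, 0.0]

def hadamard(state):
    """Apply the Hadamard gate H = 1/sqrt(2) * [[1, 1], [1, -1]]."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

superposed = hadamard(ket0)
probabilities = [abs(amp) ** 2 for amp in superposed]
print(probabilities)  # ~[0.5, 0.5]: measuring yields 0 or 1 with equal chance
```

Before measurement the qubit holds both possibilities at once; measuring collapses it to a single classical bit, 0 or 1, each with 50% probability here.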


There are already companies making strides in creating qubits using trapped ions or photons, and Microsoft is building a topological qubit. There are different ways of building qubits to possess quantum properties; however, all of them aim to achieve ‘quantum supremacy,’ or the advantage quantum computers have over regular computers.





The scalability of qubit processors is vital to making quantum computers a reality, and so far Google has the largest chip at 72 qubits. However, we will need millions of qubits to achieve the kind of computing technologists envision, and decoherence, a qubit's unwanted interaction with its environment, is the biggest obstacle to overcome.


The challenge of decoherence lies in the hardware, specifically in connecting a grid of qubits such that no errors are introduced into the system, which is why error correction is such a significant focus for quantum computing scientists.


Errors are introduced when qubits interact unintentionally with the environment. Cosmic rays, heat, light, and even the act of measuring a qubit all create errors in the system. The current practice is to isolate the system and cool it to near absolute zero to decrease the influence of decoherence.


Annealers, emulators, and universal quantum computers


There are three levels of quantum computers. Quantum annealers are the most basic form; they are extremely noisy and error-prone, and are only suited to optimization problems. Quantum simulators, or emulators, are computers that can efficiently simulate the quantum properties of nature; one important application would be simulating complex molecules to transform the drug discovery industry. The last tier is the universal quantum computer, which is fully fault-tolerant and displays every aspect of quantum advantage. We are still around 50 years away from this technology.


We are currently in the noisy intermediate-scale quantum (NISQ) era, with devices of roughly 50 to 100 qubits that suffer from errors and limits on the number of gates that can be applied in a circuit. A noisy device can only hold its quantum state for a limited time before it decoheres: the qubits relax to their ground state and the information is lost. Despite this volatility, such devices can still perform useful calculations.
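Why the gate count is limited can be seen with a rough, hypothetical model (the error rates below are assumed for illustration, not measured from any hardware): if each gate succeeds with probability 1 − p, the circuit's fidelity decays exponentially with its depth.

```python
import math

def circuit_fidelity(gate_error: float, depth: int) -> float:
    # Fidelity after `depth` gates, each failing independently
    # with probability `gate_error`.
    return (1 - gate_error) ** depth

def max_useful_depth(gate_error: float, min_fidelity: float = 0.5) -> int:
    # Deepest circuit whose fidelity stays above `min_fidelity`.
    return int(math.log(min_fidelity) / math.log(1 - gate_error))

for p in (0.01, 0.001):
    print(f"gate error {p}: useful depth ~{max_useful_depth(p)} gates")
```

At a 1% per-gate error rate, only a few dozen gates fit before the result is as likely wrong as right, which is why NISQ algorithms keep circuits shallow and why error correction matters so much.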


Applications of quantum computing


D-Wave Systems is an excellent example of a company that’s succeeding with quantum annealers. Its quantum computers are sold commercially to organizations like NASA for upwards of US$15 million. The team is using quantum computers to address problems like Japan’s traffic congestion by identifying optimal routes for drivers, a task similar to the traveling salesman problem, a class of optimization that quantum annealers are particularly well suited to.
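To see why routing problems like this overwhelm classical brute force, here is a quick combinatorial sketch (pure counting, not D-Wave's actual solver): the number of distinct round-trip tours through n cities is (n − 1)!/2, which explodes factorially.

```python
import math

def route_count(n_cities: int) -> int:
    # Distinct tours visiting every city once and returning home:
    # (n - 1)! / 2 for an undirected round trip.
    return math.factorial(n_cities - 1) // 2

for n in (5, 10, 20):
    print(f"{n} cities: {route_count(n):,} possible routes")
```

Five cities give 12 routes, but twenty cities already give about 6 × 10¹⁶, which is why heuristics and annealing approaches are used instead of exhaustive search.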


There is also an important convergence between quantum computing and machine learning. Creative Destruction Lab is mentoring new startups in the quantum machine learning (QML) space to solve important problems using such frameworks. Other notable companies include Rigetti, IBM, Microsoft, and Intel. Startups like Toronto-based Xanadu are creating photonic quantum computers.


Quantum computers could break through the barriers of our computational capabilities and solve the world’s most complex problems. The impact is unfathomable if we can start processing far larger amounts of data, simulating nature, and solving in polynomial time problems that are classically intractable.


It’s imperative for the world’s leading scientific minds to focus their attention on growing this interdisciplinary field. In the future, all companies will be backed by quantum computers and AI, which together could disrupt every industry as we know it.


*The observation that the number of transistors in a densely integrated circuit doubles about every two years.


About the Author


Tanisha Bassan is a quantum computing developer who is passionate about pioneering practical applications in this industry. At 17 years old, Tanisha has built a quantum game using Xanadu’s Strawberry Fields, simulated quantum error correction, and built quantum circuits. She participated in the world’s first quantum computing hackathon at Rigetti and built a drug discovery model using the QAOA algorithm. Her mentors are from the University of Waterloo and the University of Toronto, where she is attending the first quantum computing graduate course. She is currently working on QML classifications of quantum circuits and hopes to help build the foundations upon which we can use this technology to solve important problems globally.

