There are two main schools of thought in physics: conventional physics (often called classical or Newtonian physics) and quantum physics. Generally speaking, conventional physics is about big stuff: gravity, forces, light and so on. Quantum physics is about what happens when you go really small, down to atoms, sub-atomic particles and beyond.

The theories of conventional physics and quantum physics both explain what happens in the world. However, here's where it starts to get complicated: they contradict each other, even though each seems to be correct in its own domain. There is presumably a crossover point where one stops working and the other takes over, but pinning down exactly where and how that happens is still at the cutting edge of research.

Quantum physics and computing

What does this have to do with computers? 

First, a quick explainer of computers. Computers are basically made up of huge numbers of transistors, which act as switches. Each switch can either allow a stream of electrons (sub-atomic particles) to pass through it or block them. Each switch can therefore be either on or off, a state of either 1 or 0 in computer terms; each 1 or 0 is one piece of information, or bit. The transistors are combined into logic gates, and then into much larger processors, but for our purposes, it is enough to know about the switches.
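As a rough software analogy (and it is only an analogy, not how the hardware is physically built), you can model each switch as a value that is either 1 or 0, and a logic gate as a rule that combines those values. Here is a minimal Python sketch of that idea; the gate names are standard, but everything else is purely illustrative.

    # Illustrative only: treating switch states as 1/0 and gates as small functions.
    def AND(a, b):
        # Output is on (1) only if both input switches are on.
        return 1 if (a and b) else 0

    def NOT(a):
        # Flip the switch state.
        return 0 if a else 1

    def NAND(a, b):
        # NAND is 'universal': every other gate, and hence a whole processor,
        # can in principle be built out of NAND gates alone.
        return NOT(AND(a, b))

    # Print the truth table for NAND over every combination of inputs.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", NAND(a, b))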

Here's the issue. Transistors have now become so small that their finest features are only a few nanometres across, just a handful of atoms wide. The problem is that this is roughly the scale at which conventional physics stops working and quantum physics kicks in. It turns out that when things get this small, electrons don't necessarily behave the way you would expect. Instead of playing nice and waiting for the switch to turn on, they can 'jump' straight through the barrier to the other side in a process called 'quantum tunnelling'.

In other words, conventional transistors can't be made much smaller.

If we want to keep increasing computer power for the same size of processor, we need to find another way—and quantum computing may be it.

The theory of quantum computing

In conventional computing, each bit of information can be in the form of either 1 or 0. If you have four bits together, therefore, you can have 16 possible configurations (0000, 0001, 0010, 0011, etc.). However, the four bits can only be in one configuration at a time.
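To make the counting concrete, here is a short Python sketch (purely illustrative) that lists every configuration of four classical bits. There are 2 to the power of 4, i.e. 16, of them, and at any instant a classical register holds exactly one.

    from itertools import product

    # Every possible configuration of four classical bits: 2**4 = 16 in total.
    configs = ["".join(str(b) for b in bits) for bits in product([0, 1], repeat=4)]
    print(len(configs))   # 16
    print(configs[:4])    # ['0000', '0001', '0010', '0011']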

In quantum computing, you can't just rely on ordinary streams of electrons behaving classically; you need a physical system that shows quantum behaviour. One common approach uses photons, which are light expressed as particles (in quantum physics, light behaves as both a wave and a stream of particles), though qubits can also be built from other systems such as trapped ions and superconducting circuits.

The bits of information are called qubits in quantum computing, and like bits, each can be measured as either 0 or 1. However (and here's where it gets really complicated), until you measure them, they are not actually in either state. Instead, they are in a 'superposition': a blend of the two, with a certain probability of turning out to be 0 and a certain probability of turning out to be 1 when you do measure.

In other words, a qubit is not restricted to two fixed values; its state can be any of a continuous range of blends of 0 and 1. More importantly, qubits can be in superposition together: four qubits don't have to sit in just one of the 16 configurations, they can hold a combination of all 16 configurations at once.
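A slightly more precise picture: the joint state of four qubits is described by 16 numbers called amplitudes, one for each classical configuration, and when you measure, you get a single configuration with a probability given by the squared size of its amplitude. The sketch below simulates that bookkeeping on an ordinary computer with NumPy; it is a toy model of the maths, not a real quantum device, and the equal superposition chosen here is just one example state.

    import numpy as np

    n_qubits = 4
    dim = 2 ** n_qubits            # 16 basis states for 4 qubits

    # Equal superposition: all 16 configurations carry the same amplitude.
    state = np.ones(dim, dtype=complex) / np.sqrt(dim)

    # Measurement probabilities are the squared magnitudes of the amplitudes.
    probs = np.abs(state) ** 2
    print(probs.sum())             # 1.0 -- probabilities always sum to one

    # Measuring 'collapses' the state to one classical configuration at random.
    outcome = np.random.choice(dim, p=probs)
    print(format(outcome, "04b"))  # e.g. '1011' -- a single 4-bit result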

What this means for computing

Without thinking too hard about it, you can probably see that this could enormously increase the computing power at your disposal. Conventional computers are now very fast, but for certain kinds of problems a quantum processor could be dramatically faster, doing in minutes or hours calculations that currently take weeks or months.

This would make it possible to do things that we can only dream about now: huge calculations, for example. The applications are coming up in our next story.
