We’ve looked before at some of the most interesting developments in modern computing and development, the kind of advances that could represent fundamental changes in the way business and society work. These include developments like blockchain and neural networks, but today we’re going to look at something even more fundamental that could potentially change the way computers handle information at the most basic level: quantum computing.

But before we get into the potential applications of this technology, a quick lesson in the basics of physics and computer science might be helpful. (Don’t worry, it won’t get very technical.)

Beyond (Or Between) Zeros and Ones

We all know that computer processors run on binary code that reduces all data to a series of zeros and ones. In normal (or “classical”) computing, all information is encoded in these bits, which can only ever exist as zeros or ones.

Quantum computing, on the other hand, takes advantage of our knowledge of how certain subatomic particles behave. To cut out a lot of very complex physics, quantum computing allows pieces of data (called quantum bits, or qubits) to exist in multiple states simultaneously. (In quantum mechanics this is called superposition.) That means that a qubit, unlike a bit, can be a zero, a one, or any combination of zero and one simultaneously.
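The idea of a qubit holding a combination of zero and one can be made a little more concrete with a toy sketch in Python. This is not a real quantum simulator, just an illustration of the underlying bookkeeping: a qubit's state is described by two amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1.

```python
import math

# Toy illustration (not a real quantum simulator): a single qubit's
# state is two complex amplitudes, one for |0> and one for |1>.
# The squared magnitudes must sum to 1 and give the probability of
# measuring 0 or 1, respectively.
def measurement_probabilities(amp0, amp1):
    p0 = abs(amp0) ** 2
    p1 = abs(amp1) ** 2
    assert math.isclose(p0 + p1, 1.0), "amplitudes must be normalized"
    return p0, p1

# An equal superposition: the qubit is "both" 0 and 1 until measured,
# with a 50/50 chance of each outcome.
equal = 1 / math.sqrt(2)
p0, p1 = measurement_probabilities(equal, equal)
print(p0, p1)  # approximately 0.5 and 0.5
```

A classical bit would force `amp0` or `amp1` to be exactly 1 and the other 0; the qubit is allowed any normalized mix in between.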

The other major property of quantum computing systems is that the state of every qubit is connected to the state of every other qubit in the system. (In quantum mechanics this is called entanglement, but again you don’t have to worry about the exact physics.) For every qubit you add to the system, the total processing power roughly doubles, since you’re essentially doubling the number of states the system is capable of modeling simultaneously.
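The doubling described above is easy to see in a few lines of Python: a register of n qubits can hold a superposition over 2^n basis states.

```python
# Each qubit you add doubles the number of basis states the register
# can hold in superposition: n qubits span 2**n states.
states = {n: 2 ** n for n in (1, 2, 10, 50)}
for n, count in states.items():
    print(f"{n} qubits -> {count:,} simultaneous states")
```

By 50 qubits the count exceeds a quadrillion, which is the intuition behind the "exponentially more complex computations" in the next section.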

Advantages Over Classical Computing

What this means in practice is that processors made up of qubits can potentially handle exponentially more complex computations than a classical computer with the same number of binary bits.

Dr. Talia Gershon of IBM uses the example of guests at a dinner table. How many ways are there to arrange 10 people around a table? The answer is 10! (10 factorial), or roughly 3.6 million, and every extra person you add multiplies the number of arrangements yet again. So what do you do if you want to find the best way to seat those people?
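The factorial growth in the dinner-table example above (treating each seat as distinct, as the article does) is easy to check:

```python
import math

# The number of ways to arrange n guests across n distinct seats is
# n! (n factorial), which grows faster than exponentially.
print(math.factorial(10))  # 3628800 -- roughly 3.6 million
print(math.factorial(11))  # 39916800 -- one extra guest, 11x as many
```

At 20 guests the count already exceeds 2 quintillion arrangements.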

This is the kind of question that quickly overwhelms a classical computer, which has to evaluate each possible combination one at a time and then compare them. Quantum computing takes a totally different approach, using superposition and entanglement to represent all 3.6 million possibilities simultaneously, and then using interference (the same wave behavior that lets ripples amplify or cancel each other) to boost the promising possibilities and suppress the rest until the best answer emerges.
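The quantum side of this (interference-based amplitude amplification) can't be captured in a few lines of ordinary code, but the classical brute force it replaces can be sketched. The scoring function below is a made-up placeholder purely for illustration; the point is that the loop must run n! times.

```python
from itertools import permutations

# Classical brute force: score every arrangement one at a time and
# keep the best. For n guests this loop runs n! times.
def best_arrangement(guests, score):
    best, best_score = None, float("-inf")
    for order in permutations(guests):
        s = score(order)
        if s > best_score:
            best, best_score = order, s
    return best

# Hypothetical score: prefer arrangements where neighbors' values
# (standing in for compatibility ratings) are close together.
guests = [3, 1, 4, 1, 5]
score = lambda order: -sum(abs(a - b) for a, b in zip(order, order[1:]))
print(best_arrangement(guests, score))
```

Five guests means only 120 arrangements, so this finishes instantly; at 20 guests the same loop would run over 2 quintillion times, which is the wall a quantum approach aims to sidestep.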

If this all sounds confusing, it’s because it represents a totally different way to think about solving complex problems.

Note that, in the above example, this same process could be applied just as easily to cracking encryption keys as to arranging people around a dinner table. We’ve written before that it’s practically impossible for a classical (i.e. regular) computer to crack modern forms of encryption, but that’s not the case for a hypothetical quantum computer.

Other applications of quantum computing include:

  • Modeling chemical compounds to aid in pharmaceutical research
  • Optimizing complex supply chains
  • Modeling complex financial data to identify risks and opportunities
  • Increasing the processing power of neural networks
  • Modeling the effects of climate change under different scenarios

The Current State of Quantum Computing

While quantum computing has the potential to change the way we approach many of the world’s most complex problems, the technology itself is still in its infancy. As of this writing, the most powerful quantum computers still can’t hold a candle to classical supercomputers.

That’s because the properties of quantum computing also create their own shortcomings, including high error rates and the need to keep the machines at extremely low temperatures (approaching absolute zero). Researchers at major universities and at companies like Google, IBM, and Intel are all racing to develop more stable qubit processors. Even so, it could be years before quantum computers become commercially viable.