In a previous article, which was about useful-to-know Achilles’s heels of the current second wave of AI, I mentioned that these Achilles’s heels all stem from a single cause: the fact that our underlying technology is digital or, more correctly, discrete.
Current neural networks, for instance, are just data-driven rule-based systems in disguise (with hidden rules). In the end, a computer is a Turing machine, and the fact that I program it differently doesn’t change its fundamental capabilities. And these capabilities have limitations: the kind of intelligence you want is not logical and rule-based, and it cannot be emulated with logical rule-based approaches either.
That is why scientists all over the world are hard at work creating computing with a radically different foundation: the physics known as quantum mechanics, the weird behavior that underlies the reality that we experience every day. For instance, the fact that you can rest your hand on a seemingly solid table, while in fact it is mostly made up of nothing, is the result of the Pauli exclusion principle, one of the interesting rules of behavior of nature at the quantum level.
All this scientific work captures the imagination of many, and people may even believe that it will not be long before quantum computing takes over the world. You may find a lot of this on tellingly titled blogs like Next Big Future, where, for instance, it is simply stated: “Artificial Intelligence and quantum computers will transform IT and logistics.”
Will they? Maybe this is a good moment for another dose of realism. Let’s start with the digital computers that are all-pervasive these days and are the machines that work the current wonders (and Achilles’s heels) of AI.
Digital computers are classical logic machines or, as one of the founders of computing, Alan Turing, would have it, “discrete state machines.”

Where a digital computer has bits that are either 0 or 1 (making it discrete), analog computers have values that can be anything between two values (such as 0 and 1). Technically, digital computers also have states between 0 and 1, for instance while they are switching from one to the other, but the digital computer has a means (the clock) to ignore values during these intermediary states. As Turing said, digital computers physically exist because we forget about the states where the situation is between 0 and 1. But the hope of many is directed not at analog computers but at quantum computers: computers using the quantum physics that underlies our reality at the smallest scale.
Quantum computers as they are envisioned now are a very special kind of digital computer. Instead of each bit being either 0 or 1 and nothing else, a qubit (the bit of quantum computing) can during operation be 0 and 1 at the same time (a quantum-specific state called superposition). In such a state, each qubit has a probability of being 0 and a probability of being 1, the sum of which is 1. The moment a qubit is measured, it becomes either a 0 or a 1. Quantum computing is about not ignoring what happens when the bit is not 0 or 1, in combination with the weird “being two things at the same time” behavior of the smallest scale of our everyday reality.
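As a rough illustration, superposition-until-measurement can be sketched in a few lines of Python. This is a toy model, not real quantum mechanics: amplitudes and interference are left out, and the class name is made up for this sketch.

```python
import random

# Toy model of a qubit: described here only by its probability of
# reading 1 when measured. (Real qubits carry amplitudes, not just
# probabilities; this sketch ignores that.)
class ToyQubit:
    def __init__(self, p_one):
        assert 0.0 <= p_one <= 1.0
        self.p_one = p_one  # probability of measuring 1

    def measure(self):
        # Measurement forces the qubit into a definite 0 or 1...
        outcome = 1 if random.random() < self.p_one else 0
        # ...and the superposition is gone: from now on the qubit
        # deterministically reads as that outcome.
        self.p_one = float(outcome)
        return outcome

q = ToyQubit(p_one=0.5)   # "0 and 1 at the same time", 50/50
first = q.measure()       # randomly 0 or 1
second = q.measure()      # same as the first: the state has collapsed
assert first == second
```

The key property the sketch shows: before measurement there is only a probability; after measurement the outcome is fixed.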
The intuitive but false story about quantum computing goes like this:
Being in two states at once provides interesting possibilities for parallelism. Suppose you want to guess an eight-bit number, say an encryption key, that is, a value between 0 and 255. In an eight-bit digital computer, you have to try them all one by one. So, you start with the number 0 (eight 0 bits, or 00000000), then if that fails with the number 1 (seven 0 bits and one 1 bit, or 00000001), etc. If the number you have to guess is 255, you have to do your operation 256 times; on average, you have to do it 128 times.
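That classical one-by-one search can be written down directly. Here `check_key` is a hypothetical stand-in for whatever test recognizes the correct key, and the secret value is chosen to show the worst case.

```python
# Classical brute force over an 8-bit key: try candidates one by one.
SECRET = 0b11111111  # 255, the worst case for this search order

def check_key(guess):
    # Hypothetical oracle: in practice this would be, e.g., a trial
    # decryption that either succeeds or fails.
    return guess == SECRET

def brute_force_8bit():
    tries = 0
    for candidate in range(256):  # 00000000 .. 11111111
        tries += 1
        if check_key(candidate):
            return candidate, tries
    raise ValueError("no key found")

key, tries = brute_force_8bit()
assert key == 255 and tries == 256  # worst case: all 256 attempts
```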
First, the basics
I am simplifying the story, so it is technically not perfectly correct, but it does give the right idea.
A first, seemingly small, fly in the ointment is that qubits are very hard to make. Most technologies require temperatures near absolute zero, so making lots of qubits is a problem. The number of qubits working together is called the width of the quantum computer; scientists can now make roughly 70 qubits work together. The second fly in the ointment is that qubits are fundamentally unstable. You need to work on qubits in isolation, but this isolation cannot be perfect, so you can only perform a limited number of quantum operations before the whole thing crashes. That number of operations is called the depth of a quantum computer. And the larger the width, the harder depth becomes. No one knows if this problem is even solvable. There might be a hard limit around the corner.
To make matters worse, there is also fundamentally a chance to read (measure) a qubit wrong, or for quantum operations (the steps in a quantum algorithm) to operate incorrectly, which brings us to another fly: error rates. Quantum computing is fundamentally uncertain, and outcomes cannot be trusted perfectly. One way around this is to calculate many times (formally, a single run needs to be correct more than two thirds of the time for this to work). Another way is to have a digital computer check the result. The latter is possible when you are after answers that are hard to find but easy to check, such as an encryption key. So, what you preferably need is not just a quantum computer but an error-correcting quantum computer. And to have enough room for quantum error correction, you need lots and lots of qubits, which brings us back to the first fly.
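The “calculate many times” workaround can be sketched as a majority vote over repeated noisy runs. The 75 percent single-run success rate and the answer value below are assumptions chosen for the illustration; the point is only that repetition drives the error rate down when a single run is right more than two thirds of the time.

```python
import random
from collections import Counter

def noisy_run(correct_answer=42, p_correct=0.75):
    # Simulated quantum computation: returns the right answer with
    # probability p_correct, otherwise some wrong 8-bit value.
    if random.random() < p_correct:
        return correct_answer
    return random.randrange(256)

def majority_vote(runs=101):
    # Repeat the computation and take the most common outcome.
    counts = Counter(noisy_run() for _ in range(runs))
    answer, _ = counts.most_common(1)[0]
    return answer

random.seed(0)
# With 101 runs at 75% per-run accuracy, the majority is almost
# surely the correct answer, even though any single run may be wrong.
assert majority_vote() == 42
```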
But these two, potentially even unsolvable, issues are not even the real problem. For the real problem, you need to dive into the actual working of a quantum computer and why the wrong explanation above is actually wrong.
A true part of the “wrong story” is this: While a 64-bit number in a digital computer can have 2^64, or 18,446,744,073,709,551,616, distinct values, it can only have one such value at any time. A 64-qubit computer can, while it is calculating, have all these values simultaneously in a single 64-qubit memory. A comparable 64-bit computer would have to operate not on one single 64-bit number but on 288,230,376,151,711,744 64-bit values. Simultaneously. That’s a bit like having 288,230,376,151,711,744 CPUs in your computer. With current digital technology, such a computer would weigh more than 11,529,215,046,068 kilograms and use 28,823,037,615,171 megawatts of power. That’s 14 billion times the total average power that humanity currently uses. For a single computer. That’ll definitely put the final nail in the climate coffin.
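The counting here can be checked directly. Note that the quoted figure for the classical rival equals 2^64 divided by 64; whether dividing the number of values by the 64-bit word size was the intended reasoning is my assumption, not something the text states.

```python
# Check the counting in the comparison above.
distinct_values = 2 ** 64
assert distinct_values == 18_446_744_073_709_551_616

# The figure quoted for the hypothetical classical rival is exactly
# 2**64 // 64 (assumed reading: one 64-bit word per unit).
assert distinct_values // 64 == 288_230_376_151_711_744
```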
So, the 64-qubit computer is extremely efficient while it is operating on all the values in parallel. But then comes reading the calculation result, which in a quantum computer means measuring the state of those qubits. And here you run into quantum mechanics’ fundamental property: the measurement result is random. In quantum mechanics, individual events cannot be predicted; only statistical averages can. So, when you read a 64-qubit quantum computer’s 64-bit value at the end of an operation, you get a random number, not necessarily the answer you are looking for.
Here’s why: Suppose we have a qubit that can be in a mixed state of up and down. Each state can be thought of as a wave. The amplitudes of these waves, when squared, give the probability of that state, and these probabilities (the squared amplitudes) add up to 1, or 100 percent. But when you measure the qubit, it randomly (with the chances given by those squared amplitudes) becomes one of the states. In other words, when the qubit is in a state where it has an up amplitude of 0.3, it has a 0.3 squared, or 0.09 (9 percent), chance of becoming up when measured. It thus has a 0.91 (91 percent) chance of becoming down when measured, and that means the down amplitude is roughly plus or minus 0.95, as 0.95 squared is roughly 0.91. So even with a down amplitude of 0.95, measuring the qubit has a 9 percent chance of producing up. (Compare this to digital media: a bit that physically produces 95 percent of the amplitude of a 1 bit reads as a 1 with perfect certainty.)
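Plugging in the numbers from this paragraph, a short Python check confirms the arithmetic and simulates repeated measurements converging on those probabilities (the trial count is arbitrary).

```python
import random

# An "up" amplitude of 0.3 gives a 0.3**2 = 9% chance of reading up;
# the "down" amplitude is then sqrt(0.91) ~= 0.95, since the squared
# amplitudes must sum to 1.
up_amplitude = 0.3
p_up = up_amplitude ** 2          # 0.09
p_down = 1.0 - p_up               # 0.91
down_amplitude = p_down ** 0.5    # ~0.9539

assert abs(p_up - 0.09) < 1e-12
assert abs(down_amplitude - 0.9539) < 1e-3  # "roughly 0.95"

# Each single measurement is random, but the statistics of many
# measurements converge on the squared-amplitude probabilities.
random.seed(1)
trials = 100_000
ups = sum(1 for _ in range(trials) if random.random() < p_up)
assert 0.08 < ups / trials < 0.10  # close to the predicted 9%
```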