This year's Nobel Prize in physics was awarded to David Wineland and Serge Haroche for work related to the brave new field of quantum computing—which is devoted to building special computers with amazing capabilities based on the rules of quantum physics. Instead of storing data as bits—ordinary zeroes and ones—quantum computers will have exotic variants known as "qubits." So far, only toy quantum computers have been built. In the meantime, they are described in thousands of research articles, including a few by me.
Of course, I was thrilled that the Nobel Foundation recognized this field. However, I was dismayed to read in the press release that "a quantum computer of only 300 qubits could hold 2³⁰⁰ values simultaneously." Actually, 300 qubits cannot hold that many values in any usable sense; it is a mathematical fact (known as Holevo's theorem) that 300 qubits can reliably store, and give back, only 300 bits. This press release is part of a larger pattern of breathless exaggeration. In the name of accessibility, many popular accounts take quantum computing to implausible levels of hype.
What is really going on? You may imagine that a fast computer is a lot like a slower one, only with more or faster transistors. That picture is partly correct, but it misses the subtler half of computer science, namely advances in computer algorithms. Sometimes you get more out of the same transistors by reorganizing their efforts. Better still, while a new computer might be, say, 10 times faster, a new algorithm can be N times faster, where N measures the size of the problem, so its advantage keeps growing at larger scales. Even for a basic task such as multiplying large numbers, a smartphone with a good algorithm can beat a supercomputer with a bad one.
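The point about algorithmic speedup can be sketched in a few lines of Python (an illustration of my own, not code from any quantum library): two ways of finding a value in a sorted list of N items, one taking about N steps and the other about log₂(N).

```python
# Illustrative sketch: counting the basic steps two algorithms spend
# finding a value in a sorted list of N items.

def linear_search_steps(items, target):
    """Scan left to right; worst case about N comparisons."""
    steps = 0
    for x in items:
        steps += 1
        if x == target:
            return steps
    return steps

def binary_search_steps(items, target):
    """Halve the remaining range each time; worst case about log2(N)."""
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return steps
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

N = 1_000_000
data = list(range(N))
print(linear_search_steps(data, N - 1))  # 1000000 steps
print(binary_search_steps(data, N - 1))  # 20 steps
```

At a million items the better algorithm is already fifty thousand times faster, and the gap keeps widening as N grows; no amount of extra transistors matches that.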
Many computer algorithms make use of random choices. An opinion poll shows the advantage of such a strategy. By taking a random sample of voters, a pollster can get a good estimate of the answers from all voters. In a manner of speaking, one pollster can "simultaneously interview every voter." But not really. Random choices certainly can save work, but for a more subtle reason than that.
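The pollster's trick can be sketched with made-up numbers (a hypothetical electorate, not real polling data): a million voters, 54 percent of whom favor one candidate, estimated by interviewing only a thousand of them at random.

```python
# Illustrative sketch: estimating a vote share from a random sample,
# the way a pollster estimates all voters' answers without asking everyone.
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Hypothetical electorate: 1,000,000 voters, 54% of whom favor candidate A.
electorate = [1] * 540_000 + [0] * 460_000

sample = random.sample(electorate, 1_000)   # "interview" only 1,000 voters
estimate = sum(sample) / len(sample)

print(f"true share: 0.540, sampled estimate: {estimate:.3f}")
```

The estimate lands within a few percentage points of the truth after a thousandth of a percent of the work, which is the sense in which the pollster "interviews everyone at once."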
Enter quantum physics. The central discovery of quantum mechanics is that the rules of probability must be revised. For example, laser speckle, the twinkling of a laser pointer's spot, contradicts the ordinary rules of chance. Quantum randomness, which reigns in the atomic world, is different from ordinary randomness. The exclamation mark on that fact is quantum algorithms: if ordinary randomness yields faster algorithms, then for some problems quantum randomness is faster still. This amazing possibility was first proposed by physicist Richard Feynman and established spectacularly by mathematician Peter Shor and others.
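The revised rules of probability can be shown with the standard textbook "quantum coin flip" (a sketch in plain Python, not a quantum simulator): a qubit's state is described by amplitudes rather than probabilities, and amplitudes can be negative, so a second flip can cancel the randomness of the first.

```python
# Illustrative sketch: the Hadamard "coin flip" acting on a qubit's two
# amplitudes. Probabilities are the squares of the amplitudes.
import math

s = 1 / math.sqrt(2)

def hadamard(state):
    """Mix the two amplitudes, one combination carrying a minus sign."""
    a, b = state
    return (s * (a + b), s * (a - b))

def probs(state):
    """Squared amplitudes give the measurement probabilities."""
    return tuple(round(x * x, 3) for x in state)

qubit = (1.0, 0.0)        # start definitely in state |0>
once = hadamard(qubit)    # amplitudes (~0.707, ~0.707)
twice = hadamard(once)    # the minus sign cancels the second amplitude

print(probs(once))   # (0.5, 0.5)  -- looks like a fair coin
print(probs(twice))  # (1.0, 0.0)  -- interference undoes the randomness
```

Flip a classical coin twice and it is still 50/50; flip this quantum "coin" twice and it returns to its starting state with certainty. That cancellation of possibilities is what quantum algorithms exploit.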
But not for all problems. Many computer tasks aren't helped by random choices, much less by quantum randomness. Quantum computers would not, as a New York Times op-ed claimed, "make today’s fastest computers look like hand-cranked adding machines." I am one of about 100 researchers who have found quantum algorithms, and I disagree. Quantum algorithms would work miracles, but only sometimes, and it is an important research effort to determine when. In particular, despite the notorious quantum thought experiment known as "Schrödinger's cat," qubits do not really store all things at once, just as a pollster does not really interview all voters at once.
While laser speckle reveals quantum probability, a twinkling dot is a far cry from a useful computer. For a qubit to be more than just a bit or a useless speck, it must stay secret from the rest of the universe, even from tiny wisps of air. But if qubits are secret, how can you compute with them? In theory, there are ways. In practice, building even a few toy qubits can win you the Nobel Prize.
Quantum computing truly is an amazing field. We should just be careful not to describe it as so amazing as to be nonsense. As a politician might say, hype is not a strategy.