Quantum Computing: Available Yet Distant
In the push to accelerate insights from ever more data, enterprises are hoping for a quantum computing breakthrough. Is the technology ready for prime time?
- By Brian J. Dooley
- March 27, 2020
When Google announced in October 2019 that it had achieved quantum supremacy -- that its quantum computer performed a calculation that would be impossible, or take prohibitively long, on a supercomputer -- the tech and business worlds stood up and took notice. Not only did quantum supremacy sound like a huge achievement, but it also seemed to signal that quantum computers would soon be ready for practical use -- something expected eventually but certainly not now. Unfortunately, the details do not support such a conclusion.
The supercomputer Google chose for comparison was IBM's Summit at the Department of Energy's Oak Ridge National Laboratory, then the fastest computer in the world. IBM immediately returned with an objection, saying that if Summit were properly configured for the problem, it could produce the same result in a matter of days rather than the 10,000 years Google estimated. The calculation itself was a highly specialized random circuit sampling problem, so it was not, strictly speaking, the kind of calculation that would clearly demonstrate quantum supremacy.
Nonetheless, the claim created great interest and reignited the conversation about quantum computers in general. That renewed attention is perhaps more important than the appearance of superiority itself. In any case, the Google announcement is now considered a milestone, demonstrating the current state of the art.
Quantum computers are nowhere near providing a viable working platform for the kinds of computations envisioned for this technology. In their current state, the machines are far too unreliable, and the viable lifespan of qubits (the quantum bits that make quantum computing possible) before they decohere is extremely brief. There are also very few qubits in any current general-purpose machine. Google's Sycamore processor used 53 qubits (its earlier Bristlecone chip had 72), IBM has a 53-qubit system available to customers, and others are at a similar level. (Canadian company D-Wave has 5,000 qubits in a special-purpose "annealing" machine, but it is devoted to a very limited set of problems.)
The second issue is that there is no mature verification or error-correction method for qubit-based systems. Standard computers have numerous ways of checking validity using extra bits; qubits are problematic in this regard, and a qubit-based verification scheme would entail adding thousands of physical qubits to even a small system. These details have yet to be worked out.
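The classical "extra bits" approach mentioned above can be illustrated with a simple even-parity check -- a minimal sketch in Python for illustration only, not any particular quantum error-correction scheme:

```python
def add_parity_bit(bits):
    # Append one check bit so the total count of 1s is even.
    return bits + [sum(bits) % 2]

def check_parity(word):
    # Any single flipped bit makes the count of 1s odd, exposing the error.
    return sum(word) % 2 == 0

word = add_parity_bit([1, 0, 1, 1])   # -> [1, 0, 1, 1, 1]
check_parity(word)                    # -> True
word[2] ^= 1                          # simulate a single-bit error
check_parity(word)                    # -> False
```

Quantum states cannot simply be copied and read out for checking like this, which is why proposed quantum error-correction codes spread one logical qubit across many physical qubits.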
A third problem is that the actual processes and limitations of quantum computing are only gradually coming into focus. Quantum computing is not likely ever to replace conventional computing. One significant and overarching difference is that a quantum computation produces a range of answers rather than the single result we are accustomed to: each possible outcome is returned with some probability. This is useful only for certain types of problems. In fact, problems whose structure resembles that of quantum systems are generally the best fit, including simulation, encryption, organic growth, and machine learning, among others.
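The probabilistic nature of the results can be mimicked classically -- a toy sketch with made-up amplitudes for a hypothetical two-qubit state, not a real quantum simulation:

```python
import random
from collections import Counter

# Hypothetical amplitudes; the probability of each outcome is the amplitude squared.
amplitudes = {"00": 0.8, "01": 0.4, "10": 0.4, "11": 0.2}
probs = {state: a * a for state, a in amplitudes.items()}   # sums to 1.0

# Each run of the machine yields ONE of these strings; only by repeating the
# computation many times does the answer -- a distribution -- emerge.
shots = random.choices(list(probs), weights=list(probs.values()), k=1000)
print(Counter(shots))   # "00" dominates at roughly 64% of shots
```

Extracting a usable answer therefore means running the same circuit many times and reading off the most probable outcomes, which is part of why only certain problem types benefit.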
Apart from the issue of decoherence in quantum systems, qubits also require extreme cooling and are subject to interference from a wide variety of sources. Without sufficient error correction, this becomes quite serious very quickly. Work is being done to create so-called "high temperature" qubits based on carbon nanotubes, silicon, and silicon compounds. ("High temperature" here means, at most, something close to room temperature.) This is one of the areas that must develop before quantum computing comes into its own.
Nevertheless, progress is being made both in machine development and in supporting technologies. Programming for quantum computers is aided principally by as-a-service cloud offerings from IBM, Amazon, and Microsoft. These make it possible for programmers to study algorithms, understand how quantum computing will operate, and imagine new quantum possibilities without excessive capital investment.
The fact that there are a dozen potential architectures for quantum machines permits a greater range of experimentation with implementation; for example, D-Wave's designs with relatively huge qubit counts may be limited in scope, but they also provide experience in working with, and programming for, large qubit systems.
Some applications are currently in use (or nearly so) on quantum computers, but they are fairly limited, and the main intention is generally to understand quantum computing before it arrives at an enterprise's doorstep. Knowing how to program critical problems -- such as breaking cryptography, optimizing machine learning, and simulating biological processes -- will make it possible to gain an immediate advantage, and software patents, before the competition can move.
The Shape of Things to Come
Some researchers have pushed back at the idea of any significant near-term commercial applications for quantum computing, saying that it will be a very long time until such machines are ready for business. The verification problem and the extremely low temperatures required, along with the difficulty, even after a decade of work, of building a general-purpose machine with more than 100 or so qubits, have dampened expectations until the technology achieves a significant practical breakthrough.
Because results are probabilistic rather than exact, it is also thought that quantum processors will operate mainly as one part of a hybrid machine in which standard computing components, neuromorphic components, and qubits work together to produce a result. This argument makes sense given current quantum computing limitations and the ways machine learning and quantum systems can supplement each other.
After the furor subsides, the quantum supremacy question may prove moot. Other, more practical measures of quantum computing are being advanced that may be more appropriate as the machines mature. These include Quantum Volume from IBM, which scores performance based on qubit count, coherence, and measurement errors, and Quantum Practicality from Intel, defined as the point at which quantum computers become truly useful.
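IBM's Quantum Volume reduces to a single number: 2 raised to the size of the largest "square" random circuit (equal width and depth) the machine runs successfully. A simplified sketch, assuming the pass/fail benchmark results are already in hand:

```python
def quantum_volume(passed):
    # passed maps circuit size n (n qubits at depth n) to whether the
    # machine passed IBM's heavy-output benchmark at that size.
    best = max((n for n, ok in passed.items() if ok), default=0)
    return 2 ** best

quantum_volume({2: True, 3: True, 4: True, 5: False})   # -> 16
```

The exponential scale means each additional working qubit layer doubles the score, which is why vendors report Quantum Volume rather than raw qubit counts.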
There are also measures of performance for individual components, such as calibration errors, circuit optimization, coherence, crosstalk, and gate fidelity. This reflects the current status of these systems -- inchoate, intriguing, and maybe coming to wider practical usage this year.
Or maybe not.
Brian J. Dooley is an author, analyst, and journalist with more than 30 years' experience in analyzing and writing about trends in IT. He has written six books, numerous user manuals, hundreds of reports, and more than 1,000 magazine features. You can contact the author at firstname.lastname@example.org.