The Death of Quantum Supremacy and Birth of Quantum Advantage
A new way of thinking about quantum computing sets more realistic goals for this technology.
- By Rob Enderle
- April 13, 2023
Quantum computing has been under development for decades, and it is a potential game changer in several areas, including networking, security, and extremely large data set analysis. You probably know that generative AI is disrupting markets ranging from advertising and search to computer interfaces (including automotive). Quantum computing will be similarly disruptive and could be applied to make generative AI far more powerful.
The State of Quantum Computing
Currently, quantum computing is only truly relevant to those working on advanced computer development and research, on advanced encryption, or with extremely high-speed networking, and who need to work with data sets that existing supercomputers struggle with. If you want to model the world’s weather accurately, for instance, the size of the related data set would bring even the most powerful supercomputer to its knees. A quantum computer is designed to take on such tasks because, by exploiting superposition and entanglement, it can explore an enormous number of possibilities at once.
Encryption is a major area of interest because the security industry expects that a sufficiently powerful quantum computer could break today’s widely used public-key encryption (such as RSA) in hours rather than millennia, though symmetric ciphers are far less affected. IBM, which has been a leader in quantum hardware development, has released quantum-resistant cryptography algorithms intended to protect data before cryptographically relevant quantum computers arrive.
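The asymmetry above can be made concrete with a toy back-of-envelope calculation (not from the article, and purely illustrative): Grover’s algorithm searches an unstructured space of 2^n keys in roughly 2^(n/2) quantum queries, which merely halves the effective bit strength of a symmetric cipher, while Shor’s algorithm breaks RSA and ECC outright in polynomial time.

```python
# Toy illustration: how Grover's algorithm affects symmetric-key strength.
# This is a sketch of the standard square-root speedup arithmetic, not a
# simulation of any quantum algorithm.

import math

def effective_bits_under_grover(key_bits: int) -> int:
    """Effective security of a symmetric key against Grover search.

    Grover needs about sqrt(2**key_bits) = 2**(key_bits / 2) queries,
    so the effective strength is roughly key_bits / 2.
    """
    return key_bits // 2

for bits in (128, 256):
    classical_trials = 2 ** bits              # classical brute-force worst case
    grover_queries = math.isqrt(classical_trials)  # ~sqrt(N) quantum queries
    print(f"AES-{bits}: ~2^{bits} classical trials, "
          f"~2^{effective_bits_under_grover(bits)} Grover queries")
```

This is why moving to larger symmetric keys (AES-256) is considered a workable hedge, while public-key schemes need wholly new, quantum-resistant replacements.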
In networking, we are talking about entangled quantum pairs, whose measurements remain correlated across any distance. Despite popular claims, entanglement cannot carry information faster than light, but it does enable quantum key distribution and tamper-evident links whose security does not rest on computational hardness. We are just beginning to explore the potential for this use of quantum technology, and while it could have huge implications for space exploration, military and transport systems (remote controlled systems), and telepresence (surgery and other areas where trustworthy, secure links matter), practical use is still over a decade away.
Applying quantum computing to large-scale data set analysis will change the nature of supercomputers. Viable quantum computers with enough power to do useful work, however, are still off in the future, with estimates for viability decades out. A recent change in how we look at quantum computers should bring that date closer, because hybrid designs better leverage existing computing technology and position quantum processors alongside, rather than above, other technologies.
The Recurring Technology Introduction Problem
Whenever new technology comes out, there is a tendency to overestimate what it is capable of and to suggest it will displace all that came before it. If that were always the case, we wouldn’t still have fax machines and traditional snail mail. They would have been replaced by email decades ago. The concept of quantum supremacy was derived from that same misconception: a belief that quantum computing would displace all other types of computing. Not only is that unlikely to happen soon (we still don’t have a viable, business-ready quantum computer), it may never happen.
The quantum processor is a tool, like the CPU, GPU, and NPU (neural processing unit), and although it may be able to perform tasks better than older technologies, using a quantum computer for everyday work would be like putting a jet engine in a car. In other words, it would be a waste of a technology that is better suited for extremely large data set analysis, certain types of extremely secure communications, and specific workloads for which older technologies aren’t as well suited.
That is the concept of quantum advantage: putting the technology where it will enhance, not replace, what came before.
Going back to the birth of client/server computing and the PC, we discovered that PCs made lousy terminals until the invention of the internet and the web browser. Now, smartphones are often the preferred tool, but they didn’t replace PCs so much as supplement them.
Today, mainframes work in parallel with servers. Each has advantages regarding particular workloads. Desktop PCs haven’t been fully displaced by laptops because they provide a less expensive way to get higher performance, and they are more difficult to steal, damage, or lose than their more mobile counterparts.
We make a recurring mistake when we assume that a new technology will fully displace what came before. Often we quickly discover that what came before may still be better (more focused, more energy efficient, more familiar) than the newer technology. It is usually wiser to supplement your aging solutions with a new technology, and learn it, than to displace them before the replacement is ready.
The computing industry appears to be realizing that prematurely positioning a new technology as more disruptive than it is likely to be only harms that technology and recalls the failed deployments that came before. Overpromising and underdelivering can push companies to buy and implement the technology too quickly and inefficiently, creating a negative impression of its capabilities and triggering the typical pull-back when it fails to meet overinflated expectations.
This burns early adopters as well as the firms that attempt to create solutions only to have them fail in the market.
Quantum supremacy is declared when a quantum computer completes a task that no traditional computer could finish in any feasible amount of time, and chasing it as an end in itself now seems foolish. Quantum advantage, which pairs quantum technology with CPUs, GPUs, and potentially NPUs, creates a blended solution that can handle today’s and tomorrow’s jobs.
Most of us will only access quantum computing through a cloud instance, because the cost of a quantum computer will be in the supercomputer range (and likely higher initially). Another limitation will be the time it takes to transfer massive data sets and the remote capacity needed to store them, which suggests that enterprises planning to use this technology will need to buy additional storage until networking and storage capabilities catch up. So although you can experiment using cloud resources, enterprises will likely need on-premises quantum computing for operational workloads in the near term once the technology becomes truly viable.
Think about using quantum technology the way you would use a hardware accelerator in an HPC implementation. Much like any processor type, it would be called on as needed when a job required it. It becomes a shared resource, which is a good thing given that it is likely to be extremely expensive at first due to its rarity. That scarcity makes it critical that only the jobs that can best make use of this technology be allowed to use it.
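The accelerator model above can be sketched in a few lines. This is a minimal, purely hypothetical routing sketch (the names and job categories are invented for illustration, not a real scheduler API): each job is dispatched to the processor type best suited to it, and the scarce QPU is reserved for quantum-suited work only.

```python
# Hypothetical sketch of routing jobs to processor types in a blended
# CPU/GPU/NPU/QPU system. Names and categories are illustrative only.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    kind: str  # e.g. "general", "graphics", "inference", "quantum"

def route(job: Job) -> str:
    """Pick a processor type; only quantum-suited work gets the scarce QPU."""
    table = {
        "general": "CPU",
        "graphics": "GPU",
        "inference": "NPU",
        "quantum": "QPU",  # huge data sets, chemistry, optimization, etc.
    }
    return table.get(job.kind, "CPU")  # default to the cheapest resource

jobs = [Job("payroll", "general"),
        Job("render", "graphics"),
        Job("molecule-sim", "quantum")]
print([(j.name, route(j)) for j in jobs])
```

The point of the design is the default branch: everything that does not demonstrably need the quantum processor falls back to conventional, cheaper hardware.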
As with other advanced technologies, we started out by over-hyping what quantum computing can do, implying that quantum computers would replace all prior forms of computing and that quantum supremacy would announce their obsolescence. The industry seems to be getting smarter, however, and is pivoting to the concept of quantum advantage: treating quantum computing as just another tool, focused on the most massive data sets, the most complex analytics, and the most secure communications.
The quantum advantage niche is also far more achievable because it focuses the technology where we know it will perform well and cedes other areas to existing technologies that are currently better and cheaper at performing related tasks.
This change should not only help speed quantum computers to market but also make them far better able to meet the revised performance bar when they finally arrive in a few years.