In 1981, computing giant IBM collaborated with the Massachusetts Institute of Technology’s Laboratory for Computer Science for a conference, “The Physics of Computation.” At that event, Nobel laureate Richard Feynman called for the creation of quantum computers to simulate quantum physics, famously saying, “Nature isn’t classical, and if you want to make a simulation of nature, you’d better make it quantum mechanical.”
Last year, IBM Senior Vice President and Director of Research Dr. Dario Gil said, “The big bang of quantum computing will happen in this decade.” Researchers in government and industry continue to work toward quantum advantage — an as-yet-unachieved stage where a quantum computer surpasses the performance of a conventional computer to accomplish a practical task. This year, IBM and the University of California, Berkeley proved “quantum utility” — a point at which quantum computers could serve as scientific tools to explore a new scale of problems that classical systems may never be able to solve — when they demonstrated that a quantum system with a 127-qubit processor could turn out accurate simulations of the dynamics of a system of interacting spins (known as an Ising model). With the help of advanced error mitigation techniques, the quantum computer accurately predicted properties such as a material’s magnetization, even as classical computing methods running on a supercomputer eventually faltered.
In August, technology market analyst firm International Data Corporation (IDC) forecast spending on quantum computing to grow to $7.6 billion by 2027, up from $412 million in 2020. That is a compound annual growth rate of more than 50% from 2021 to 2027.
Heather West, Ph.D., research manager for quantum computing at IDC, noted that quantum computing is expected to be an industry disruptor that has the potential to lead to a competitive advantage. As a result, companies in all industries are experimenting with quantum technology now to identify use cases and develop quantum algorithms to prepare for quantum advantage. Such companies include the Cleveland Clinic in healthcare, JPMorgan Chase & Co. in finance, Ford Motor Company in the automotive industry, The Boeing Company in aerospace, BASF in chemicals, Merck & Co., Inc., in pharmaceuticals, and The MITRE Corporation supporting national security applications. This activity, in turn, drives even more interest and investment in quantum technology.
“Because quantum computing is a complex technology, which differs significantly from classical compute technology, there is a steep learning curve. Companies interested in staying competitive using quantum should consider experimenting today,” West said.
A Synthetic Window Into the Natural World
“The list of quantum computing use cases is extensive,” West said. “Using quantum computers, scientists and engineers will have the compute power to simulate natural processes, which could lead to more efficient and eco-friendly batteries, or catalysts to combat climate change.”
“What quantum computers have potential to do is help us better understand nature by accurately modeling natural systems, and thus the very fundamentals of how the world itself works,” said Youngseok Kim, research staff member at IBM Quantum.
Kim added that quantum technology can enable these advances because it uses superposition, entanglement, and interference, fundamental properties of nature. In traditional computing, the basic unit is the bit, which can be either 0 or 1 but nothing in between. In quantum computing, the basic unit is the quantum bit, or qubit. Until it is measured, a qubit can exist in a superposition, a weighted combination of 0 and 1 at once, rather than being limited to one value or the other.
At the quantum level, atoms or other particles exist in multiple states and interact through interference. Entanglement, which Albert Einstein referred to as “spooky action at a distance,” makes it possible to transmit quantum information, like the state of a qubit, over great distances.
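The superposition and entanglement described above can be illustrated with a few lines of plain Python. This is a toy amplitude calculation, not a quantum library: a qubit’s state is a pair of complex amplitudes, and measurement probabilities are the squared magnitudes of those amplitudes.

```python
import math

# A single qubit in equal superposition: amplitudes for outcomes 0 and 1.
alpha = complex(1 / math.sqrt(2), 0)
beta = complex(1 / math.sqrt(2), 0)

p0 = abs(alpha) ** 2  # probability of measuring 0
p1 = abs(beta) ** 2   # probability of measuring 1
assert math.isclose(p0 + p1, 1.0)  # amplitudes must be normalized

# A two-qubit Bell state: amplitudes for outcomes 00, 01, 10, 11.
# Only 00 and 11 carry weight, so the two qubits' measurement results
# always agree -- the correlation Einstein called "spooky action at a
# distance."
bell = {"00": 1 / math.sqrt(2), "01": 0.0, "10": 0.0, "11": 1 / math.sqrt(2)}
probs = {k: abs(v) ** 2 for k, v in bell.items()}
assert math.isclose(probs["00"], 0.5) and math.isclose(probs["11"], 0.5)
assert probs["01"] == 0.0 and probs["10"] == 0.0
```

Real quantum hardware manipulates these amplitudes physically; the arithmetic above only shows why measuring one half of an entangled pair tells you the state of the other.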
Moving Beyond Traditional Computing
As quantum hardware improves, it has the potential to solve problems a conventional computer cannot. Encryption, for instance, depends on the fact that multiplying two numbers is easy, but going the other way — factoring a number into two components — is not. This difficulty grows exponentially greater for a traditional computer as the numbers involved grow larger. For a quantum computer running an algorithm such as Shor’s, though, the workload grows only polynomially. Consequently, while factoring a large number is effectively impossible for a conventional computer, that may not be the case for future quantum technology.
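The asymmetry can be made concrete with a short sketch: multiplying two numbers is a single cheap step, while the naive classical attack, trial division, must test divisors up to the square root of the product. The numbers here are tiny and purely illustrative; real encryption moduli run to hundreds of digits.

```python
def trial_factor(n):
    """Return the smallest nontrivial factor of n by trial division,
    or n itself if n is prime. Work grows with sqrt(n), which explodes
    as n gains digits."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

p, q = 1009, 1013            # multiplying: essentially instantaneous
n = p * q                    # 1,022,117
assert trial_factor(n) == p  # factoring: over a thousand divisions already
```

Doubling the number of digits in n roughly squares the work for this approach, which is why classical factoring becomes hopeless at cryptographic sizes.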
RSA cryptography, one of the oldest and most widespread cryptosystems, could be broken using large-scale quantum computers. So could elliptic-curve cryptography (ECC), an approach often used in digital signatures. Quantum algorithms have been discovered that, unlike any known method for conventional computers, can efficiently reverse the mathematical operations at the heart of both systems.
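For readers curious about the mathematics at stake, here is a toy RSA round trip with deliberately tiny, insecure parameters. The private exponent can only be computed by someone who knows the factors p and q of the public modulus, which is precisely what a large-scale quantum computer could recover.

```python
# Toy RSA: tiny textbook parameters, for illustration only.
p, q = 61, 53
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # requires knowing p and q
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e

msg = 42
cipher = pow(msg, e, n)          # encrypt: msg^e mod n
assert pow(cipher, d, n) == msg  # decrypt: cipher^d mod n recovers msg
```

Everything public here is n and e; the security of the scheme collapses the moment an attacker can factor n, because phi and therefore d follow immediately.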
West stressed that quantum computing is still a nascent technology. Hardware developers, she noted, are still working out how to build and scale high-quality qubits that can perform calculations and solve complex problems faster, more cost-efficiently, and more accurately than a classical computer. It also remains unknown which quantum modalities will best suit which use cases, when quantum advantage will be achieved, and in which industry it will arrive first.
The number of qubits in today’s devices is small compared with the billions of bits in a conventional computer. IBM’s largest quantum system to date is made up of 433 qubits, while other contenders have systems with even fewer. IBM has plans to scale the number of qubits, first to 1,000 and then perhaps to as many as 100,000 by 2030. Other quantum computing contenders are also promising to build systems with more high-quality qubits that perform at high fidelity.
However, quantum systems are inherently noisy and much more error-prone than conventional circuitry. Qubits must be shielded from stray electromagnetic fields and vibrations to perform well. Even so, the historical consensus has been that a million qubits might be needed for a useful calculation, because some of the quantum circuitry must be devoted to checking and correcting the work of the rest of the system. While this error-correcting overhead can be quite large, the number of qubits needed may turn out to be far less than a million.
In November 2022, IBM unveiled its 433-qubit Osprey quantum processor, accessible as an “exploratory technical demonstration” through the company’s cloud. That marked a major improvement over the 27 qubits of the Falcon processor IBM introduced in 2019, and more than triple the 127 qubits of the IBM Eagle processor, unveiled in 2021. The company plans to build a quantum processor with more than 4,000 qubits by 2025.
One reason IBM’s 127-qubit Eagle processor achieved the published “utility-scale” results is that the latest version of qubits performs better than previous ones, Kim said. A second reason is that researchers devised a way to overcome the effects of today’s imperfect qubits.
“Error mitigation allows us to apply certain methods to the results we obtain from a ‘noisy’ quantum computer, and thus obtain accurate calculations before our industry reaches a state of full error correction,” Kim explained.
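Kim’s description can be sketched in code. The snippet below illustrates one well-known mitigation idea, zero-noise extrapolation, with an invented linear noise model standing in for real hardware: the same observable is measured at deliberately amplified noise levels, then a fit is extrapolated back to the zero-noise limit.

```python
def noisy_expectation(noise_scale, ideal=1.0, decay=0.15):
    # Stand-in for a real device: the measured expectation value decays
    # as noise is amplified. (Invented model, for illustration only.)
    return ideal * (1 - decay * noise_scale)

scales = [1.0, 1.5, 2.0]   # stretch the hardware noise by known factors
values = [noisy_expectation(s) for s in scales]

# Linear least-squares fit y = a*x + b; the intercept b is the
# zero-noise estimate of the observable.
m = len(scales)
sx, sy = sum(scales), sum(values)
sxx = sum(x * x for x in scales)
sxy = sum(x * y for x, y in zip(scales, values))
a = (m * sxy - sx * sy) / (m * sxx - sx * sx)
b = (sy - a * sx) / m

assert abs(b - 1.0) < 1e-9  # extrapolation recovers the ideal value
```

In practice the noise response is rarely this clean, and production techniques combine extrapolation with probabilistic methods, but the core idea is the same: trade extra measurements for a better estimate of the noiseless answer.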
By leveraging the ability to maintain long-range connections within a quantum computer, quantum computing startup PsiQuantum announced a 700-fold reduction in the computational resource requirements for breaking ECC keys relative to state-of-the-art quantum algorithms.
Innovators Take Diverse Approaches
In working toward a useful quantum computer, companies and researchers are pursuing diverse ways to implement the technology. PsiQuantum, for instance, is building a large-scale fault-tolerant quantum computer using photons. This enables the company to leverage the billions of dollars that have been invested in the mature semiconductor industry and has considerable advantages in scaling up, said Peter Shadbolt, the company’s co-founder and chief scientific officer. IBM and others use superconductors sitting atop a silicon substrate. Firms and researchers are also using trapped ions, investigating the spin of electrons in carbon nanotubes, and researching other particles to serve as the basis for qubits.
Today’s qubit approaches use superconducting materials in either the qubits or the detectors, which can require chilling systems to near absolute zero. However, it might be possible to build non-superconducting qubits that operate at higher temperatures. Photons and trapped ions, for example, might lead to useful room-temperature qubits.
Even under the current circumstances, quantum technology is no longer confined to a lab. IBM collaborates with companies around the world on how to use quantum computing. PsiQuantum is making devices on thousands of silicon wafers in a Tier 1 high-volume semiconductor fabrication facility as part of its effort to build a quantum computer, Shadbolt said.
A broadly useful quantum computer may still be years off. Quantum computers are likely to be expensive machines with shared time doled out to academic and industrial researchers. Due to the fragile nature of qubits and the need to cool parts of the systems to hundreds of degrees below the freezing point of water, many quantum computers will be housed in large facilities with extensive support staff and equipment, not unlike the large servers used by cloud computing providers.
Finally, although quantum computers are good at solving some problems, they are not the best tool for solving every problem. Optimization, for instance, is an area where conventional computers often produce good enough results through finely tuned algorithms.
Quantum technology will have areas where it shines and other applications where the traditional approach, found in smartphones, tablets, laptops, and supercomputers, will still prevail. Thus, the two technologies will complement each other.
As Kim said, “It’s important to note that quantum computing will not replace classical computing.”