February 22, 2024

Quantum computing is coming sooner than you think

It seems that for every proponent of quantum computing there is also an opponent. Opponents often dismiss quantum computing as "a science project," "hype," "a hoax," or even a "failed case." The history of the technology industry is littered with technologies that failed for technical or business reasons, so there is reason to be skeptical. But just as many technologies have charted the future direction of innovation through the advances they made possible, and some faced a similar level of skepticism, if not more, including being derided as "a science project." Artificial intelligence (AI) is one of them: AI was theorized long before the first silicon transistor, but only became a reality in the past decade thanks to advances in silicon technology, processing architectures, and deep learning techniques. Likewise, quantum computing is now real and poised for a breakthrough in the next decade.

Quantum computers are not simple

Even describing the concept of quantum computing is not easy. Classical computers use bits that represent either a one (on state) or a zero (off state), while quantum computers use qubits, which can occupy a combination of states through superposition and can be linked to other qubits through entanglement. The result is a machine whose computing capacity scales exponentially with the number of qubits. While this makes quantum computers ideally suited to large mathematical models, they are not suitable for the simple overhead tasks of everyday computing. Quantum computing is therefore better positioned as a new accelerator technology, similar to a graphics processing unit (GPU), digital signal processor (DSP), or field-programmable gate array (FPGA), but on a much larger scale in terms of computing performance. However, quantum computers require specialized control logic and memory because of the unique architecture on which they are based. Large cooling units are also required, because many qubit technologies operate near absolute zero: 0 kelvin, or -273.15 degrees Celsius.
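To make that exponential scaling concrete, here is a minimal statevector sketch (a toy simulation using NumPy, not tied to any real quantum hardware or SDK): a register of n qubits requires 2**n complex amplitudes to describe, and applying a Hadamard gate to every qubit puts it into an equal superposition of all 2**n basis states at once.

```python
import numpy as np

def n_qubit_zero_state(n):
    """Statevector of n qubits all in |0>: a vector of 2**n complex amplitudes."""
    state = np.zeros(2 ** n, dtype=complex)
    state[0] = 1.0
    return state

def apply_hadamard_to_all(state):
    """Apply a Hadamard gate to every qubit, creating an equal superposition."""
    n = int(np.log2(state.size))
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    U = H
    for _ in range(n - 1):
        U = np.kron(U, H)  # tensor product builds the full 2**n x 2**n operator
    return U @ state

state = apply_hadamard_to_all(n_qubit_zero_state(3))
# 3 qubits -> 8 amplitudes, each basis state measured with probability 1/8
```

The point of the sketch is the memory cost: simulating just 50 qubits this way would need 2**50 amplitudes, which is why classical machines cannot keep up.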

Quantum computing also faces two major challenges: accuracy and scalability. Errors are introduced both by the instability of individual qubits and by interference between qubits. Keeping a qubit stable in a superposition state is difficult; its coherence time can be limited to microseconds or, at best, milliseconds. Qubits can also interfere with adjacent qubits. As a result, error suppression, mitigation, and correction techniques are being developed to work both individually and together to increase computational accuracy. Error suppression uses front-end processing, based on knowledge of the system and circuitry, to compensate for likely errors, for example by shaping the pulses that control the qubits. Error mitigation corrects errors in post-processing based on a noise model. Error correction, on the other hand, requires many additional qubits to correct errors during execution. While error correction can be the most effective way to eliminate errors, it comes at a significant cost in qubit overhead. Yet even with errors only partially tamed, quantum computing already enables processing at a level that cannot be easily matched even on the largest classical supercomputers.
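As a loose illustration of the redundancy idea behind error correction, here is a classical three-bit repetition code with majority-vote decoding (a simplified classical analogue, not an actual quantum code, and the 5% error rate is an arbitrary assumption for the demo): spending extra bits on redundancy drives the logical error rate well below the physical one.

```python
import random

def encode(bit):
    # Repetition code: protect one logical bit with three physical copies.
    return [bit, bit, bit]

def noisy_channel(bits, flip_prob):
    # Each physical bit flips independently with probability flip_prob.
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit if at most one copy flipped.
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
trials, p = 10_000, 0.05
raw_errors = sum(noisy_channel([1], p)[0] != 1 for _ in range(trials))
coded_errors = sum(decode(noisy_channel(encode(1), p)) != 1 for _ in range(trials))
# The encoded error rate (~3*p**2) is far below the raw rate (~p).
```

Quantum error-correcting codes work on the same principle but must protect superpositions without measuring them directly, which is why they need so many extra physical qubits per logical qubit.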

Scaling up quantum computers is also a major challenge. Several qubit technologies exist, but many do not use standard CMOS manufacturing processes, meaning they cannot ride the advanced semiconductor processes used for other high-end processors and accelerators. Furthermore, the entire system must scale with the number of qubits, which means more wiring connecting each individual qubit to the control logic, along with the associated cooling hardware. Outside its refrigerator, today's quantum computer looks more like a tangle of tubes and wires than a silicon-based system. Scaling such systems is no easy task.

Rapid progress in quantum computing

If quantum computing is so full of challenges, the logical question is: why do I think we are on the cusp of major advances? One reason is the level of investment in quantum computing. The benefits of a single computer that can outperform many supercomputers are so valuable that the scientific community, the technology industry, governments, and corporations are investing billions in its development and use. This includes industry leaders such as Alibaba, Amazon, IBM, Intel, Google, Honeywell, Microsoft, Nvidia, and Toshiba, among many others. Likewise, the US government has a National Quantum Initiative to "accelerate quantum research and development for the economic and national security of the United States." A vivid example of this investment is the IBM Quantum Data Center in Poughkeepsie, New York, which I had the opportunity to tour earlier this year.

Another reason is the steady stream of advances in quantum chips, control logic, systems, and software, including the development tools for suppressing, mitigating, and correcting errors. IBM, for example, has led the way in quantum scaling with the 433-qubit Osprey processor introduced in 2022, followed by the 1,121-qubit Condor processor in late 2023. On IBM's quantum processor roadmap, the number of qubits increases by about 2-3x every year. IBM is also linking quantum computers together to further increase qubit capacity and has indicated a goal of 100,000-qubit systems by 2033. Industry and academia are already working on practical applications with current quantum computers, and this work will accelerate as qubit capacity grows in the second half of this decade.
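A rough back-of-the-envelope check on that roadmap (the growth factors below are illustrative assumptions, not IBM figures) shows why the 2033 goal is plausible: even a pace well below the roadmap's 2-3x per year gets from Condor's 1,121 qubits to 100,000 within a decade.

```python
import math

# Years needed to grow from the 1,121-qubit Condor (2023) to a
# 100,000-qubit system at a given annual growth factor.
def years_to_target(start, target, annual_factor):
    return math.ceil(math.log(target / start) / math.log(annual_factor))

start, target = 1_121, 100_000
year_at_2x = 2023 + years_to_target(start, target, 2.0)    # doubling every year
year_at_1_6x = 2023 + years_to_target(start, target, 1.6)  # a more modest pace
# Doubling annually reaches the goal around 2030; ~1.6x per year gets there by 2033.
```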

The final reason, and the one I believe will be critical to the next step in quantum computing, is artificial intelligence. Until now, the focus has been on integrating classical computers with quantum computers. However, AI has the potential both to improve the capabilities and performance of quantum computers and to be improved by them, and work in this area has only just begun.

Quantum timeline

When and how will quantum computing become available for practical applications? With thousands of universities, research organizations, and enterprises already learning and experimenting with quantum computing, the answer is: now, for some limited applications. In work published in the scientific journal Nature, IBM and the University of California, Berkeley demonstrated that a quantum computer with just 127 qubits could outperform classical computers in materials modeling. However, IBM believes the 100,000-qubit level will mark a turning point for the industry. With quantum systems now being networked together, that threshold is approaching quickly.

How the quantum computer industry will take shape is a little easier to predict. Given the heavy investment required in supporting systems and infrastructure, quantum computing will likely be delivered as a cloud service by the leading hyperscalers and/or technology providers for the vast majority of the market, at least for the foreseeable future. There will be some university and corporate installations, but these will likely be few and far between.


The Quantum Age

Given the amount of investment, development, and activity in quantum computing, the industry is poised for a dynamic change similar to the one AI brought: better performance, functionality, and intelligence. It also brings some of the same challenges as AI, such as security, as described in the recent quantum-safe cryptography article. But just like AI, quantum computing is coming. You could say quantum computing is where AI was in 2015: fascinating, but not widely used. Just five years later, AI was integrated into almost every platform and application. In another five years, quantum computing could take computing, and humanity, to a new level of knowledge and understanding.
