How will gate-based quantum computers scale up in the future?

Awareness of the importance of reducing errors in quantum computers has grown significantly. Individual physical qubits are notoriously vulnerable to decoherence from a variety of noise sources.

DQI Bureau


Many players are now competing to create a large-scale, fault-tolerant, gate-based quantum computer. In IDTechEx's latest report, "Quantum Computing Market 2024-2044: Technology, Trends, Players, Forecasts", it is predicted that this market will grow with a CAGR of at least 30% and is on track to become a multi-billion-dollar industry. But how will it get there?


Breaking the 1,000-qubit barrier

Building a quantum computer is hard! Building a large-scale, fault-tolerant, gate-based machine to solve a range of complex commercial problems is even harder. 2023 was an exciting year for quantum computing enthusiasts. Both IBM and Atom Computing announced breaking through the 1,000-qubit barrier, a significant milestone on the road to large-scale hardware realization. Moreover, public announcements of contract wins and even deliveries of on-premises hardware continued, with examples across almost all major qubit modalities.

Yet, for many, evidence is also mounting that the hype surrounding quantum computing may be past its peak. The number of start-ups emerging appears to have plateaued, and globally, VC reticence is reported to be rising. Despite a growing list of $100 million round closures in quantum (Photonic Inc., Oxford Quantum Circuits, and Quantinuum, to name just a few), far more capital overall is flowing toward artificial intelligence (AI) and biotech.

Worse still, the 'quantum talent crisis' is intensifying, with many players struggling to recruit the physicists, quantum engineers, chip designers, and computer scientists that they need to grow. Meanwhile, quantum annealing leader D-Wave is already ramping up its product-scale capabilities with logistics-focused customers, which could also draw end-user attention away from the gate-based community in the near term.


So, as pressure mounts on the gate-based quantum computing market to utilize its capital and demonstrate that it is on track toward commercial value creation, what are the crucial next steps for hardware developers?

The logical qubit era

Awareness of the importance of reducing errors in quantum computers has grown significantly. Individual physical qubits are notoriously vulnerable to decoherence from a variety of noise sources, from temperature fluctuations and electromagnetic radiation to crosstalk. Decoherence is catastrophic for quantum advantage: qubits no longer simultaneously represent 1 and 0 but, quite classically, 1 or 0.

One method of overcoming the impact of noise and decoherence is quantum error correction (QEC). In simple terms, this requires creating abstracted, error-free logical qubits from a collection of noisy physical qubits. By comparing the properties of the group, enough information about the noise can be extracted to correct it. It is analogous to playing a game of broken telephone enough times to decode the original message.
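The intuition of redundancy plus majority voting can be illustrated with a purely classical analogy: a three-bit repetition code. This is a simplified sketch only; real QEC cannot copy quantum states and instead relies on syndrome measurements, but the principle of extracting error information from a group of noisy carriers is the same. All names and the error rate below are illustrative assumptions, not from the original text.

```python
import random

def encode(bit):
    """Encode one logical bit as three physical copies (classical repetition code)."""
    return [bit, bit, bit]

def apply_noise(bits, flip_prob):
    """Independently flip each physical bit with probability flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    """Majority vote: recovers the logical bit if at most one copy flipped."""
    return 1 if sum(bits) >= 2 else 0

random.seed(0)
trials = 100_000
flip_prob = 0.05  # illustrative physical error rate

# Error rate without any encoding: a single bit flips with probability flip_prob
raw_errors = sum(random.random() < flip_prob for _ in range(trials))

# Error rate with encoding: logical bit is wrong only if two or more copies flip
logical_errors = sum(
    decode(apply_noise(encode(0), flip_prob)) != 0 for _ in range(trials)
)

print(f"physical error rate ~ {raw_errors / trials:.4f}")
print(f"logical error rate  ~ {logical_errors / trials:.4f}")
```

Because the encoded bit fails only when at least two of three copies flip, the logical error rate (roughly 3p² for small p) is far below the physical rate p, which is the essence of why grouping noisy carriers helps.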


The exact mathematical approaches to large-scale error correction remain a highly active area of research – particularly by the likes of experts at Riverlane. Yet, the conclusion is clear: the number of logical qubits per system is becoming a more important benchmark for quantum computer hardware's long-term potential for success.

Strikingly, it is apparent that the required ratio of physical to logical qubits varies dramatically between qubit modalities. Evidence suggests that for photonic designs it could be as low as 2:1, for neutral atom and trapped ion nearer 10:1, while superconducting could require more than 1000:1. To some extent, this has temporarily leveled the playing field in the quantum computing market, seeing challengers such as QuEra catch up with, if not overtake, giants like IBM and Google in the race for high numbers of logical qubits.
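The practical consequence of these overhead ratios can be seen with some simple arithmetic. The ratios below are the illustrative figures quoted above, not vendor specifications, and the 100-logical-qubit target is a hypothetical assumption:

```python
# Illustrative physical-to-logical overhead ratios quoted in the text
overhead = {
    "photonic": 2,           # claimed as low as 2:1
    "neutral atom": 10,      # nearer 10:1
    "trapped ion": 10,       # nearer 10:1
    "superconducting": 1000, # could exceed 1000:1
}

target_logical = 100  # hypothetical logical-qubit target

for modality, ratio in overhead.items():
    physical = target_logical * ratio
    print(f"{modality:>15}: {physical:>7,} physical qubits needed")
```

At these ratios, the same 100-logical-qubit machine would demand anywhere from a few hundred to over a hundred thousand physical qubits, which is why the physical-to-logical ratio has become such a decisive benchmark between modalities.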

Overall, the need to now transition into a 'logical era' is clear. This is well evidenced by the focus on this benchmark in the latest roadmaps of multiple players across the industry. Yet, unfortunately, solely optimizing system design toward reducing errors will not be enough to secure long-term success. For this, the impact on overall size and power consumption must also be considered.


Reducing the infrastructure burden

Overcoming the infrastructure limitations associated with scaling quantum computer hardware is no easy task. Almost all systems today require cooling, whether using cryostats or lasers, and it is often the cooling system that is most demanding on space. However, as logical qubit counts grow, the space within each cooling system to house them is running out.

As a result, many hardware roadmaps today show a modular approach with multiple systems connected. On the one hand, quantum computing is designed for high-value problems to be solved over the cloud, so requiring a large footprint within a data center is not necessarily a huge barrier to adoption. However, in some instances, the associated power demand of this approach for a large-scale, fault-tolerant machine is calculated to be in the megawatts, enough to warrant its own small modular reactor. To truly follow the trend of classical computing from vacuum tube to smartphone, it is time to start making components smaller before capabilities can get bigger.

One key aspect impacting infrastructure demand is qubit density, determined by the physical size of qubits. Some modalities claim a significant advantage in this area over others. For example, it is currently estimated that superconducting and photonic designs could integrate thousands of qubits per chip, trapped-ion tens of thousands, and silicon-spin billions. This is partly limited by the dimensions of the quantum state utilized, as well as by the manufacturing methods available to produce them.


The size advantage offered by silicon-spin qubits is largely a result of leveraging the highly optimized techniques already adopted by the semiconductor industry for transistor and CMOS manufacture. Notably, Microsoft is working toward hardware-protected Majorana qubits, microns in scale, specifically stating the advantage of enabling a 'single module machine of practical size'. That being said, given the impact of crosstalk and other noise sources, how the required spacing between qubits will change at scale remains uncertain across all modalities.

Furthermore, it cannot be overlooked that, as well as the qubits themselves, the most space is often needed for manipulation and readout systems. For example, moving from hundreds to thousands of qubits can lead to infeasible requirements for microwave cabling, interconnects, lasers, and more. As a result, many players are now also developing more optimized approaches to scalable manipulation and control.

SEEQC has created a digital, on-chip alternative to analog control for superconducting qubits, which is now of growing interest to other modalities in the ecosystem. Similarly, Oxford Ionics recently patented an 'electronic qubit control', an on-chip interface for trapped-ion modalities. In fact, overcoming 'the wiring challenge' is an almost ubiquitous focus of research among start-ups and established players alike. Looking ahead, remaining agile across the quantum stack will offer an advantage over vertical integration in this regard.


Societal and market outlook

In this increasingly competitive industry, the coming years will illuminate which strategies hold the greatest promise for securing a lasting quantum commercial advantage. The task will be an uphill balancing act between reducing errors and scaling up logical qubit numbers, while also optimizing for resource efficiency, and this is without even considering gate speed, algorithm development, and many other crucial factors. The enormity of the task will likely see many players fail to survive to the end of the decade.

Yet, with market consolidation and a convergence of talent, increased clarity should emerge as to where and when quantum advantage could first be offered, serving only to increase end-user confidence and engagement. Despite headwinds, the world-changing potential of quantum computers within finance, healthcare, sustainability, and security will remain a tantalizing enough carrot for not only individual companies but entire nations to chase.

With so many competing quantum computing technologies across a fragmented landscape, determining which approaches are likely to dominate is essential in identifying opportunities within this exciting industry. 

-- IDTechEx, USA.
