DisCERNing Quantum – And not as some Shiny-Pink Uni-saurus

Amidst all the Grey Swans, White Rhinos and Pink Flamingo events of the technology forests lies the magical abode of Quantum. But it’s not magical because it’s all pixie dust and a fairy cauldron; it’s magical because it’s slow, patient and powerful in the right hands. Let’s see what’s happening inside this future hat – without those rabbits.

Pratima H

Noise control, fault tolerance, error correction, superconducting circuits, trapped ions, photonic systems, hardware stability, hardware scalability, algorithmic maturity, strong-enough qubits – everything matters when it comes to the difference between reality and disillusionment over the Quantum Advantage. Sofia Vallecorsa, Coordinator of the CERN Quantum Technology Initiative and a distinguished AI and quantum computing researcher at CERN, dispels some myths and offers many reality checks to help us see quantum technology through a real lens. Simulations and experiments in lattice gauge theory and particle physics are just some glimpses of the future-forward work happening in CERN’s brains and labs. But what really stands out is the focus on pragmatic progress, collaboration between BigTech giants and labs, and an ecosystem mindset. And knowing that big things come to those who wait – patiently, and with the right approach and expectations. And not waiting for the next ‘Shor’.

Have Classical and Quantum computers come any closer in the last four or five years? Why or why not?

In the past four or five years, classical and quantum computing have moved closer through the rise of hybrid models. Rather than replacing classical systems, quantum computers are increasingly seen as complementary resources, acting as accelerators for specific tasks within broader classical workflows. This approach reflects two realities: current quantum devices are still noisy and far from fault-tolerant, and practical usability demands integration rather than isolation.

Hybrid strategies leverage algorithms such as Variational Quantum Algorithms (VQAs), which combine quantum subroutines with classical optimisation loops. Infrastructure providers, including major cloud platforms and HPC centres, have embraced this model, offering quantum access through integrated services that enable experimentation without requiring dedicated quantum hardware.
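
To make the hybrid pattern concrete, here is a minimal sketch of such a variational loop in Python, assuming a toy single-qubit ansatz: the ‘quantum’ expectation value is simulated with plain NumPy, and a classical parameter-shift gradient step plays the role of the outer optimiser. Everything in the snippet is illustrative rather than CERN code; in a real workflow the expectation() call would be dispatched to a quantum device or cloud service instead of being evaluated locally.

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)   # observable to minimise: <Z>

def expectation(theta: float) -> float:
    """The 'quantum' step: simulate Ry(theta)|0> and return <psi|Z|psi>."""
    ry = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                   [np.sin(theta / 2),  np.cos(theta / 2)]], dtype=complex)
    psi = ry @ np.array([1, 0], dtype=complex)
    return float(np.real(psi.conj() @ Z @ psi))

theta, lr = 0.1, 0.2                              # initial parameter, learning rate
for step in range(50):
    # Parameter-shift rule: exact gradient from two extra circuit evaluations.
    grad = 0.5 * (expectation(theta + np.pi / 2) - expectation(theta - np.pi / 2))
    theta -= lr * grad                            # classical optimiser update
print(f"theta = {theta:.3f}, <Z> = {expectation(theta):.3f}")  # drifts towards <Z> = -1
```

The point of the pattern is the division of labour: the quantum resource only evaluates parameterised circuits, while all bookkeeping and optimisation stay classical.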

How is this part evolving in the application space?

Applications are emerging in domains where quantum methods offer potential advantages: Monte Carlo sampling for finance, combinatorial optimisation in logistics, and molecular simulations in chemistry. Simulations in the area of fundamental physics are also of great interest. However, these gains remain problem-specific and incremental. Scalability, error rates, and algorithmic maturity still limit widespread adoption, meaning classical systems continue to dominate most workloads.

In short, progress has been substantial in creating usable, integrated environments, but quantum advantage remains narrow. The next step will depend on breakthroughs in error correction and hardware scaling—until then, hybrid computing is the most practical path forward.

How crucial is the under-the-hood part for progress in quantum computing – things like qubits (logical vs. physical) and quantum chips?

If by ‘under-the-hood’ aspects of quantum computing we mean hardware architectures, qubit technologies, and supporting components (readout chains, I/O infrastructure, etc.), then they are absolutely crucial for progress. Quantum algorithms and software often capture attention, but their practical impact depends on the physical layer’s ability to scale, reduce errors, and maintain coherence. In addition, the access models to quantum resources also have a big impact on the ability to develop new applications.

Concerning the distinction between physical qubits and logical qubits: physical qubits are the raw hardware units, while logical qubits emerge through error correction, requiring hundreds or thousands of physical qubits per logical one. Achieving fault-tolerant logical qubits is a major bottleneck, and advances in error correction codes and hardware stability directly influence algorithmic feasibility.
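
To put rough numbers on that overhead, the back-of-the-envelope sketch below assumes a surface code whose logical error rate follows the commonly quoted scaling A·(p/p_th)^((d+1)/2) and budgets roughly 2d² physical qubits per logical qubit. The threshold, prefactor and target error rate are illustrative assumptions, not figures from the interview.

```python
# Illustrative surface-code overhead estimate; the scaling law and constants
# (threshold p_th = 1e-2, prefactor A = 0.1, ~2*d**2 qubits per logical qubit)
# are commonly quoted rules of thumb, not data from this article.
def physical_qubits_per_logical(p_phys, p_logical_target,
                                p_threshold=1e-2, A=0.1):
    """Return (code distance d, approximate physical qubits per logical qubit)."""
    d = 3
    # Assumed scaling: p_logical ~ A * (p_phys / p_threshold) ** ((d + 1) / 2)
    while A * (p_phys / p_threshold) ** ((d + 1) / 2) > p_logical_target:
        d += 2                       # surface-code distances are odd
    return d, 2 * d * d              # data qubits plus measurement qubits, roughly

for p in (1e-3, 1e-4):
    d, n = physical_qubits_per_logical(p, p_logical_target=1e-12)
    print(f"physical error rate {p:.0e}: distance {d}, ~{n} physical qubits per logical qubit")
```

With these illustrative inputs, the estimate lands in the hundreds of physical qubits per logical qubit, consistent with the range quoted above.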

Any observations on the developments around memristors, trapped ions, photonics, etc.?

The existence of different hardware platforms (superconducting circuits, trapped ions, photonic systems, spin qubits, and emerging memristor-based designs) is beneficial to the design of new algorithms. Different technologies offer different trade-offs in scalability, gate fidelity, and connectivity. For example, superconducting qubits dominate current prototypes due to mature fabrication, while trapped ions excel in coherence but face engineering challenges for large-scale integration. Photonic approaches promise room-temperature operation and easier networking, and memristors could enable neuromorphic-like architectures for quantum-inspired computing. Overall, the future might not see a single technology prevailing over all others, since different applications will likely benefit from a range of properties.

However, one point is important to keep in mind for all of them: without breakthroughs in qubit quality, interconnects, and control electronics, software innovations alone cannot deliver practical gains in real-world applications.

We still don’t have something comparable to the Shor or Grover algorithms, many experts worry – would you agree? Is QAOA good?

It’s true that we haven’t seen a new algorithm with the same theoretical impact as Shor’s or Grover’s, and this is a concern for some experts. These algorithms demonstrated a theoretical quantum advantage, but realising it requires fault-tolerant hardware, which remains a long-term goal. In the near term, research focuses on variational and heuristic algorithms, such as the Quantum Approximate Optimization Algorithm (QAOA) and the Variational Quantum Eigensolver (VQE).

Is QAOA ‘good’? It’s promising for certain combinatorial optimisation problems, but its performance is highly problem-dependent and often comparable to advanced classical heuristics. QAOA is not a universal breakthrough; rather, it’s a practical tool for exploring hybrid quantum-classical workflows on Noisy Intermediate-Scale Quantum (NISQ) devices.
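
As a concrete picture of such a hybrid workflow, here is a self-contained depth-1 QAOA sketch for MaxCut on a four-node ring, simulated with dense NumPy linear algebra and a crude grid search in place of a real classical optimiser. The graph, grid resolution and parameter ranges are illustrative assumptions; the snippet only demonstrates the structure (cost layer, mixer layer, classical outer loop), not a performance claim.

```python
# Toy depth-1 QAOA for MaxCut on a 4-node ring, simulated with dense NumPy.
import numpy as np
from functools import reduce
from itertools import product

n = 4
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]          # ring graph, maximum cut = 4

# Diagonal cost operator: the cut value of every computational-basis bitstring.
cost = np.array([sum((b >> i & 1) != (b >> j & 1) for i, j in edges)
                 for b in range(2 ** n)], dtype=float)

X = np.array([[0, 1], [1, 0]], dtype=complex)
I = np.eye(2, dtype=complex)
plus = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)   # |+...+> start state

def mixer(beta):
    """exp(-i*beta*X) applied to every qubit (tensor product of Rx rotations)."""
    rx = np.cos(beta) * I - 1j * np.sin(beta) * X
    return reduce(np.kron, [rx] * n)

def qaoa_expectation(gamma, beta):
    psi = np.exp(-1j * gamma * cost) * plus       # phase-separation layer e^{-i*gamma*C}
    psi = mixer(beta) @ psi                       # mixing layer
    return float(np.abs(psi) ** 2 @ cost)         # expected cut value

# Crude classical outer loop: grid search over (gamma, beta).
grid = np.linspace(0, np.pi, 40)
best = max((qaoa_expectation(g, b), g, b) for g, b in product(grid, grid))
print(f"best expected cut at depth 1 ~ {best[0]:.2f} (true optimum is 4)")
```

Even this toy instance illustrates the problem-dependence mentioned above: the depth-1 expectation falls short of the true optimum of 4, and a classical heuristic would solve the same instance trivially.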

At CERN, through the Quantum Technology Initiative (QTI), we take a pragmatic approach: instead of waiting for the next ‘Shor’, we investigate algorithms that can address real scientific challenges today. This includes quantum-inspired methods, hybrid strategies for simulation and optimisation, quantum machine learning and benchmarking to understand where quantum can truly add value. The absence of a new landmark algorithm reflects the complexity of the field, not stagnation. Future breakthroughs will likely come from co-design efforts, where hardware constraints and domain-specific problems shape algorithm development.

What’s stopping Quantum from rising and shining at the scale and brilliance the hype pegs it for? How serious are issues like quantum dissipation (and can dissipation engineering be a good idea?), error correction, qubit decoherence and hardware fragility?

One of the biggest obstacles to quantum computing’s ‘rise and shine’ is the gap between hype and reality. Over the past decade, quantum technologies have been portrayed as a near-magical solution to every computational challenge. This narrative creates unrealistic expectations—both in the public and among investors—which can lead to disillusionment when progress is slower than promised.

Quantum computing is fundamentally hard: scaling qubits, achieving fault tolerance, and reducing error rates are challenges that require years of sustained research. When hype overshadows these realities, resources may be misallocated, and pressure mounts to demonstrate short-term breakthroughs rather than long-term foundational progress.

Another issue is the tendency to apply quantum computing indiscriminately. Not every problem benefits from quantum methods; many tasks remain better suited to classical or hybrid approaches. Blindly forcing quantum solutions where they don’t fit wastes effort and risks undermining credibility.

Ultimately, quantum computing will deliver transformative impact—but only in specific domains and only when hardware and algorithms mature. Overhyping the technology can slow this journey by eroding trust and diverting attention from realistic, incremental milestones. The path forward requires balanced communication: celebrating progress without promising miracles.

Tell us anything you can share about the work happening at CERN. Why is it significant? For instance: QTI, the Worldwide LHC Computing Grid, the use of quantum algorithms for lattice simulations, quantum optimisation for grid computing, the Open Quantum Institute, supersymmetry searches, anything?

CERN is deeply engaged in quantum research through the Quantum Technology Initiative (QTI). Launched in 2020, it plays a dual role: advancing quantum technologies for scientific discovery and exploring how CERN’s own technological expertise, developed for particle physics, can accelerate quantum innovation. This includes leveraging CERN’s strengths in superconductivity, cryogenics, control systems, and scientific computing to support quantum hardware and software development.

A major research focus is on quantum algorithms for theoretical simulations, such as lattice gauge theories, which are central to understanding the fundamental forces of nature. These problems are computationally intensive on classical systems, and quantum methods could eventually provide exponential speed-ups. In parallel, CERN investigates how quantum algorithms might help interpret experimental particle physics data, which encodes signatures of underlying quantum processes from high-energy collisions.

Quantum computing will bring transformative impact, but only in select domains as hardware and algorithms mature. Overhyping it can hinder progress by eroding trust and diverting focus from realistic milestones.

The Open Quantum Institute (OQI) complements QTI but serves a different purpose: it is a global platform for education, policy, and societal engagement around quantum technologies. While QTI drives research and technical development, OQI fosters inclusive access, ethical frameworks, and capacity building to ensure quantum advances benefit science and society broadly.

This work is significant because it positions CERN not only as a user of quantum technologies but as a contributor to their evolution. By aligning quantum research with real scientific challenges and leveraging CERN’s engineering expertise, these initiatives aim to create meaningful progress rather than hype-driven applications.

Would academic/lab work race ahead of BigTech’s (IBM, Google, etc.) work in this area? How important and pragmatic are collaboration and an open ecosystem here?

Academic research and industrial efforts in quantum computing should not be seen as competitors—they are complementary and interdependent. BigTech companies bring engineering scale, fabrication capabilities, and cloud infrastructure, while academic and lab environments contribute deep theoretical insight, algorithmic innovation, and domain-specific expertise. Progress in quantum technologies depends on these two worlds working together.

For CERN’s Quantum Technology Initiative (QTI), collaboration is particularly critical. CERN’s mission is to advance fundamental science, but the technologies developed for particle physics can also accelerate quantum hardware and software development. By partnering with industry, CERN ensures that these capabilities feed into a broader ecosystem, while CERN benefits from industrial advances in hardware platforms and tooling.

An open ecosystem is equally important. Quantum computing is still in its formative stage; closed silos risk slowing innovation and creating fragmented standards. Initiatives like QTI actively promote open science, shared benchmarks, and co-design approaches, ensuring that algorithms, hardware, and applications evolve together.

In short, collaboration is not just pragmatic—it’s essential. The complexity of quantum computing means no single actor can succeed alone. CERN’s approach reflects this reality: building bridges between academia, industry, and policy to create sustainable progress.

pratimah@cybermedia.co.in