
Santiago Núñez-Corrales on ‘NCSA’s Mission in Quantum Computing’

Editor’s note: This is part of a series of virtual essays from NCSA experts on current topics impacting the field of high-performance computing and research.

NCSA’s Mission in Quantum Computing
By Santiago Núñez-Corrales, NCSA Quantum Lead Research Scientist

The fact that physical laws in our universe contain the recipe to perform computation is nothing short of extraordinary. John Archibald Wheeler described how intricate and intense the relationship between physics and information is in his foundational 1991 paper, one that bears profound consequences for science, technology and even aesthetics. That relationship is so strong that mathematical logic allowed Alan Turing to imagine an abstract computing device for which one can build and operate actual mechanisms using energy and matter. One may well say that we owe a large share of our contemporary human experience, from supercomputers to the internet, to what Eugene Wigner called the “Unreasonable Effectiveness of Mathematics.”

Quantum computing, first described by Yuri Manin, Paul Benioff, Richard Feynman and David Deutsch in the 1980s as the theoretical possibility of using quantum mechanics to solve quantum mechanical problems faster than with classical computers, takes our extraordinary relationship with computation and information to new heights. From this early work to the burgeoning quantum computing ecosystem of today, a lot has happened. Physicists and engineers have figured out how to build devices that host quantum bits (qubits) and maintain their state long enough to perform preliminary but useful work. A whole field – the study of quantum algorithms – has been established to understand which kinds of problems are solvable with these systems and how efficient those solutions may be.

Quantum hardware has morphed from experiments in academic laboratories into products that are either sold as deployable units or can be accessed via the cloud. From big tech industry names to hundreds of new startups, capital investment across government and the private sector has fostered the establishment of a market of around $1.5 billion by 2026 as well as potential economic value in the trillions of dollars by 2035. Behind the maturation of the quantum computing market rests the expectation of greater ability to solve hard problems, problems of the sort that can have major impacts for the economy and some of the grand challenges of our time that remain unreachable with our traditional computing resources.

At the same time, we must constantly remind ourselves that quantum computing is moving from its infancy into a yet-troubled adolescence. As with any advanced technology, it also undergoes the Gartner Hype Cycle, and we seem to be squarely in the hype phase. A relatively large variety of classes of hardware platforms whose design is in flux indicates we have not yet arrived at a definitive technology that will become standard. These are still Noisy Intermediate-Scale Quantum devices that produce unreliable results due to their susceptibility to perturbations of various sorts. The fact that quantum devices tend to operate in temperature and vacuum regimes far outside those found naturally anywhere in the universe is telling of how delicate quantum states are. Fault tolerance remains distant on the horizon, although recent advances point in that direction. And there are fundamental questions about how scalable these devices are, particularly regarding the scalability of entanglement, a key quantum resource.

What we know so far about situations in which quantum algorithms may be effective suggests a small number of problems with large speedups, others with more modest speedups and many where there is no advantage. At a more abstract level, the current state of the art suggests that the classes of problems solvable by classical computers are the same as the classes of problems solvable by quantum computers even if the latter have more physical resources than the former, though the mathematical questions behind this equivalence need to be better investigated. We still program quantum devices with very low-level languages, either sending pulses or specifying circuits, which is both painstaking and inefficient. Contrast this with the vast landscape of high-level programming languages today, from C to Python and JavaScript, which affords us even the luxury of arguing about which ones we prefer and which ones we dislike.
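To make the low-level point concrete, here is a minimal sketch of what "specifying circuits" amounts to: applying gate matrices by hand to a state vector to prepare a two-qubit Bell state. It is pure Python with no quantum SDK assumed, and every function name is illustrative rather than drawn from any particular toolkit; real devices demand even lower-level pulse control.

```python
# Illustrative only: hand-applied gates on a state vector, the kind of
# index bookkeeping that circuit-level quantum programming entails.
import math

def apply_h(state, qubit):
    """Apply a Hadamard gate to `qubit` (0 = least significant bit)."""
    s = 1 / math.sqrt(2)
    new = [0.0] * len(state)
    for i, amp in enumerate(state):
        j = i ^ (1 << qubit)          # partner index with `qubit` flipped
        if (i >> qubit) & 1 == 0:
            new[i] += s * amp         # |0> -> (|0> + |1>)/sqrt(2)
            new[j] += s * amp
        else:
            new[j] += s * amp         # |1> -> (|0> - |1>)/sqrt(2)
            new[i] -= s * amp
    return new

def apply_cnot(state, control, target):
    """Flip `target` wherever `control` is 1 (swap paired amplitudes)."""
    new = list(state)
    for i in range(len(state)):
        if (i >> control) & 1:
            new[i] = state[i ^ (1 << target)]
    return new

state = [1.0, 0.0, 0.0, 0.0]          # start in |00>
state = apply_h(state, 0)             # superpose qubit 0
state = apply_cnot(state, 0, 1)       # entangle: (|00> + |11>)/sqrt(2)
```

Even this four-amplitude example demands manual index bookkeeping, and the state vector doubles in size with every added qubit, which is part of why higher-level abstractions matter.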

In short, the state of quantum computing today bears a striking resemblance to that of classical computing in the 1950s. Transistors were still experimental components in hardware, stored programs had only recently been implemented in repeatable – yet clunky – ways and computer architecture was untrodden land. I wish to suggest that, precisely because of the point we are at in the evolution of quantum computing, the most productive stance is evidence-restrained optimism: both the promises and the difficulties are real, and closing the gap requires a village. And at the University of Illinois, we fortunately have a large one.


What should our mission at NCSA be in quantum computing? As our Director Bill Gropp keenly points out, the “A” in NCSA stands for “Applications,” a natural calling to harness advanced computing and scientific software to catalyze discoveries and ultimately enable positive impacts across science and society. Quantum computing fits naturally at the core of it. For the better part of two years, we have crafted a quantum computing vision that is scientifically informed, tempered by the expertise of colleagues in the Illinois Quantum Information Science and Technology Center, and fueled by our history of achievements.

As a result, three main targets account for the bulk of recent work at NCSA. The first is to contribute to the advancement of quantum computing platforms toward dependable quantum cyberinfrastructure. Ongoing work on digital twins for superconducting quantum devices and HPC-QPU integration has allowed us to establish deep and enriching relationships with quantum experimentalists on campus.

The second is to influence the quantum software ecosystem in ways that increase developer productivity: creating new high-level quantum programming languages and exploring what a quantum analogue of the Message Passing Interface (MPI) might look like, so that software can one day be written transparently across distributed classical-quantum infrastructure.
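As a purely illustrative sketch of the programming-model idea, one might imagine MPI-style point-to-point primitives extended to quantum state. All names below (QChannel, send_qubit, recv_qubit) are hypothetical and belong to no existing library, and the toy hand-off of amplitudes stands in for what a real system would do via entanglement distribution and teleportation.

```python
# Hypothetical sketch: an MPI-like point-to-point interface for quantum
# state, simulated in-process. Not a real API; names are invented.
from queue import Queue

class QChannel:
    """Toy channel between two 'ranks', mirroring MPI send/recv."""
    def __init__(self):
        self._q = Queue()

    def send_qubit(self, amplitudes):
        # Real hardware cannot copy unknown quantum states (no-cloning);
        # this classical hand-off stands in for teleportation.
        self._q.put(list(amplitudes))

    def recv_qubit(self):
        return self._q.get()

# "Rank 0" prepares |+> = (|0> + |1>)/sqrt(2) and sends it to "rank 1".
ch = QChannel()
ch.send_qubit([2 ** -0.5, 2 ** -0.5])
received = ch.recv_qubit()
```

The design question such an interface raises, and that our exploration aims to answer, is which collective and point-to-point patterns from classical MPI survive contact with no-cloning and measurement.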

Finally, our staff has begun gearing up to serve the needs of research scientists seeking new possibilities in quantum technologies. We are also leading the 1st Workshop on Broadly Accessible Quantum Computing at PEARC24 in July, which will give participants a comprehensive understanding of the current status and prospects of quantum computing (QC) and its applications, focusing on how the broader community can integrate quantum technologies into traditional research computing facilities.

Whatever form the quantum computing revolution takes, NCSA stands ready to contribute.
