
Great leaps forward


Petascale computing is on the way, and NCSA will play a key role in enabling scientists and engineers to take full advantage of this increased power. Drawing on years of experience and the expertise of its staff, NCSA will help researchers scale their codes to effectively use tens and hundreds of thousands of processor cores. And the center’s Innovative Systems Laboratory, in collaboration with colleagues at the University of Illinois, is exploring how new architectures like GPUs and FPGAs can take scientific computing to the petascale and beyond. How will petascale computing advance science and engineering? These researchers describe how their work will be transformed by the increased power that is on the horizon.

Jeroen Tromp, Caltech
Human beings have explored our planet’s icy poles, vast oceans, treacherous peaks, and remote jungles. But one frontier remains mysterious: the Earth’s interior. The depths of the mantle and core are, despite what Hollywood blockbusters claim, not hospitable to manned expeditions; instead, seismologists must rely on computational simulation.

According to Caltech seismologist Jeroen Tromp, there are two challenges that petascale computation will enable researchers to address.

“The first will require harnessing the entire machine to do very high-resolution simulations” of seismic wave propagation across the entire globe. “A machine like that should get us into the 1 to 2 Hz range,” which Tromp explains will provide unprecedented insight into the small-scale structure of our planet’s interior.

For example, where the liquid outer core meets the solid lower mantle, “all kinds of things appear to be happening at that boundary,” he says. The mantle appears to undergo a phase transition at this depth, changing its properties and behavior. “We would love to be able to really image that. The fundamental question is: How does the planet work in terms of its physics and chemistry?”

The second grand challenge that petascale computing will enable seismologists to tackle is running many large simulations simultaneously for hundreds or even thousands of earthquakes in order to improve models of the earth’s interior, resolving smaller-scale structures. For this task, high-performance I/O will be critical, to ensure that massive quantities of data can be transferred, stored, and analyzed.

David Nolan, University of Miami
The Weather Research and Forecasting (WRF) model is used by thousands of researchers across the country and around the world to study the formation of dangerous hurricanes and tornadoes, shifts in climate, and air-quality issues. Many variables interact in these complex systems, and the model is not yet able to capture all of the small-scale features of interest to researchers—and current computing systems aren’t powerful enough to provide such fine resolution.

“When wind is blowing over the ocean, for example, that creates turbulence, and our computer models can’t see those swirls and eddies,” says David Nolan, a hurricane researcher at the University of Miami in Florida. Various approximations are used to account for the turbulence and other small-scale phenomena.

With petascale computing, researchers will be able to directly simulate phenomena down to a scale of 100 meters. By comparing these fine-grained simulations with current approximations, researchers will be able to determine whether those approximations are sufficient or, if they are not, how to improve the model to better capture what happens in the real world.
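To make that comparison concrete, here is a minimal sketch, in Python, of the kind of test the paragraph describes: a wind field sampled at fine resolution is averaged onto a coarse model grid, and a crude stand-in parameterization is checked against the averaged effect of the unresolved eddies. The synthetic wind field, the constant-variance approximation, and every number are illustrative assumptions, not WRF physics.

import numpy as np

fine_points_per_cell = 100          # fine-scale samples inside one coarse grid cell
num_coarse_cells = 50

rng = np.random.default_rng(42)

# Fine-resolution wind: a smooth large-scale part plus small-scale "eddies"
# whose strength grows with the large-scale wind.
x = np.linspace(0.0, 2.0 * np.pi, num_coarse_cells * fine_points_per_cell)
large_scale = 10.0 + 2.0 * np.sin(x)
eddy_std = 0.15 * large_scale
wind_fine = large_scale + rng.normal(0.0, 1.0, x.size) * eddy_std

# Coarse-grain: the eddy variance inside each coarse cell is the "truth"
# a coarse model cannot resolve and must approximate.
cells = wind_fine.reshape(num_coarse_cells, fine_points_per_cell)
eddy_variance_true = cells.var(axis=1)

# A crude stand-in parameterization: assume the eddy variance is constant.
eddy_variance_param = np.full(num_coarse_cells, 2.25)    # (1.5 m/s)^2

error = np.abs(eddy_variance_param - eddy_variance_true).mean()
print(f"Mean absolute error of the toy parameterization: {error:.2f} (m/s)^2")

Comparing the parameterized values against the directly computed ones, cell by cell, is the same logic researchers would apply at vastly larger scale with petascale simulations.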

“We’ll be able to get much closer to reality,” Nolan says, which means understanding better where hurricanes will form and how their intensity can shift. “It simply comes down to better hurricane forecasting.”

David Baker, University of Washington
“Proteins are the miniature machines that do everything in the human body,” says David Baker, a biochemist at the Howard Hughes Medical Institute at the University of Washington.

Understanding proteins—which ones perform which duties, how they operate, and what happens when they fail to function correctly—provides tremendous insights into the mechanics of life, the roots of disease, and how to better treat illness.

The determination of which amino acids are present in a particular protein, and in what order—the protein sequence—is proceeding at a rapid clip. But each amino acid string is folded and tangled into a complex three-dimensional structure; the unique structure of each protein gives it a unique function. Determining protein structures is lagging behind the generation of sequences.

Baker arrives at protein structure predictions computationally, using his Rosetta code to look for the shape of lowest energy among the possible forms the protein can take. “The bigger the protein is—the more moving parts it has, essentially—the larger the number of possible states,” he explains.

Baker likens the quest to searching for the lowest elevation on the planet. A single explorer would have to search and search and search. Multiple explorers can cover more ground in less time.

“More explorers increase your chances of success,” he says. Likewise, when it comes to protein structures, “to really explore the space, you need a large number of processors. Each additional bit of computer power is more conformations that can be searched.” Making the leap to petascale computation will mean searching a much larger area much more efficiently, and therefore unlocking more of the secrets proteins hold.
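As a rough illustration of the explorer analogy, the Python sketch below runs independent downhill searches from random starting points on a toy one-dimensional “energy landscape” and keeps the lowest value found; with more searchers, the chance of landing in the deepest basin grows. The landscape, the greedy search, and all parameters are invented for illustration and are not Rosetta’s algorithm.

import math
import random

def energy(x):
    # A rugged one-dimensional landscape with many local minima.
    return 0.05 * x * x + math.sin(3.0 * x) + 0.5 * math.cos(7.0 * x)

def local_search(start, steps=2000, step_size=0.05):
    """Greedy downhill walk from one random starting point."""
    x, e = start, energy(start)
    for _ in range(steps):
        candidate = x + random.uniform(-step_size, step_size)
        e_candidate = energy(candidate)
        if e_candidate < e:
            x, e = candidate, e_candidate
    return x, e

def explore(num_explorers):
    """Run independent searches and keep the best result found."""
    results = [local_search(random.uniform(-10.0, 10.0)) for _ in range(num_explorers)]
    return min(results, key=lambda r: r[1])

if __name__ == "__main__":
    random.seed(0)
    for n in (1, 10, 100):
        x_best, e_best = explore(n)
        print(f"{n:4d} explorers -> lowest energy {e_best:.3f} at x = {x_best:.3f}")

Because each search is independent, the work spreads naturally across processors, which is why adding cores directly translates into more conformations sampled.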

Paul Fussell, Boeing
High-performance computing is central to Boeing’s businesses. For example, the company runs tens of thousands of computational fluid dynamics (CFD) simulations to evaluate designs and systematically explore possible design improvements for its aircraft.

“Computational excellence is an enabler to our ability to lead aerospace technology and product innovation to the marketplace,” says Paul Fussell, senior manager of mathematical modeling for Boeing Engineering, Operations and Technology.

Using CFD to analyze the aerodynamics of the aircraft under certain conditions, like cruise or one-engine takeoff, is just the starting point. With petascale computing power, Boeing will be able to undertake much more complex simulations at much finer fidelity.

Petascale simulations could include more extensive aerodynamics and would enable optimization that considers several physical disciplines rather than just one. For example, simulations could couple the wing’s structure with its aerodynamics to better capture the complex interplay that occurs as the wing bends, changing its aerodynamics, which in turn alters the load and the wing’s elastic response. With this computing capacity, Boeing could design better materials, consider the structure and airframe together, and study their response to dynamic loads.
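As a rough illustration of that kind of coupling, the Python sketch below iterates between two single-physics placeholders: a toy aerodynamic model whose lift grows with wing twist, and a toy structural model whose twist grows with load, repeating until the two agree. The linear models and every constant are invented for illustration; they are not Boeing’s methods or data.

def aerodynamic_load(twist_deg, q=0.5 * 1.2 * 230.0**2, area=125.0,
                     cl_per_deg=0.1, cl0=0.4):
    """Toy lift (N): the lift coefficient grows linearly with wing twist."""
    return q * area * (cl0 + cl_per_deg * twist_deg)

def structural_twist(load_n, torsional_stiffness=2.0e7):
    """Toy elastic response: degrees of twist produced by a given load."""
    return load_n / torsional_stiffness

def coupled_solution(max_iters=50, tol=1e-6):
    """Fixed-point iteration between the two single-physics models."""
    twist = 0.0
    for i in range(max_iters):
        load = aerodynamic_load(twist)
        new_twist = structural_twist(load)
        if abs(new_twist - twist) < tol:
            return load, new_twist, i + 1
        twist = new_twist
    return load, twist, max_iters

if __name__ == "__main__":
    load, twist, iters = coupled_solution()
    print(f"Converged in {iters} iterations: load = {load:.0f} N, twist = {twist:.3f} deg")

In a real multiphysics design study each of these placeholder functions would be a large simulation in its own right, which is why coupling them at fine fidelity demands petascale capability.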

“We always use all the computational capability on our machine room floors to solve the most significant challenges we face. And the bar is always rising; we always imagine how we’ll use the next increase in machine capability. We always know of more physics to consider or more design-spaces to explore,” Fussell says.

