
‘People’s lives are affected’

by J. William Bell

The Blue Waters sustained-petascale supercomputer will help researchers figure out ways to blunt climate change and develop local strategies for living with the changes that do occur.

A scientific consensus has emerged: The Earth’s climate is changing, and human behavior is accelerating that change.

Climate modeling on supercomputers played a huge role in establishing those facts. It’s also being used to guide strategies for blunting the negative impact that climate change will have. Multi-trillion-dollar decisions, by the reckoning of the United Kingdom’s Office of Climate Change, will hinge on the predictions of our planet’s future that leading scientists make with leading-edge supercomputers.

The Blue Waters sustained-petascale system coming to NCSA is likely to be one of the supercomputers delivering those insights—thanks to partners from the Institute of Global Environment and Society’s Center for Ocean-Land-Atmosphere Studies (COLA), the University of Miami, the University Corporation for Atmospheric Research, and Colorado State University.

Blue Waters, and world-class supercomputers like it, give these researchers “a place to test ideas and not be fettered by computational constraints,” or at least be fettered by fewer constraints, according to Jim Kinter, COLA’s director.

Blue Waters’ large and fast memory will help researchers get over some of those constraints. Tightly coupled shared-memory nodes will deliver 512 GB/s of aggregate memory bandwidth and 192 GB/s of bandwidth to a hub chip used for I/O, messaging, and switching. The hub chip delivers a total of 1,128 GB/s peak bandwidth.

Blue Waters’ large archive—expected to be as much as half an exabyte—will also be important.

“Storage capacity is always a problem” for climate modelers, says COLA’s Cristiana Stan. A single run can produce hundreds of terabytes of data, and the modelers “never destroy the data. We want to be able to analyze it and compare it with past results. We also want to be able to compare it to future generations of models.”
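Back-of-the-envelope arithmetic shows why even half an exabyte fills up quickly at that pace. The figures below use the round numbers quoted above, with ~300 TB assumed as a stand-in for “hundreds of terabytes” per run:

```python
# Rough capacity arithmetic for the planned Blue Waters archive.
# 300 TB per run is an illustrative assumption, not a quoted figure.
archive_bytes = 0.5 * 10**18   # half an exabyte
run_bytes = 300 * 10**12       # ~300 TB per model run (assumed)

runs_archived = archive_bytes / run_bytes
print(f"Approximate runs the archive can hold: {runs_archived:.0f}")
```

Roughly 1,700 runs, and since climate modelers “never destroy the data,” that headroom shrinks with every experiment.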

As part of the National Science Foundation’s Petascale Computing Resource Allocations (PRAC) program, the team is making a broad set of improvements to the Community Climate System Model. This massive computing code combines 3D ocean models, 3D atmospheric models, and 2D surface models that simulate things like vegetation, soil, rivers, and lakes. The Community Climate System Model has been used throughout the climate modeling community for years.

The trillion-dollar question

The improvements are needed because today’s models are, by necessity, approximations, and scientists want to keep refining them. “These represent the best we can do, not the best we could do,” says Kinter.

Current models provide an excellent view of overall climate. Assessments by the Intergovernmental Panel on Climate Change were based on comparing many of these models, generated by scientists around the world. All had their differences, but all painted the same big picture: the Earth’s climate was changing, and things like carbon emissions were driving that change. The insights from those model-based assessments yielded a Nobel Peace Prize for the panel in 2007.

Current models do not provide accurate predictions of the impact in smaller regions, however.

“Global temperature change of one degree is very different than a particular region changing by three degrees. We know that global precipitation will be reduced, but that tells us nothing about Indian monsoons,” observes Stan, who is principal investigator on the team’s PRAC award.

“People’s lives are affected,” she says.

“Children either are or aren’t going to be able to take over their parents’ wineries in Europe. Droughts either are or aren’t going to continue and worsen in western North America,” Kinter says. “Our models can’t currently tell the difference.

“The trillion-dollar question is how much precipitation and temperature will change and exactly where. We haven’t been able to do that at the subcontinent scale.”

Simulations on systems like Blue Waters will allow scientists to establish that climate modeling can in fact provide reliable predictions at the regional level. They’ll then use those models to develop global strategies for reducing the impact of climate change and more local strategies for living with the changes that do occur.

Heads in the clouds

Improvements to the Community Climate System Model will require better resolution. Current simulations wrap the globe in a grid of cubes 100 to 250 kilometers on each side. With Blue Waters, the team hopes to get that down to 20 kilometers or smaller on each side.
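A quick sketch shows why finer grids are so expensive: shrinking the cell edge by a factor of five multiplies the horizontal cell count by twenty-five, before even counting vertical levels or the shorter time steps that stability demands. This is illustrative arithmetic only, using the resolutions quoted above:

```python
# Approximate horizontal cell counts for a global grid at the
# resolutions discussed in the article (illustrative only; a real
# model also has vertical levels and stricter time-step limits).
EARTH_SURFACE_KM2 = 510e6  # Earth's surface area, ~510 million km^2

def horizontal_cells(cell_km):
    """Approximate number of square cells tiling the globe at a given edge length."""
    return EARTH_SURFACE_KM2 / cell_km**2

for res in (250, 100, 20):
    print(f"{res:>3} km grid: ~{horizontal_cells(res):,.0f} cells")

# Refining from 100 km to 20 km is a 25x jump in horizontal cells alone.
print(f"cost ratio (100 km -> 20 km, horizontal only): "
      f"{horizontal_cells(20) / horizontal_cells(100):.0f}x")
```

That factor of twenty-five, compounded by finer time stepping, is the kind of cost jump that only a sustained-petascale machine can absorb.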

Inside these cubes, everything from global currents that run thousands of kilometers to the viscosity of centimeter-scale drops of water is simulated. This breadth can produce what are known as “rectified effects.” These effects amount to big discrepancies in the climate models that are spawned by the fact that the smallest features of those models aren’t being simulated in enough detail.

Rectified effects produce things like an El Niño cycle you can set your watch by, rather than the erratic cycle, differing in size and intensity from one event to the next, that occurs in the real world. On the flip side, they make for models that fail to capture the Madden-Julian oscillation, the clusters of thunderstorms that plague the tropics around Indonesia.

Abolishing these rectified effects will require treating certain aspects of the models, like clouds and the small eddies that roil the ocean, in fundamentally new ways.

Take clouds, for instance. “You see a little white spot, but, believe it or not, they influence large-scale climate,” Stan says. “We need a very high-resolution model of these very small entities.”

Today, there’s no way to represent clouds at their native scale in global models. Instead, researchers develop and use a statistical representation of the overall impact that clouds have on the climate. “It doesn’t work as well as it should,” Kinter says.

Even with Blue Waters and its generation of supercomputers, they won’t be able to model individual clouds at the one-kilometer scale they require. They will, however, apply a technique called “superparameterization,” which models hundreds of clouds within each grid region and aggregates their impact for that section of the model. “Now that’s extremely effective, and it has been shown to work much better than the old way,” Kinter says.
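The shape of that idea can be sketched in a few lines. This is a toy illustration, not the Community Climate System Model’s actual code: every name here is invented, and random numbers stand in for the fine-scale cloud physics. The key structure is that each coarse grid cell embeds many sub-grid cloud samples, and only their aggregate feeds back into the coarse model:

```python
import random

def embedded_cloud_model(n_clouds, seed):
    """Toy stand-in for a cloud-resolving model run inside one coarse cell.

    Returns one heating value per simulated cloud; a real embedded model
    would solve fine-scale dynamics rather than draw random numbers.
    """
    rng = random.Random(seed)
    return [rng.uniform(0.0, 1.0) for _ in range(n_clouds)]

def superparameterized_tendency(cell_id, n_clouds=200):
    """Aggregate hundreds of sub-grid cloud samples into one coarse-cell value."""
    heating = embedded_cloud_model(n_clouds, seed=cell_id)
    return sum(heating) / len(heating)  # only the aggregate feeds back

# The coarse global model sees just one number per grid cell:
tendencies = {cell: superparameterized_tendency(cell) for cell in range(4)}
print(tendencies)
```

The design point is the interface: the global model never sees individual clouds, only the aggregated effect per cell, which keeps the coarse grid tractable while the embedded samples supply the statistics the old parameterizations guessed at.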

With that method in place on Blue Waters, the team hopes to run the first global cloud-resolving model for the United States and to integrate the results into global climate models, too.

“That model, we will call a very big victory,” Stan says.

This project was funded by the National Science Foundation.

Team members
Jim Kinter
Benjamin Kirtman
William Large
David Randall
Cristiana Stan

