
Preparing for the big one


by J. William Bell

The northern reaches of the San Andreas Fault have seen their share of major earthquakes in the last century. The 1906 San Francisco earthquake killed more than 3,000 people, and a 1989 quake near Santa Cruz postponed the World Series. The other end of the fault, near Los Angeles, meanwhile, hasn’t seen a major earthquake since 1680. But there is a high probability of a rupture there over the next two decades.

A team of more than 30 earthquake scientists, computer scientists, and other specialists is very interested in that next earthquake—what it might look like, what sort of damage it might cause, and what might be done to mitigate that damage. Led by the Southern California Earthquake Center (SCEC), the team plans to use the Blue Waters sustained-petascale supercomputer at NCSA to model it.

“SCEC researchers are developing the ability to conduct end-to-end, rupture-to-rafters simulations that will extend our understanding of seismic hazards,” says Thomas Jordan, SCEC’s director. “The integration of these research applications represents an unprecedented opportunity to advance our ability to characterize seismic hazard and risk.”

From today’s biggest to tomorrow’s

SCEC’s Community Modeling Environment (CME) allows the researchers to model fault ruptures and seismic wave propagation. Parts of the tool can run across all the available processors in some of the world’s largest supercomputers—60,000 cores on the Texas Advanced Computing Center’s Ranger, 96,000 cores on the National Institute for Computational Sciences’ Kraken, and 130,000 cores on Argonne National Lab’s Intrepid.

Even at that scale, the simulations don’t yet capture some important aspects of earthquake behavior. Current simulations, for example, only describe shaking up to a maximum frequency of 1.0 hertz, which provides spectral acceleration information at periods of one to two seconds.

“Frequencies in this range are of interest for medium-height to high-rise buildings—10 to 30 stories. However, the number of areas with such tall buildings is limited,” says Yifeng Cui, a computational scientist at the San Diego Supercomputer Center who is working on the project. But the impact on shorter buildings, including many homes, must also be understood.

With the Blue Waters supercomputer, the team expects to reach two hertz. Those simulations will model a magnitude 8.1 earthquake, which is “a worst-case scenario on the San Andreas Fault,” according to Cui. They will cover a section of the earth 800 by 400 kilometers in area and 100 kilometers deep with more than two trillion mesh points, one every 25 meters. That makes for a simulation more than 256 times more computationally taxing than the team’s current work.
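As a rough check of that figure (a back-of-the-envelope sketch based only on the dimensions quoted here, not on the team’s actual mesh layout), sampling an 800-by-400-by-100-kilometer volume every 25 meters works out to roughly two trillion points:

```python
# Back-of-the-envelope estimate of the mesh size quoted above
# (illustrative only; not SCEC's actual grid decomposition).
dx = 25.0                     # grid spacing in meters
nx = int(800_000 / dx)        # 32,000 points along the 800 km dimension
ny = int(400_000 / dx)        # 16,000 points along the 400 km dimension
nz = int(100_000 / dx)        #  4,000 points over 100 km of depth
print(f"{nx * ny * nz:.2e} mesh points")  # ~2.05e+12, i.e., over two trillion
```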

“There’s huge demand for high frequency. The engineering interest is all the way up to 10 hertz. If we want to simulate on the 10 hertz scale, not even Blue Waters will be big enough,” Cui says. Nonetheless, the two hertz range will “provide useful data of engineering interest and greatly expand the usefulness of the simulations to the engineering community.”
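A common rule of thumb for 3D wave-propagation codes, offered here as an assumption rather than a figure from the SCEC team, is that cost grows with roughly the fourth power of the highest resolved frequency: halving the grid spacing multiplies the number of points by eight, and the timestep must shrink by about half as well. That scaling suggests why 10 hertz remains far out of reach:

```python
# Rule-of-thumb cost scaling for 3D finite-difference wave propagation
# (an assumption for illustration, not a published SCEC estimate):
# doubling the resolved frequency roughly multiplies the cost by 2**4 = 16.
def relative_cost(f_target_hz, f_baseline_hz):
    return (f_target_hz / f_baseline_hz) ** 4

print(relative_cost(2, 1))    # 16.0  -- moving from 1 Hz to 2 Hz
print(relative_cost(10, 2))   # 625.0 -- moving from 2 Hz to 10 Hz
```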

It will also provide the code base that will allow researchers to ultimately scale up to those incredibly high frequencies. Preparation is key—whether you’re facing the big one in the real world or modeling it on a supercomputer.

To increase the frequencies that can be modeled for earthquake engineers, the team applied for—and won—a Petascale Computing Resource Allocation (PRAC). Through these awards from the National Science Foundation, the Blue Waters team is working with almost 20 teams around the country to prepare their codes to run on Blue Waters and other computing systems like it. The multiyear collaborations include help porting and re-engineering existing applications.

The PRAC team led by SCEC will focus on a set of three seismic and engineering modeling codes. The codes model fault rupture, propagate seismic energy through a detailed structural model of Southern California, predict ground motion, and model building responses to earthquakes. Work on the interaction between the soil, building foundations, and large collections of buildings, bridges, and other infrastructure—as well as the interactions among structures in densely built areas—will be led by a team from Carnegie Mellon University.

The goal is to run these tools on petascale supercomputers and to combine them to understand the building damage likely to result from strong earthquakes. SCEC’s computational science group is a collaborative, interdisciplinary team that specializes in designing and performing large-scale simulations involving multiple disciplinary groups, such as the ground-motion and building modelers taking part in this PRAC research.

‘We know there is a due date’

Philip Maechling, SCEC’s information technology architect, says, “When the SCEC science community identifies an important geophysical simulation that, due to its scale or complexity, exceeds the capabilities of individual research groups, SCEC’s Community Modeling Environment collaboration may take on the challenge. Due to the broad range of scientific expertise within this group, and our close collaboration with computer scientists and HPC experts, the SCEC CME can work on some of the largest and most complex problems in our field, such as the 2-hertz wall-to-wall scenario earthquake simulation we are preparing to run on Blue Waters.”

“At this large scale and this level of complexity, you require a lot of expertise from different areas,” Cui says. “In particular we require an in-depth understanding of the computational and storage hardware on Blue Waters as well as information on optimal software designs on this new architecture. Without support from NCSA, we would not be able to produce such large simulations. Our collaboration with the NCSA PRAC team enables SCEC to focus most of our efforts on the geophysical aspects of our research and reduces the time we must spend on scaling-up our codes.”

SCEC’s efforts to achieve sustained petascale computing on Blue Waters are expected to contribute to a broad range of the center’s seismological research over the next few years. “Code improvements developed in support of our largest simulations, such as this 2-hertz simulation on Blue Waters, are rapidly integrated into more common, less demanding seismic research calculations,” says Maechling. “Our Blue Waters work represents a technological driver for SCEC that we believe will lead to significant improvements in a broad range of existing SCEC seismological research.”

“We know there is a due date for our research. We just don’t know when that due date is.”

This work is funded by the National Science Foundation.

Team members
Thomas H. Jordan – USC
Jacobo Bielak – CMU
Kim Olsen – SDSU
Yifeng Cui – SDSC
Steve Day – SDSU
Amit Chourasia – SDSC
Philip Maechling – USC
Scott Callaghan – USC
John Urbanic – PSC
Geoff Ely – USC
Po Chen – U of Wyoming
Dongju Choi – SDSC
Zizhong Chen – Colorado School of Mines
Daniel Roten – SDSU

