Hunt for the Supertwister | National Center for Supercomputing Applications at the University of Illinois
Hunt for the Supertwister
11.08.04 - Permalink
by Trish Barker
In a typical year, 1,200 tornadoes cause 70 fatalities and 1,500 injuries nationwide. Most of the damage, deaths, and injuries are caused by a very small percentage of these tornadoes: the so-called “supertwisters,” whose winds of more than 200 miles per hour put them at the extreme end of the Fujita Scale of Tornado Intensity (rated F4 and F5). On average, there are 12 or 13 of these tornadoes in the United States each year.
Ideally, forecasters would be able to provide enough warning that people could protect themselves from these killer storms. While they have successfully identified atmospheric conditions that favor supercell formation, accurately predicting which storms will produce tornadoes, and when, continues to challenge forecasters. Researchers from the University of Illinois at Urbana-Champaign collaborated with visualization experts at NCSA to shed light on how the most violent tornadoes form and to create animations that reveal the inner behavior of tornado-producing storms.
Their work was showcased this March in an episode of the PBS TV series NOVA called "Hunt for the Supertwister."
A storm is born
Scientists know that the strongest tornadoes are generated by a particular type of rotating thunderstorm called a supercell. The swirling winds of a supercell can produce tornadoes. But not all supercells lead to tornadoes, and not all tornadoes become supertwisters. In fact, only about 20 percent to 25 percent of supercells produce tornadoes. Why some storms spawn tornadoes while others don't—and why some tornadoes become extraordinarily strong supertwisters—is not yet well understood.
Supercells form in an unstable, sufficiently deep atmospheric layer that has adequate moisture and significant change in horizontal wind with height (vertical wind shear). An environment that favors tornado formation also requires high relative humidity near the surface, strong low-level horizontal winds, and steep low-level lapse rates (meaning the temperature drops rapidly with height).
In an effort to pinpoint what triggers tornadoes, researchers—including NCSA research scientist Robert Wilhelmson—create computer simulations of evolving storms. Just as physicians use X-rays and CAT scans to diagnose disease, these storm researchers use simulations and visualization to analyze tornado formation.
"The big problem in storm science is that with the instrumentation we have we can't sense all the things that we need to know," explains Lou Wicker, a scientist at the National Severe Storms Laboratory who frequently collaborates with Wilhelmson. "From the field, we can't figure out completely what's going on, but we think the computer model is a reasonable approximation of what's going on, and with the model we can capture the entire story."
Wicker developed a model called NCOMMAS (NSSL Collaborative Model for Multiscale Atmospheric Simulation) to computationally simulate thunderstorms and their associated tornadoes. NCOMMAS is based upon an earlier model developed by Wilhelmson.
The simulation begins with data describing the pre-tornado weather conditions—wind speed, atmospheric pressure, humidity, etc.—at discrete points separated by distances ranging from 20 meters to three kilometers. Starting with these initial variables, partial differential equations that describe changes in the atmospheric flow are solved. The numerical solution of these equations proceeds in small time intervals for two to three storm hours as the supercell forms and produces a tornado. A virtual storm is born.
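The march-forward-in-time solution described above can be sketched in miniature. The toy code below is illustrative only, not NCOMMAS: it advances a single field on a one-dimensional grid with an upwind finite-difference scheme for the advection equation, whereas the real model solves coupled three-dimensional equations for wind, pressure, temperature, moisture, and more, in the same step-by-step fashion.

```python
# Toy sketch of grid-based time stepping: solve du/dt = -c * du/dx
# with a first-order upwind finite-difference scheme on a periodic grid.

def step(u, c, dx, dt):
    """Advance the field u by one small time interval (assumes c > 0)."""
    n = len(u)
    # u[i - 1] wraps to u[-1] at i = 0, giving periodic boundaries.
    return [u[i] - c * dt / dx * (u[i] - u[i - 1]) for i in range(n)]

# Initial condition: a bump of "moisture" on a 100-point grid.
u = [1.0 if 10 <= i < 20 else 0.0 for i in range(100)]
dx, dt, c = 1.0, 0.5, 1.0          # grid spacing, time step, wind speed

for _ in range(40):                # 40 small time intervals
    u = step(u, c, dx, dt)

# The bump has been carried downstream by the simulated wind,
# just as the storm model carries its fields forward in storm time.
```

Real storm models use far more sophisticated numerics, but the essential loop is the same: sample the state on a lattice, apply the governing equations, and repeat for two to three storm hours.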
Simulating the supertwister
In the summer of 2003, the 200 mph winds of a supertwister ripped through tiny Manchester, South Dakota. The tornado's path of destruction was caught by the HDTV crew of Tom Lucas, the producer of the "Hunt for the Supertwister" episode of NOVA. Knowing that weather research combines both daredevil storm chasing and computational simulation, he approached NCSA's Wilhelmson and Donna Cox, leader of NCSA's experimental technologies division, about modeling and visualizing that storm.
Researchers in Wilhelmson's convective modeling group got to work. Starting with the recorded conditions near Manchester, the simulation followed the erupting thunderstorm and resulting powerful tornado as it evolved in a 100 x 100 x 25 kilometer domain. A number of simulations were made using 250 meter and 100 meter horizontal resolution in the active storm region. These runs produced the first-ever simulation of a long-track tornado, defined as one that stays on the ground for 40 to 60 minutes with a pressure drop of at least 50 millibars.
“The simulation of long-track tornadoes has remained elusive for almost a decade," Wilhelmson says, "and these exciting simulations have paved the way toward understanding the atmospheric conditions that lead to their occurrence.”
The visualizations included in the NOVA special were made from a simulation performed on NCSA's IBM p690 computing cluster in November 2003. The simulation covered the development of a supercell and subsequent tornado over about two and a half hours of "storm time" and ran on 16 processors for approximately eight days.
The simulation produced 650 billion bytes of data consisting of snapshots of the evolving storm every second during the tornadic storm phase. These snapshots include wind, temperature, pressure, humidity, turbulence, water, and ice values on a three-dimensional spatial lattice of grid points within the solution domain.
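A back-of-envelope calculation shows how a dataset of that size accumulates. The grid dimensions, variable count, and snapshot count below are illustrative assumptions chosen to land near the reported 650 billion bytes, not figures from the actual run:

```python
# Rough data-volume estimate for per-second snapshots of a 3-D lattice.
# All specific numbers here are hypothetical, for illustration only.
nx, ny, nz = 450, 450, 100        # assumed lattice in the active region
n_vars = 8                        # wind (3), temperature, pressure, humidity, water, ice
bytes_per_value = 4               # single-precision float

snapshot = nx * ny * nz * n_vars * bytes_per_value
n_snapshots = 1000                # e.g., one per second during the tornadic phase
total = snapshot * n_snapshots

print(f"{snapshot/1e9:.2f} GB per snapshot, {total/1e9:.0f} GB total")
# → 0.65 GB per snapshot, 648 GB total
```

Even modest grids and cadences quickly reach hundreds of gigabytes, which is why moving from raw data to visualization was a project in itself.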
Artfully interpreting data
NCSA's visualization team—Robert Patterson, Stuart Levy, Matt Hall, Alex Betts, Lorne Leonard, and team director Donna Cox—translated the data into a dynamic, high-definition animated visualization of the tornado's birth and growth.
Levy was the first member of the visualization team to work with the raw data from the simulation, computing the trajectories followed by tracer particles to reveal the twister's swirling winds. For the NOVA animations, simple glyphs such as balls and streamtubes were used to represent various aspects of the storm, with variations in color conveying additional information. Cones tilt and sway to show wind speed and direction at ground level, while balls and tubes of varying colors indicate the tornado's pressure and rotation rate.
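The trajectory computation can be sketched as repeatedly sampling the wind at a tracer's position and stepping it forward in time. The idealized vortex wind field and simple Euler integrator below are assumptions for illustration, not Levy's actual code:

```python
# Sketch of tracer-particle advection through a wind field.
# The wind field here is an idealized 2-D vortex (solid-body rotation);
# the real computation samples the simulated storm's 3-D winds.

def wind(x, y):
    """Hypothetical vortex wind: solid-body rotation about the origin."""
    omega = 0.1                   # rotation rate
    return -omega * y, omega * x

def advect(x, y, dt, n_steps):
    """Integrate one tracer's trajectory with forward Euler steps."""
    path = [(x, y)]
    for _ in range(n_steps):
        u, v = wind(x, y)
        x, y = x + u * dt, y + v * dt
        path.append((x, y))
    return path

# One tracer released at radius 10 spirals around the vortex center,
# tracing out the swirling wind as a renderable curve.
path = advect(10.0, 0.0, 0.1, 100)
```

Computing thousands of such paths through the storm's wind field yields the "angel hair pasta" of trajectories that the team then edited down for the final animations.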
Hall then worked to develop the multiple isosurfaces (the transparent grey-blue clouds that represent the storm cloud) as well as the tilting cones. At each stage, Betts developed Maya plugins and scripts to read the data and control its rendering.
Finally, Patterson tackled the integration and choreography of the visualization, using the Maya software to make rendering choices and to focus on the most significant data and events. Among other daunting tasks, Patterson, in consultation with the storm team, had to edit the thousands of computed trajectories—which together look like a plate of angel hair pasta—down to the few most meaningful trajectories in order to make the data visualization accessible and useful for scientists.
Far from being a unidirectional assembly line, Cox says the visualization process is actually "a very human-intensive, iterative process," in which the members of the visualization team frequently consult with one another and with the storm team. At each stage of the process, human intelligence and collaboration are required to make decisions about what data are most descriptive and how best to draw meaning from the data.
"This has been a very hard-working, collaborative renaissance team," Cox says.
Gaining fresh insights
The visualizations are like "a three-dimensional virtual storm chase," says University of Illinois atmospheric scientist Matthew Gilmore. Using visualizations, events that occur with blinding speed in the field can be slowed to a crawl for close study. The data can be interrogated in many ways, allowing researchers to look at cross-sections, employ different points of view, and zoom in on small-scale effects.
Because of the vast, complex space-time datasets involved, interactive visualization tools provide a vital advantage as Wilhelmson's team continues to investigate the mechanisms of tornado formation. In particular, the researchers are examining vortex generation, stretching, and convergence through comparison with theories for tornado formation and through in-depth trajectory analysis.
This project was supported by Intel Corporation, the National Science Foundation, NCSA, and NOAA's National Severe Storms Laboratory.
For further information:
University of Illinois at Urbana-Champaign
National Severe Storms Laboratory
National Center for Supercomputing Applications