So much detail | News | National Center for Supercomputing Applications at the University of Illinois
07.09.13
by Elizabeth Murray
The power of Blue Waters is revolutionizing the field of space physics.
The Sun is the key to life-sustaining functions here on Earth. It provides energy for photosynthesis and serves as a driving force behind climate and weather patterns, among other things. This star is both literally and figuratively the center of our solar system, so it should come as no surprise that it affects our lives in ways we are only beginning to appreciate.
“The Earth is embedded in the Sun’s extended atmosphere. As a result, the Earth and its technological systems are in constant threat from magnetic storms on the Sun. While most of these storms are not directed towards the Earth, the results can be devastating if a massive storm happens to be directed at the Earth,” says Homa Karimabadi, space physics group leader at the University of California, San Diego and chief scientist at SciberQuest, Inc. Karimabadi received one of the first Petascale Computing Resource Allocations from the National Science Foundation that enabled his research team to prepare their codes for extreme-scale supercomputers and to tap into the computing and data power of Blue Waters.
In the past, these storms on the Sun rarely affected us in any significantly detrimental way; their most visible effects were the aurorae. Now our day-to-day lives depend greatly on satellites, electronics, and power grids, and suddenly what's happening on the surface of that tumultuous star really begins to matter.
Solar flares and space forecasts
Nearly everyone has heard of solar flares. Some of the more memorable representations, pictures of huge flame-like eruptions off the Sun's surface, aren't far from the truth. The surface of the Sun is a turbulent and violent environment. The Sun consists of hot, ionized gas, referred to as plasma, with an embedded magnetic field. The more the plasma roils, the more it drags the magnetic field around, building up extreme tension.
Karimabadi compares the building tension in this magnetic field to the tension in a stretched rubber band. At some point the magnetic field lines snap, heating and accelerating the plasma in the process. This gives rise to solar flares and coronal mass ejections (CMEs), which hurl billions of tons of solar gas into space. Karimabadi says a flare can easily span 10 times the Earth's diameter and release energy "on the order of 160 million megatons of TNT equivalent." When CMEs are released into the solar wind, the medium between the Sun and the Earth, a phenomenon known as space weather occurs.
According to NASA, the most powerful solar flare of the year thus far erupted on April 11, 2013, causing a temporary radio blackout here on Earth. And that isn't even the worst of the possibilities. It is estimated that a solar storm of the same magnitude as the 1859 Solar Superstorm would cause over $2 trillion in damage today. A strong storm can disable high-voltage transformers, knock satellites out of orbit, and cripple communications worldwide. One of the more notable of these occurrences happened on March 13, 1989, when the entire Québec power grid collapsed. Karimabadi says losses in satellite technology that have been linked to space weather damage register in the billions of dollars.
A solution? Forecasting space weather.
A given CME can take anywhere from one to five days to reach the Earth, providing sufficient time to take evasive action. However, during solar maxima, there can be over three solar storms per day.
“You can’t just shut off the power grids, the electronics, and the satellites every time there is a storm. We really need to be able to judge the severity of a given storm, its impact, and its location of impact,” notes Karimabadi. The challenge then becomes to develop accurate space weather forecasting models that can be put to use in real-time situations.
Thanks to its magnetic dipole field, our planet has created the magnetosphere, which serves as a shield to protect us from space weather effects. And for the most part, this line of defense works: most of the energetic particles and radiation coming from the Sun are deflected and flow around the Earth's magnetosphere.
Unfortunately, a process called magnetic reconnection allows some solar wind to penetrate the planet’s protective shield. The research team is focusing on trying to simulate and understand the physics behind this magnetic reconnection, which occurs on electron scales but has global consequences.
“There are global magnetospheric codes that model the interaction of the solar wind with the Earth’s magnetosphere. The ultimate goal is to run such codes in real time based on measured properties of a given CME heading toward the Earth, and predict the geographical location and the severity of the impact. However, the current models lack certain details, an important one being the proper physics of magnetic reconnection, which is essential for developing accurate forecast models,” explains Karimabadi. Thus much of his team’s effort is going toward developing models of magnetic reconnection that can be inserted into the global codes.
The simulations required for this kind of research are among the most challenging in terms of data and memory requirements. Petascale computing through the team's allocation on Blue Waters is making their simulation goals more of a reality with every run.
“We wanted to know if we changed the parameters, such as system size, would the physics change,” Karimabadi says, “and if it did, could we develop scaling laws to extrapolate the results to real systems in nature?”
Blue Waters made it possible for the team to carry out a scaling study, pushing their simulations to the largest size possible on any supercomputer today. Karimabadi says they have discovered new regimes of reconnection and new scaling relations with system size and parameters.
But it isn't just about having access to Blue Waters; it is also about having access to the NCSA Blue Waters team, which helps optimize codes to get the most out of their time with the behemoth machine. NCSA senior research programmer Ryan Mokos facilitated a technical collaboration last year with Karimabadi's research team, Cray Inc., and the San Diego Supercomputer Center to increase the performance of their H3D code, a global code developed at UC San Diego by Karimabadi's team that treats electrons as a fluid and ions as kinetic particles.
Prior to petascale computing, almost all global simulations were based on fluid models. The petascale powerhouse has given the team options: Blue Waters is enabling global simulations that include important ion kinetic physics using the H3D code, and local simulations that resolve both ion and electron kinetic physics using the VPIC code.
This switch didn't happen overnight, though. Kalyana Chadalavada, another NCSA senior research programmer, also worked closely with Cray, Karimabadi's team, and a team at Los Alamos National Laboratory that included Kevin Bowers and William Daughton to improve VPIC, the kinetic code that models individual particles to include electron physics, as part of the sustained petascale performance (SPP) optimization effort.
Chadalavada implemented the use of more fused multiply-add (FMA) instructions, while Jim Kohn of Cray eliminated some redundant stores to temporary variables and reordered some independent SSE instructions for better resource utilization on the chip. Chadalavada expounds on the performance gain, noting they were “able to achieve an improvement of 12–18 percent by using more FMA instructions, as well as another 5 percent improvement by eliminating some redundancy and rearranging some parts of the code.”
“At scale, a significant portion of the run time is spent transferring particle data between processors. Changing the relevant communication routines by, for example, overlapping more communication with computation, could reduce the total simulation time considerably,” he says.
The research team is grateful for the opportunity to work on a system like Blue Waters, but they are just as grateful for the support they receive from NCSA, and from Burlen Loring at Lawrence Berkeley National Laboratory, who developed data analysis capabilities suited to large data sets.
“The time we have on this machine is really precious and it’s limited, so we really need to get the codes to run as efficiently as possible, and working with the Blue Waters team has been extremely helpful. For doing the massive runs that we do, it’s absolutely critical to have the points of contact we have,” notes Karimabadi.
The holy grail of space plasmas and space physics would be to figure out how to model magnetic reconnection, which we now know is strongly shaped by electron kinetic effects, in a global code where those effects cannot be resolved. Capturing the essence of magnetic reconnection in this way is the key to precise prediction of space weather, and Karimabadi's team is starting to see that breakthrough form in their future.
In fact, their work with NCSA and Blue Waters has already produced a major revolution for their discipline, says Karimabadi. In a fluid model there are no particles, so there is nothing to bounce off or deflect and nothing to generate waves; a lot of important information is lost.
“There have long been speculations about how the electron and ion kinetic effects affect details of the reconnection process and its global consequences. Many of the ideas can now be tested and new ideas and questions formulated to make further progress,” Karimabadi continues. “Now that we have been able to go beyond fluid models, which ignore details on small scales, we have uncovered new and unexpected effects and are finding ample evidence that physical processes occurring on small scales have global consequences.”
These simulations provide details that can be directly compared with spacecraft measurements.
“If you put the results from a global fluid simulation next to one from a global kinetic simulation, it’s almost like you have always had really poor eyesight and someone gave you glasses for the first time. You start to see a lot of details that were completely absent in the previous simulations,” explains Karimabadi. “You see so much structure, there is so much detail that it’s just amazing, and for the first time we can track the global consequences of such details.”
Blue Waters is supported by the National Science Foundation through awards ACI-0725070 and ACI-1238993.