A workhorse retires
09.17.10
After providing millions of compute hours over six and a half years of faithful service, NCSA's Mercury cluster retired at the end of March.
Described by many as "a true workhorse" of a machine, it was used by scientists studying everything from mesoscale thunderstorms to the most minuscule atoms. And reliable it was, up and running about 98 percent of the time. The few times it was down were often owing to circumstances beyond anyone's control.
NCSA's Dan Lapine was responsible for keeping the Mercury cluster operational from its first compute hour to its last. He recalls the time a few years ago when NCSA was working with the San Diego Supercomputer Center, Caltech, and the Southern California Earthquake Center through the TeraGrid program. Supercomputers were sharing data cross-country through a single filesystem, modeling earthquakes and their impact. The sites were connected by a 40 gigabit per second network connection, one of a very few like it in the world.
The only problem? Networks aren't the only things that travel cross-country. Trains do too.
"Twice, not once, twice, we had train derailments in Colorado where the train tracks parallel the lines that do the data, and the train cut the cable. So we had our file systems crash because of a train derailment in Colorado. A really good example of how our supercomputing and its interruptions don't necessarily depend on us," says Lapine.
Mercury was among the first supercomputers based on the Intel Itanium chip. In fact, one of the test clusters that preceded Mercury had included Itanium chips with serial numbers 1 and 2.
"It was a bit of a risk. No one else was doing it at the time," says Lapine. "It allowed us to have more memory available for each machine and a faster clock speed. But a lot more performance every time the computer would do something. ...When we put the system together, we actually were the 15th fastest computer in the world."
But as technology advanced in the supercomputing world and faster systems arrived on the machine room floor, the goals for Mercury changed. "Over time, the need for Mercury to be the most leading edge became less and less and the need for consistency and stability became more and more," says Lapine. "So being able to keep the machine available 98 percent of the time, I'm quite happy with that."
More than 1,000 researchers had accounts on the system, and more than 2 million jobs were run. Jobs for researchers like Julio Facelli, a professor at the University of Utah who used Mercury to predict crystal structures for organic molecules that are frequently used in pharmaceuticals, fertilizers, and explosives. And for Georgia Tech's Marilyn Smith, who relied on Mercury to model the aerodynamic effects of wind turbines. And for the University of Illinois' Klaus Schulten who depended on Mercury for many of his projects, including researching DNA and gene expression.
These researchers are among the hundreds of Mercury users who conducted transformative research and published papers on the results. For instance, Gautam Ghosh of Northwestern University was one of Mercury's first users. He published nine papers in four years based on work conducted on Mercury. The University of Illinois' Roman Boulatov and his team published eight. In fact, hundreds of papers can be traced back to Mercury's compute power. Not a bad legacy for a revolutionary workhorse of a supercomputer.
=====BREAK=====
Searching for a pollution solution
Coal-burning power plants spew toxic mercury into the atmosphere, but a researcher at the University of Arizona aims to better understand mercury reactions in order to develop effective emission controls.
by Trish Barker
The modern age is marked by a nearly insatiable hunger for electricity, and more than half of that power is generated by burning coal. But that electricity is doing more than lighting our homes and driving our computers; its production is also generating toxic mercury. Coal-burning power plants are the largest source of human-generated mercury emissions in the United States.
Scrubbing the hazardous mercury from power plant emissions is not easy. In its elemental form, mercury is not soluble in water and is not readily adsorbed by solids, traits that allow it to elude current techniques for trapping dangerous flue emissions. Recent research, however, has demonstrated that CDEM (a product derived from recycled pulp and paper mill waste) can capture 100 percent of the mercury in flue gases if the mercury is oxidized. Elemental mercury can react with the other components of flue gases, such as chlorine, to form oxidized compounds.
Putting this knowledge to use is complicated by the fact that the mechanisms that change elemental mercury into various oxidized forms are largely unknown. Paul Blowers, an assistant professor of chemical and environmental engineering at the University of Arizona, uses NCSA's two-teraflop IBM p690 cluster to study these reactions in the hopes that a better understanding of them will enable the design of improved strategies for capturing mercury to protect our air, water, and food supplies, and our health.
From power plant to dinner plate
The concentration of mercury in power plant emissions is low, but because of our ceaseless demand for electricity, the total amount of mercury released from power plants mounts, now reaching about 48 tons per year in the United States alone.
[Figure: A potential energy surface for Hg + HCl → HgCl + H, calculated at the QCISD level with the 1992 basis set.]
Power plants are equipped with a variety of devices to control other dangerous emissions, from fabric filters that catch particulate matter to wet and dry scrubbers that absorb sulfur dioxide and other chemicals. "None of the environmental remediation techniques that are typically used work for mercury," Blowers explained.
This mercury pollution can linger in the atmosphere for up to two years before precipitation pulls it down into our rivers, lakes, and oceans. Microorganisms convert some of the elemental mercury into highly toxic methylmercury, the form that is most readily absorbed by living things. Small organisms absorb the methylmercury and are then eaten by animals higher in the food chain. Because mercury can never be eliminated from the body, large animals, particularly large predatory fish, retain all of the mercury contained in a lifetime of meals in a process called bioaccumulation.
At the pinnacle of this food chain, humans sitting down to dine on swordfish, shark, tuna, or salmon can also unwittingly ingest a large dose of mercury. Mercury can damage the nervous system, liver, and kidneys. Developing fetuses are particularly vulnerable, and studies have found that eight percent of American women of childbearing age have unsafe levels of mercury in their blood.
Even a small amount of mercury escaping from a power plant can be hazardous. According to the National Wildlife Federation, as little as 1/70th of a teaspoon of mercury can contaminate a 25-acre lake, rendering all of its fish unsafe for human consumption.
Studying the smokestack
Because of the dangers, the Environmental Protection Agency is proposing the first ever federal regulations on mercury emissions from power plants. The EPA plan calls for a 70 percent reduction in mercury emissions by 2018. Reaching that goal will mean implementing new techniques in the smokestack.
With that in mind, Blowers began to question how mercury interacts with the other components of flue gases, including chlorine, ozone, oxygen, and even soot particles. "What can happen to mercury in the smokestack? Can we maybe drive it into a form that's water soluble?" he wondered.
The questions aren't simple ones to answer. "We don't understand how mercury reacts in the gas phase," Blowers said. "And if you're trying to figure out how fast reactions will happen and what products will be produced, we just can't do that experimentally."
Finding a quantum method
Blowers instead relies on quantum chemistry calculations to try to build a model of how mercury reacts in the superheated environment of a power plant's smokestack.
Quantum chemistry examines the world at the atomic and subatomic level. Given just a system's elements and molecules as a starting point and working with the basic laws of quantum mechanics, it is possible to predict molecular structures, heats of formation, vibrational frequencies, and activation energies. This information then can be used to calculate the reaction's rate constant, reaction rate, and kinetics. However, even a slight error in a predicted activation energy can lead to reaction rates that are off by several orders of magnitude.
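That sensitivity follows from the Arrhenius equation, which relates the rate constant exponentially to the activation energy. A minimal sketch (using a hypothetical barrier, pre-exponential factor, and temperature, not values from Blowers' work) shows how an error of just 2 kcal/mol shifts the predicted rate constant by an order of magnitude:

```python
import math

R = 1.987e-3  # gas constant, kcal/(mol·K)

def arrhenius(A, Ea, T):
    """Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return A * math.exp(-Ea / (R * T))

A = 1.0e13   # hypothetical pre-exponential factor, 1/s
T = 400.0    # K, a rough flue-gas temperature

k_true = arrhenius(A, Ea=20.0, T=T)  # hypothetical activation energy, kcal/mol
k_off  = arrhenius(A, Ea=22.0, T=T)  # same reaction with a 2 kcal/mol error

# The exponential dependence magnifies the small energy error
# into a rate-constant error of more than a factor of ten.
print(k_true / k_off)
```

Because the activation energy sits in the exponent, this ratio grows even larger at lower temperatures, which is why small errors in predicted barriers can put computed reaction rates off by several orders of magnitude.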
At the quantum scale, there are many complex forces and interactions that must be taken into account. All quantum chemistry methods seek a solution to the Schrödinger wave equation for the given molecular system, but even with today's powerful supercomputers the equation is a challenge to solve for a real system. Therefore, scientists use various levels of theory and approximations (such as treating an atom's inner electrons as an averaged potential rather than as individual particles) to simplify the quantum model and reduce the computational costs while still returning useful results.
Blowers' first step, therefore, was to determine which computational method would generate the most accurate results. He began with a reaction for which known experimental data could be compared to theoretically derived rate constants, calculating rate constants for the oxidation of elemental mercury by chlorine atoms using seven combinations of methods and approximations.
He found that the most accurate results, within an order of magnitude, were generated by the combination of the QCISD method and a basis set developed in 1992. The QCISD (quadratic configuration interaction with single and double excitations) method optimizes the structural geometries of a system at a higher level than other methods. The 1992 basis set, which is a set of mathematical functions that are combined to approximate the wavefunctions for electrons, contains more valence electron basis functions than other basis sets.
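To make "a set of mathematical functions" concrete, the sketch below evaluates a contracted Gaussian basis function, a fixed linear combination of simple Gaussian primitives. For illustration it uses the standard published STO-3G exponents and coefficients for a hydrogen 1s orbital, not the 1992 mercury basis set used in this study:

```python
import math

def primitive_norm(alpha):
    """Normalization constant for an s-type Gaussian primitive exp(-alpha*r^2)."""
    return (2.0 * alpha / math.pi) ** 0.75

def contracted_1s(r, exponents, coefficients):
    """Contracted basis function: chi(r) = sum_i c_i * N_i * exp(-alpha_i * r^2)."""
    return sum(c * primitive_norm(a) * math.exp(-a * r * r)
               for a, c in zip(exponents, coefficients))

# Standard STO-3G parameters for the hydrogen 1s orbital
# (exponents in bohr^-2; coefficients are dimensionless contraction weights).
alphas = [3.42525091, 0.62391373, 0.16885540]
coeffs = [0.15432897, 0.53532814, 0.44463454]

# The amplitude is largest at the nucleus and decays with distance.
print(contracted_1s(0.0, alphas, coeffs), contracted_1s(1.0, alphas, coeffs))
```

A "larger" basis set, like the 1992 set described above, simply includes more such functions per electron, giving the wavefunction more flexibility at the cost of more computation.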
Combining both makes for a computationally intensive method, and Blowers has used 45,000 hours of compute time on NCSA's systems over the past four and a half years.
Charting new territory
With the methodology demonstrated (and an article describing it recently published in the journal Fuel Processing Technology), Blowers has gone on to predict rate constants for reactions for which there is no experimental data.
"We've measured or predicted rates for some reactions that no one has ever measured or predicted before," he said.
An article published in Environmental Science and Technology reports on the use of quantum chemistry methods to investigate the reaction in which hydrogen chloride oxidizes mercury, and another study used the technique to estimate the heat of formation for HgO. In the latter case, the heat of formation found by Blowers accorded well with other high-level quantum chemical estimations but was much higher than the experimental values frequently used by other researchers.
"We've given modelers what we think are more accurate rates to put in their models," he said.
Blowers next plans to examine mercury's reactions with O2 and sulfur, and is also considering other aspects of the flue environment.
"One of the things that intrigues me is, what is the soot particle doing to mercury? Experimentally, that is a black hole," he said. It is possible that the smokestack interaction between particulate matter and mercury could provide an opportunity for improved emissions control. "We're still open to the idea that water scrubbing won't be the solution," he said.
As the reactions occurring in flue gases are better understood, scientists will be able to experiment with techniques to steer the reactions in a way that enables the capture of mercury and safeguards the environment.
This research is funded by the U.S. Environmental Protection Agency.