
Really big problems


by Trish Barker

“We’re driven by solving really big problems,” says Greg Voth, the Haig P. Papazian Distinguished Service Professor of Chemistry at the University of Chicago. His research team uses multiscale computational simulation to study complex biomolecular, condensed phase, and novel materials systems. But these systems and processes are so complex that they are beyond the reach of molecular dynamics simulations that model every atom—even the largest simulations, containing tens of millions of atoms, can show only a fraction of a process in a living cell, for example. To simulate processes over greater time and length scales would require a new method.

Voth decided to employ coarse-grained (CG) models in which atoms are grouped, rather than being considered individually. While coarse-grained studies are often used to quickly analyze systems on limited computing resources, Voth’s idea was to instead scale the coarse-grained simulations to run on the largest computing resource available: NCSA’s petascale Blue Waters.
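To give a concrete sense of what that grouping means, the sketch below collapses clusters of atoms into single coarse-grained “beads” placed at each cluster’s center of mass. This is a minimal illustration of the general idea only, not the Voth group’s actual mapping; the function name coarse_grain and the random test data are invented for this example.

```python
import numpy as np

def coarse_grain(atom_positions, atom_masses, bead_assignments, n_beads):
    """Collapse groups of atoms into CG beads at their centers of mass.

    bead_assignments[i] gives the bead index for atom i. A simplified
    sketch of the general idea, not the Voth group's actual models.
    """
    bead_pos = np.zeros((n_beads, 3))
    bead_mass = np.zeros(n_beads)
    # Accumulate total mass and mass-weighted positions per bead.
    np.add.at(bead_mass, bead_assignments, atom_masses)
    np.add.at(bead_pos, bead_assignments,
              atom_masses[:, None] * atom_positions)
    bead_pos /= bead_mass[:, None]   # center of mass of each group
    return bead_pos, bead_mass

# Example: 12 atoms mapped onto 3 beads of 4 atoms each.
rng = np.random.default_rng(1)
atoms = rng.uniform(0.0, 10.0, size=(12, 3))
masses = rng.uniform(1.0, 16.0, size=12)
assign = np.repeat(np.arange(3), 4)      # atoms 0-3 -> bead 0, etc.
beads, bead_masses = coarse_grain(atoms, masses, assign, n_beads=3)
print(beads.shape, bead_masses)          # (3, 3) and the three bead masses
```

Replacing, say, ten atoms with one bead cuts the particle count tenfold, which is what lets coarse-grained simulations reach much larger length and time scales than all-atom models.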

“What we did was decide not only to do the coarse-grained but also to combine it with extreme-scale computing and push the limits of what people can do,” Voth says.

To accomplish this, postdoctoral researcher John Grime developed an entirely new coarse-grained molecular dynamics code specifically designed for extreme-scale computing systems that harness the power of hundreds of thousands of processors.

“We weren’t following any existing prescription; we weren’t copying something. There was no textbook answer,” says Grime. “It was liberating.”

Such highly coarse-grained models produce a heterogeneous distribution of mass, which makes it hard to divide the computational work evenly among processors. Grime hit upon the idea of adapting a method used in astrophysics simulations, which must also contend with such non-uniformity: Hilbert space-filling curves, which order regions of space along a single curve so that nearby regions stay close together.
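The sketch below shows one way a Hilbert curve can be used for this kind of load balancing. It is a simplified two-dimensional illustration under assumed conventions, not Grime’s production code; the helper names hilbert_index and partition_particles are invented here. Cells are ordered along the curve, particles are sorted by their cell’s curve index, and the sorted list is cut into equal-count chunks, so each processor gets a compact patch of space with comparable work even when mass is clumped.

```python
import numpy as np

def hilbert_index(order, x, y):
    """Classic xy2d mapping: 2D cell coordinates -> 1D Hilbert-curve index."""
    d = 0
    s = 1 << (order - 1)
    while s > 0:
        rx = 1 if (x & s) > 0 else 0
        ry = 1 if (y & s) > 0 else 0
        d += s * s * ((3 * rx) ^ ry)
        # Rotate the quadrant so the curve stays continuous.
        if ry == 0:
            if rx == 1:
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x
        s >>= 1
    return d

def partition_particles(positions, box, order, n_ranks):
    """Assign particles to ranks by sorting them along a Hilbert curve,
    then cutting the sorted list into equal-count chunks (a sketch)."""
    n_cells = 1 << order
    # Bin each particle into a grid cell.
    cells = np.minimum((positions / box * n_cells).astype(int), n_cells - 1)
    keys = np.array([hilbert_index(order, int(cx), int(cy))
                     for cx, cy in cells])
    ranking = np.argsort(keys)                 # spatial order along the curve
    chunks = np.array_split(ranking, n_ranks)  # equal particle counts per rank
    owner = np.empty(len(positions), dtype=int)
    for rank, idx in enumerate(chunks):
        owner[idx] = rank
    return owner

# Example: a clumpy (heterogeneous) particle distribution in a 100x100 box.
rng = np.random.default_rng(0)
dense = rng.normal(25.0, 5.0, size=(900, 2)) % 100.0   # a dense cluster
sparse = rng.uniform(0.0, 100.0, size=(100, 2))        # a diffuse background
pos = np.vstack([dense, sparse])
owner = partition_particles(pos, box=100.0, order=6, n_ranks=8)
print(np.bincount(owner))   # each rank gets 125 particles despite the clumping
```

Because the curve visits spatially adjacent cells consecutively, the equal-count chunks remain spatially compact, which helps keep communication between processors local.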

Grime’s software development work was supported by funding from the NCSA/Illinois Enhanced Intellectual Services for Petascale Performance (NEIS-P2) program, a supplement to the Blue Waters project. Voth says this support was “absolutely critical.”

“So often these computer allocations come with the assumption that you can take the funding you already have and make the people who are supposed to be doing science into software engineers,” he says. In this case “they gave us real money to pay brilliant people like John to invest in this effort. I think that should become more of a model for these supercomputer allocations.”

Also critical were the efforts of the Blue Waters consulting staff, particularly the work of senior research programmer Robert Brunner.

“Robert was pretty crucial in getting answers to technical questions. He was invaluable,” says Grime. “The big picture algorithmic stuff doesn’t come to anything unless you know the technical details of the hardware you’re running on.”

The team performed test simulations on Blue Waters: 1,000 timesteps of a lipid bilayer membrane accompanied by two bilayer membrane vesicles. This type of system would be relevant to, for example, modeling virus particles of HIV in the vicinity of a cell membrane.

They achieved good scaling to several hundred thousand processors. The results were reported in the Journal of Chemical Theory and Computation in November 2013. In that article, Voth and Grime wrote that “simulations indicate that our approach can offer significant advantages over conventional MD techniques, and should enable new classes of CG-MD systems to be investigated.” While more detailed atomistic molecular dynamics simulations can stretch to include tens of millions of atoms over tens of nanoseconds, the new coarse-grained method could be used to simulate processes that take minutes.

Voth’s team is now using Blue Waters to apply this coarse-grained method to complex biomolecular processes, including HIV replication and the assembly of the HIV capsid. And they foresee many other applications down the road.

“The idea from the outset was to address processes that people didn’t even conceive of before,” Voth says.

This material is based upon work supported by the National Science Foundation through the Center for Multiscale Theory and Simulation (NSF grant CHE-1136709), the Blue Waters sustained-petascale computing project (NSF grant OCI-0725070), and the “Petascale Multiscale Simulations of Biomolecular Systems” NSF PRAC allocation (NSF grant OCI-1036184).

For more information

http://vothgroup.uchicago.edu/

Citation

J. M. A. Grime and G. A. Voth, “Highly Scalable and Memory Efficient Ultra-Coarse-Grained Molecular Dynamics Simulations,” J. Chem. Theory Comput., DOI: 10.1021/ct400727q

