A direct bridge

April 17, 2014

by Barbara Jewett

An algorithm developed by University of Arkansas engineers aids both computational scientists and experimentalists exploring the atomic-scale structure of materials.

The structure of a material influences every property of that material, says engineering professor Douglas Spearot. And it is the structural details at the atomic level that can have the greatest influence but about which we often know the least.

That's why Spearot, who teaches materials science and mechanics in the Department of Mechanical Engineering at the University of Arkansas and also conducts research in the university's Institute for Nanoscale Materials Science and Engineering, and PhD student Shawn Coleman explored new techniques for learning about the atomic structure of materials in order to better understand a material's properties.

The results? A unique algorithm, integrated with the LAMMPS simulation package, that uses simulation data to produce visualizations of X-ray diffraction line profiles and selected-area electron diffraction patterns from atomistic simulations, directly comparable to what experimentalists would observe in a lab. Their work was first published in the journal Modelling and Simulation in Materials Science and Engineering, and more recently in the March issue of JOM, the journal of The Minerals, Metals & Materials Society.

Spearot says the true utility of simulation "is that we can often do things with a resolution that experiments can't do, or we can study certain boundary conditions that experiments can't do very easily."

He and Coleman are quick to point out that this success would not have been possible without the expertise of NCSA visualization expert Mark Van Moer and research scientist Sudhakar Pamidighantam, and the compute resources and additional expertise available to them through the Extreme Science and Engineering Discovery Environment (XSEDE).

Bridging two worlds

Two commonly used experimental techniques for studying the structure of materials are X-ray diffraction and electron diffraction, explains Spearot. The experimentalist shines a beam of X-rays or a beam of electrons onto a sample. Because the sample has, in theory, a three-dimensional repeating periodic structure, its interaction with the beam produces a diffraction pattern. From the details of that pattern, says Spearot, it is possible to identify the structure of the material, its lattice constants, and averaged measures of the defect content within it.

"When people do simulations they have a whole different toolkit of ways they can analyze their simulation. And the problem is that the tools the simulation people have are not the same tools the experimentalists have," says Spearot. "Most atomistic simulation tools are not compatible with what experimentalists have in their toolbox, making it very difficult for researchers to make a 1:1 validation of the properties you are modeling. So what we have done is develop an algorithm that produces a visualization of an electron diffraction pattern or an X-ray diffraction profile that is directly comparable to what experimentalists would observe. We developed a tool in the simulation world that can make a direct bridge to the experimental world."
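To make the bridge concrete, the kinematic calculation at the heart of virtual diffraction can be sketched in a few lines. In the Python sketch below (an illustration under simplifying assumptions, not the published algorithm), the periodic simulation cell defines a mesh of reciprocal lattice points, and a structure factor summed over the atom positions gives a relative intensity at each point; the function name, the constant scattering factor f_atom, and the positive-octant mesh are all simplifications introduced here for clarity.

```python
import numpy as np

def virtual_xrd(positions, cell, wavelength=1.5406, n_max=20, f_atom=1.0):
    """Sketch of a kinematic virtual X-ray diffraction line profile.

    positions : (N, 3) Cartesian atom coordinates, in angstroms
    cell      : (3, 3) matrix whose rows are the periodic cell vectors
    f_atom    : stand-in for species-dependent atomic scattering factors
    """
    # Reciprocal lattice vectors of the simulation cell: rows b_i with
    # a_i . b_j = delta_ij (crystallographic convention, no 2*pi).
    B = np.linalg.inv(cell).T

    two_theta, intensity = [], []
    # Scan a mesh of reciprocal lattice points; only the positive octant
    # is covered here, for brevity.
    for m in np.ndindex(n_max, n_max, n_max):
        if m == (0, 0, 0):
            continue
        K = np.array(m) @ B                         # reciprocal lattice point
        s = wavelength * np.linalg.norm(K) / 2.0    # sin(theta) from Bragg's law
        if s >= 1.0:                                # outside the accessible sphere
            continue
        # Kinematic structure factor F(K) = sum_j f_j exp(2*pi*i K.r_j)
        F = f_atom * np.exp(2j * np.pi * (positions @ K)).sum()
        two_theta.append(2.0 * np.degrees(np.arcsin(s)))
        intensity.append(abs(F) ** 2 / len(positions))

    return np.array(two_theta), np.array(intensity)
```

Binning the returned (2-theta, intensity) pairs produces a line profile that can be laid directly over an experimental diffractogram. A full implementation would use tabulated, element-specific scattering factors in place of the constant f_atom, which is presumably how the species input described below enters the calculation.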
Coleman and Spearot applied their tool to low-angle nickel symmetric tilt grain boundaries, nanocrystalline copper samples, and a complex heterogeneous alumina interface. But they say the tool is not specific to metals or to particular crystal structures, calling it truly generic. More importantly, they note, unlike other simulation algorithms theirs requires only a small amount of initial information to work. "All we need is to input the atom position and the species—what element is each atom in our simulation—and the boundary conditions of our simulation. We don't have to give any more information than that. Whereas 30 years ago people had to specify the structure or expected structure of their system because the computational power at that time couldn't do what we are doing now," explains Spearot.

Expert assistance

Contemporary computational power combined with expert assistance allowed Coleman to develop the algorithm and pursue what he terms "ambitious" PhD goals; he will receive his degree this summer. His success was enabled through the XSEDE project. Funded by the National Science Foundation and led by NCSA, XSEDE goes beyond providing powerful hardware to offer the assistance that makes supercomputers easier to use and helps more people use them.

In addition to requesting supercomputing time, Coleman and Spearot requested Extended Collaborative Support Services (ECSS) so they could be paired with XSEDE staff members who are experts in the areas where they needed help. A major such area was developing the workflow to automate the simulation techniques. As one of the developers of the Computational Chemistry Grid (C-Grid), NCSA's Pamidighantam is well versed in workflows. He worked with Coleman and NCSA visualization expert Van Moer to automate Coleman's simulation and visualization techniques, allowing Coleman to launch simulations from his desktop computer and receive the data and visualizations without further interaction. Coleman credits the workflow, and the higher throughput it made possible, with the successful completion of his PhD.

Once a simulation is launched, it runs on Stampede at the Texas Advanced Computing Center in Austin for the atomistic calculations. When those complete, the workflow automatically launches on Gordon at the San Diego Supercomputer Center for the virtual diffraction calculations and the visualizations. "This has helped me complete everything I set out to do and learn as much as I can from the expertise provided by the ECSS team members," says Coleman.

Next steps

A feature of the algorithm is that it is not limited to a small number of cores; it scales out with the size of the problem to hundreds of processing cores. So far, says Coleman, he has applied the algorithm to roughly 15 million atoms and 30 million reciprocal lattice points. He hopes to push those numbers higher as he and Spearot take advantage of additional ECSS expertise to improve the algorithm's scalability.

That additional expertise comes from two sources. Yang Wang at the Pittsburgh Supercomputing Center has been working to improve the parallelization of the algorithm, and thus its scalability. Luis Cueva-Parra, a professor of applied mathematics at Auburn University Montgomery and an XSEDE Campus Champion Fellow, is the other expert. The Campus Champion program provides a source of local and regional high-performance computing and cyberinfrastructure information. Spearot says Cueva-Parra visited the Arkansas campus for several days, working directly with Coleman and him to improve the scalability of their code.
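Because each reciprocal lattice point can be evaluated independently of the others, a calculation like this parallelizes naturally by splitting the reciprocal-space mesh, rather than the atoms, across processors. The mpi4py sketch below illustrates one such decomposition; it is a hypothetical illustration of the general idea, not the team's code, and the input file names are placeholders.

```python
import numpy as np
from mpi4py import MPI  # run with e.g.: mpirun -n 128 python vxrd_mpi.py

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Every rank holds all atom positions; the reciprocal-space mesh (the
# expensive axis -- tens of millions of points in the runs described
# above) is divided round-robin across ranks.
positions = np.load("positions.npy")   # (N, 3), placeholder file name
k_points = np.load("k_points.npy")     # (M, 3), placeholder file name
my_points = k_points[rank::size]

# Each rank evaluates the normalized intensity |F(K)|^2 / N only at its
# own share of reciprocal points.
local_I = np.empty(len(my_points))
for i, K in enumerate(my_points):
    F = np.exp(2j * np.pi * (positions @ K)).sum()
    local_I[i] = abs(F) ** 2 / len(positions)

# Collect the partial results on rank 0, where each intensity would be
# paired with its scattering angle for binning and visualization.
gathered = comm.gather(local_I, root=0)
if rank == 0:
    intensity = np.concatenate(gathered)
    print(f"{intensity.size} reciprocal points computed on {size} ranks")
```

Since the per-point work is independent and the atom array is read-only, adding ranks shortens the wall time roughly in proportion, consistent with the scaling to hundreds of cores described above.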
In addition to increased scalability, the team is seeking experimental collaborators. They are talking with potential collaborators who are capable of making in situ diffraction measurements, to see how predictive their simulations can be when compared with experiments. The next step is to reach out to the research community and let experimentalists know that this simulation tool exists.

"Our tool is focused on the atomic scale or the nanoscale structure of materials," says Spearot. "If we can learn more about that nanoscale structure, we can learn more about properties of materials at the nanoscale, and then hopefully that can build up into a better understanding of materials at the macroscale."