
Black hole research featuring simulations from the Blue Waters supercomputer published in Nature


New research based on simulations run on the Blue Waters supercomputer at the National Center for Supercomputing Applications (NCSA) at the University of Illinois reveals that when galaxies assemble extremely rapidly, and sometimes violently, the process can lead to the formation of very massive black holes. In these rare galaxies, normal star formation is disrupted and black hole formation takes over.

“We on the Blue Waters Project are very excited about this accomplishment and very pleased that Blue Waters, with its unique capabilities, once again enabled science that was not feasible on any other system,” said Bill Kramer, the Blue Waters Principal Investigator and Director. “We look forward to helping continue these great research efforts.”

This new study, published in Nature and supported by funding from the National Science Foundation, the European Union, and NASA, is led by researchers from the Georgia Institute of Technology, Dublin City University, Michigan State University, the University of California at San Diego, the San Diego Supercomputer Center, and IBM. Their research finds that massive black holes form in dense, starless regions that are growing rapidly, overturning the long-accepted belief that massive black hole formation is limited to regions bombarded by the powerful radiation of nearby galaxies. The study also concludes that massive black holes are much more common in the universe than previously thought.

The research was based on the Renaissance Simulation suite, a 70-terabyte data set created on the Blue Waters supercomputer and visualized for documentaries and the World Wide Web by NCSA’s Advanced Visualization Laboratory between 2011 and 2014 to help scientists understand how the universe evolved during its early years. “The formation of massive black holes is a rarity in the universe, requiring a simulation of a large cosmological volume while resolving the smallest scales around the black hole,” said John Wise, an associate professor in the Center for Relativistic Astrophysics at Georgia Tech and the paper’s corresponding author. “The tremendous capabilities of Blue Waters enabled us to run a simulation with a large dynamical range while probing thousands of galaxies. Our discovery would not have been possible without Blue Waters.”

To learn more about specific regions where massive black holes were likely to develop, the researchers examined the simulation data and found ten specific dark matter halos that should have formed stars given their masses but only contained a dense gas cloud.

“Blue Waters was crucial to the success of this project. At the time we ran the simulations, Blue Waters was the only fully open research supercomputer in the U.S. that had the capability to execute these calculations—it has large memory and a fast interconnect, and also an extremely fast file system. All of those things are critical to huge, multi-physics simulations like the Renaissance Simulations, and were very important to executing the simulation and also in our analysis of the resulting huge volumes of data,” said Brian O’Shea, a professor at Michigan State University and Principal Investigator of the Blue Waters allocation used for the calculations of this work.

The Renaissance Simulations are the most comprehensive simulations of the earliest stages of the gravitational assembly of pristine gas (hydrogen and helium) and cold dark matter, leading to the formation of the first stars and galaxies. They use a technique known as adaptive mesh refinement to zoom in on dense gas clumps forming stars or black holes. In addition, they cover a large enough region of the early universe to form thousands of objects, a requirement if one is interested in rare objects, as is the case here.
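The sketch below is only a toy illustration of the adaptive mesh refinement idea, not the production code behind the Renaissance Simulations; the cell structure, mass threshold, and refinement rule are simplified assumptions chosen for the example. The core concept is the same, though: cells containing too much gas mass are split into finer child cells, so dense clumps get resolved at high resolution while empty regions stay coarse.

```python
# Toy illustration of adaptive mesh refinement (AMR); not the Renaissance
# Simulations' actual code. A cell whose gas mass exceeds a threshold is
# subdivided into 2x2x2 finer child cells, recursively, up to a depth limit.
from dataclasses import dataclass, field

@dataclass
class Cell:
    density: float      # gas density in the cell (arbitrary units)
    size: float         # cell edge length (arbitrary units)
    children: list = field(default_factory=list)

    @property
    def mass(self) -> float:
        return self.density * self.size ** 3

def refine(cell: Cell, mass_threshold: float, max_levels: int) -> None:
    """Recursively subdivide cells whose gas mass exceeds the threshold."""
    if max_levels == 0 or cell.mass <= mass_threshold:
        return
    # In a real code the child densities come from the hydrodynamics solver;
    # here we simply copy the parent value for illustration.
    cell.children = [Cell(cell.density, cell.size / 2) for _ in range(8)]
    for child in cell.children:
        refine(child, mass_threshold, max_levels - 1)

# A dense cell gets refined; a diffuse one stays coarse.
dense = Cell(density=100.0, size=1.0)
diffuse = Cell(density=0.1, size=1.0)
refine(dense, mass_threshold=10.0, max_levels=3)
refine(diffuse, mass_threshold=10.0, max_levels=3)
print(len(dense.children), len(diffuse.children))  # 8 0
```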

The improved resolution of the simulation done for two candidate regions allowed the scientists to see turbulence and the inflow of gas and clumps of matter forming as the black hole precursors began to condense and spin. Their growth rate was dramatic.

“Astronomers observe supermassive black holes that have grown to a billion solar masses in 800 million years,” Wise said. “Doing that required an intense convergence of mass in that region. You would expect that in regions where galaxies were forming at very early times.”
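To get a rough sense of why that growth rate is so demanding, a standard back-of-the-envelope estimate (not a calculation from the paper) is useful: a black hole accreting continuously at the Eddington limit with roughly 10 percent radiative efficiency grows exponentially with an e-folding time of about 50 million years, so 800 million years allows only about 16 e-foldings, a growth factor of roughly 10^7. Reaching a billion solar masses therefore requires a seed of order 100 solar masses accreting essentially without interruption, or a far more massive seed, which is why such an intense convergence of mass is needed.

```latex
% Standard Eddington-limited growth estimate (textbook figures, not from the paper).
M(t) = M_{\mathrm{seed}}\,\exp\!\left(\frac{t}{t_{\mathrm{e}}}\right),
\qquad
t_{\mathrm{e}} \simeq \frac{\epsilon}{1-\epsilon}\,\frac{\sigma_T\,c}{4\pi G m_p}
\approx 50~\mathrm{Myr} \quad (\epsilon \approx 0.1).
% With t = 800 Myr: t/t_e ~ 16 e-foldings, i.e. a growth factor of e^{16} ~ 10^7,
% so a seed of order 100 solar masses must accrete at the Eddington rate
% essentially continuously to reach 10^9 solar masses.
```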

Another aspect of the research is that the halos that give birth to black holes may be more common than previously believed.

“An exciting component of this work is the discovery that these types of halos, though rare, may be common enough,” said O’Shea. “We predict that this scenario would happen enough to be the origin of the most massive black holes that are observed, both early in the universe and in galaxies at the present day.”

Future work with these simulations will look at the life cycle of the galaxies that form these massive black holes, studying the formation, growth, and evolution of the first massive black holes across time.

For these new answers, the research team—and others—may return to the simulations.

“The Renaissance Simulations are sufficiently rich that other discoveries can be made using data already computed,” said Mike Norman, Director of the San Diego Supercomputer Center (SDSC). “For this reason, we have created a public archive at SDSC called the Renaissance Simulations Laboratory (RSL), where others can pursue questions of their own.” NCSA’s Matt Turk and Kacper Kowalik are co-investigators on the Renaissance Simulations Laboratory, along with Norman and Britton Smith of SDSC.

Kowalik is the architect of the lab, which is run in collaboration between SDSC and NCSA. NCSA developed the software, while SDSC provided the hardware, in this case a server. The software makes the simulation data easier for anyone to access and analyze. “The Renaissance Simulation data is large, and the RSL allows you to work with that data on site, without having access to a supercomputer; you can access it by just using a laptop,” said Kowalik. “It’s very easy to access and analyze, and it allows you to verify results by accessing this data.”
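As a rough illustration of what laptop-scale analysis of simulation data like this can look like, the snippet below uses the open-source yt analysis package. The dataset path is hypothetical, and the actual file layout and access method of the Renaissance Simulations Laboratory may differ; this is a sketch of the general workflow, not the RSL’s documented interface.

```python
# Minimal sketch of laptop-scale analysis with the open-source yt package.
# The dataset path below is hypothetical; the Renaissance Simulations
# Laboratory's actual data layout and access method may differ.
import yt

# Load one simulation output (an AMR dataset) from a local or mounted path.
ds = yt.load("RenaissanceSims/normal_region/RD0013/RedshiftOutput0013")

# Quick look at the extremes of the gas density field.
ad = ds.all_data()
print(ad.quantities.extrema(("gas", "density")))

# Project gas density along the z-axis and save an image.
p = yt.ProjectionPlot(ds, "z", ("gas", "density"))
p.save("density_projection.png")
```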

Turk worked on the joint project to develop software, including interactive tutorials built at NCSA, to help more people learn about the data. “I would like to see people explore the data in new discovery space and generate new, or utilize existing, data products,” Turk said. “There are different ways that you may want to interpret that data to put into models, and there are new questions that could be asked.”

This research was supported by the National Science Foundation through grants PHY-1430152, AST-1514700, AST-161433, and OAC-1835213, by NASA grants NNX12AC98G, NNX15AP39G, and NNX17AG23G, and by Hubble theory grants HST-AR-13261.01, HST-AR-14315.001, and HST-AR-14326. This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 699941 (Marie Skłodowska-Curie Actions – “SmartStars”). The simulation was performed on the Blue Waters supercomputer operated by the National Center for Supercomputing Applications (NCSA) with PRAC allocation support by the NSF (awards ACI-0832662, ACI-1238993, and ACI-1514580). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the sponsor organizations.

About Blue Waters

The Blue Waters petascale supercomputer is one of the most powerful supercomputers in the world, and is the fastest sustained supercomputer on a university campus. Blue Waters uses hundreds of thousands of computational cores to achieve peak performance of more than 13 quadrillion calculations per second. Blue Waters has more memory and faster data storage than any other open system in the world. Scientists and engineers across the country use the computing and data power of Blue Waters to tackle a wide range of challenges. Recent advances that were not possible without these resources include computationally designing the first set of antibody prototypes to detect the Ebola virus, simulating the HIV capsid, visualizing the formation of the first galaxies and exploding stars, and understanding how the layout of a city can impact supercell thunderstorms.

About NCSA

The National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign provides supercomputing and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students, and collaborators from around the globe use advanced digital resources to address research grand challenges for the benefit of science and society. NCSA has been advancing one third of the Fortune 50® for more than 30 years by bringing industry, researchers, and students together to solve grand challenges at rapid speed and scale.

About the NSF

The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2017, its budget is $7.5 billion. NSF funds reach all 50 states through grants to nearly 2,000 colleges, universities and other institutions. Each year, NSF receives more than 48,000 competitive proposals for funding and makes about 12,000 new funding awards. For more information, visit www.nsf.gov.

