NCSA Reflects on 35 Years as a Supercomputing Powerhouse
February 2, 2022 | By Dina Meek

1986 was not an easy year. It started with the explosion of the U.S. space shuttle Challenger just 73 seconds after launch. Later, another explosion, this time at the Chernobyl nuclear power station in the former Soviet Union, resulted in the worst nuclear power disaster in human history. But 1986 did have some bright spots. IBM introduced the world’s first laptop computer. Author and humanitarian Elie Wiesel won the Nobel Peace Prize. And the National Science Foundation established the National Center for Supercomputing Applications as part of its Supercomputer Centers Program.

Three years earlier, in 1983, the famine of supercomputing power for U.S. academic researchers inspired University of Illinois Urbana-Champaign astrophysicist Larry Smarr and seven colleagues to submit an unsolicited proposal to the National Science Foundation, asking for funding for a supercomputer center at UIUC. (Until that point, Smarr did his astrophysical computations at Lawrence Livermore National Laboratory or by traveling to Germany.) In 1984, NSF established its Office of Advanced Scientific Computing, which, after soliciting proposals, funded five supercomputing centers across the country, including NCSA. Smarr was named its first director, with a mission to help enable interdisciplinary research and propel science toward its next discovery.

1986–1995: Laying the Foundation

In its first decade, founding director Larry Smarr led the effort to create and establish NCSA to support computationally driven science, leveraging a diverse set of emerging supercomputer architectures. One of the first tasks was simply to get people connected to the machines over the fledgling internet.
NCSA’s response was to form its Software Development Group (SDG), which created widely used software, starting with NCSA Telnet. Telnet let PC and Mac users “hide the Cray,” making NCSA’s Cray supercomputer appear to be just another window on their local machine.

The “number explosion” that supercomputing brought on, the sheer volume of data it created, meant that ways of interpreting that data had to advance as well. As former NCSA Chief Science Officer Bob Wilhelmson remembered in an earlier interview, “I would create pictures on paper from the data, and I would literally, using masking tape, place them on a wall to get an idea of how things changed over time. Because at that time, you could only look at a single piece of information.” But with the advance of supercomputers, animations could be created on computer graphics workstations. Soon Wilhelmson’s supercomputer thunderstorm visualizations became an icon for what would be termed “scientific visualization,” which, Smarr says, “had a major influence on two decades of computer graphics used in movies.” In fact, when NCSA hired Donna Cox, her legendary visualization work led to an Academy Award nomination and paved the way for women in science, technology, engineering, arts and mathematics.

Supercomputing power didn’t benefit only academics and Hollywood. In 1987, NCSA established its Industrial Program (now known as the Industry Partners Program), which helped bridge the gap between supercomputing and the corporate world. Pharmaceutical giant Eli Lilly and Company was quick to take advantage, using high-performance computing to accelerate the discovery of new drugs. A diverse set of other companies, including Caterpillar, JP Morgan, Kodak, Motorola, Sears and Shell Oil, would soon follow.

Last but certainly not least, during its first decade NCSA was critical in helping build the graphical web browser.
While browsers weren’t new, NCSA Mosaic, written by Marc Andreessen and Eric Bina in NCSA’s Software Development Group, brought the experience to life with integrated pictures, movies and other elements, making it much easier to develop and interact with online content. It was also famously user-friendly. A GQ magazine article at the time said, “The initial version of Mosaic drew on the innovations of earlier browsers, which already included many of the features aimed at making the software easy and appealing for non-geeks, features that would later become staples of the genre, such as icon buttons (back, forward, home), bookmarks…and a variety of attractive fonts and typefaces.”

“NCSA Mosaic and its development team led directly to today’s World Wide Web browsers and servers,” Smarr says. In fact, NCSA also developed the standard web server software, HTTPd, that would ultimately lead to the popular Apache web server. Mosaic was so popular that an experimental supercomputer, a cluster of machines built to appear as one, had to be designed just to help the servers handle all the remote-user interactions.

Smarr’s most lasting memory from that first groundbreaking decade in which he helped build NCSA is its “success in forming collaborative teams, which included artists, computer scientists and application researchers.” It’s a legacy that continues to this day at the center.

1996–2005: Clusters and Collaborations

As the center’s second decade opened, John Towns, current NCSA executive associate director of engagement, remembers changes in leadership as well as in how NCSA received funding. While the center, like others of its kind, continued to receive large block grants, this era saw funding transition to also include more targeted awards of smaller amounts.
In the same vein, the center pioneered building commodity compute clusters out of low-cost, low-performance computers working in parallel, accomplishing what had previously required large monolithic computer systems that offered greater performance but at a much higher cost. “NCSA led the development of an early production computational cluster: Platinum,” Towns says. This type of cluster, a prototype for the commercial compute clusters to come, is still a staple in high-performance computing and, Towns opines, was the most significant impact on supercomputing during this period.

NCSA had other important achievements as a new millennium dawned. The center also “led early investigations into reconfigurable computing and accelerators – the use of GPUs,” or graphics processing units, a practice known at the time as GPGPU computing. An NCSA team even developed a cluster using Sony’s PlayStation® 2 (PS2) game consoles, a feat so intriguing it became the subject of popular discussion both within the supercomputing community and well beyond.

In 1997, NCSA led the formation of the National Computational Science Alliance, funded by NSF’s Partnerships for Advanced Computational Infrastructure (PACI) program. The Alliance connected researchers at dozens of national institutions and private-sector organizations working to identify emerging trends in global information infrastructure and how those developments could be harnessed to advance research.

With collaboration emerging as a major theme, this decade also saw the birth of TeraGrid, a tightly integrated yet distributed information infrastructure giving scientists and industry researchers across the country access to new capabilities with which to solve scientific problems.
As one of the major institutions funded under this NSF project, NCSA became a leading partner and innovator in the largest, most comprehensive infrastructure ever deployed for scientific research at that time, encompassing the world’s fastest unclassified supercomputers, software, ultra-high-speed networks, high-resolution visualization environments and toolkits for grid computing.

2006–2015: The Great Expansion

“This decade was a time of great growth and change for NCSA. It saw the transition from the TeraGrid project to XSEDE, as well as the building of the National Petascale Computing Facility at UIUC and the deployment of Blue Waters and iForge, the center’s first Industry cluster. It was a very special time to be at NCSA,” says Amy Schuele, current NCSA associate director of integrated cyberinfrastructure.

Indeed, as Schuele so succinctly remembers, the center continued to build on its success not only in building supercomputers but also in leading important collaborations. In 2008, NSF introduced the eXtreme Digital program as a follow-on to the very successful TeraGrid project. Towns led a team of collaborators spanning 18 institutions over a three-year period in developing and ultimately winning the competition for this funding, which created XSEDE, the Extreme Science and Engineering Discovery Environment, in July 2011. This virtual organization, currently funded into 2022, integrates and coordinates the sharing of supercomputers, high-end visualization and data analysis resources with researchers around the country.

“XSEDE built on the success of TeraGrid and similar projects and developed a national ecosystem, creating a destination for researchers to go to find solutions that enabled research that had never been done before,” Schuele explains. The development of XSEDE was of particular importance to her work.
“This was the decade that saw the beginnings of XRAS, [XSEDE’s] Resource Allocations as a Service tool, developed by XSEDE, that manages the submission, review and administration of resource requests,” she says. “It has since been adopted by several other organizations, including NCSA.”

But another big story developed at NCSA during this era: the building of Blue Waters. Again expanding on its past, in particular the investigation into the use of GPUs for scientific research, in 2007 NSF approved a $208-million grant to fund the creation of the world’s most powerful leadership-class supercomputer, Blue Waters. Capable of processing more than 13 quadrillion calculations per second, Blue Waters was in continuous operation from 2013 onward, providing more than 39 billion core hours to scientists and engineers around the world. NCSA built the 88,000-square-foot National Petascale Computing Facility just to house it.

“Blue Waters really pushed the boundaries of the sizes of systems that could be built. The power and liquid-cooling infrastructure for the building, the sheer size and scale of the compute and storage, the data-transfer ecosystem throughout, and a novel idea to interface with the science users over chat sessions were all designed specifically for the system,” says Michelle Butler, senior assistant director of NCSA’s Vera C. Rubin Observatory Project (LSST).

Twenty-plus years into its life, scientists weren’t the only ones taking notice of NCSA’s capabilities and achievements. As a leader in visualization, the center was making science more accessible and understandable for everyone. Researchers from UIUC collaborated with visualization experts at NCSA to shed light on how the most violent tornadoes form, creating animations that revealed the inner behavior of tornado-producing storms.
In 2004, one of the first broadly distributed visualizations, “Hunt for the Supertwister,” was showcased in an episode of the PBS TV series NOVA. And in 2010, NCSA was ready for its close-up with the debut of “Hubble 3D.” Narrated by Leonardo DiCaprio, the documentary film takes viewers through distant galaxies as it tells the story of the repair and upgrade of the Hubble telescope. Nearly a quarter of the movie’s run time features dramatic voyages realized by NCSA’s Advanced Visualization Lab, which used real Hubble, astronomical and computational data to bring this space journey to life. The film even received an Oscar® nomination.

2016–2021: Preparing for the Decades to Come

Of the many important scientific discoveries Blue Waters facilitated, mapping the Earth is among the most unexpected, with the potential for the greatest societal impact. Made possible by the ongoing support of the National Science Foundation and $29 million in supplemental funding from the National Geospatial-Intelligence Agency, the global digital elevation modeling (DEM) projects made Blue Waters the most powerful dedicated geospatial system in the world. A collaboration between Illinois, the University of Minnesota and the Ohio State University, these DEM global-mapping projects leveraged Blue Waters’ unprecedented speed and efficiency, fundamentally changing the way humans view the Earth and how it’s changing.

Using DEM data, in 2021 Blue Waters and the NCSA Advanced Visualization Lab created a general-public-focused dome and ultra-high-definition documentary called “Atlas of a Changing Earth,” showing how the surface of the Earth is changing over time in a way the public can engage with.
With Blue Waters’ retirement in December 2021, current NCSA Director William “Bill” Gropp honored the accomplishments of the system and the expert team behind it.

“Blue Waters leaves an impressive legacy of accomplishments throughout the science, engineering and research communities. Just some of the research areas that saw groundbreaking accomplishments from Blue Waters covered genomics, contagious viruses such as HIV, Zika, and influenza; understanding and predicting black holes and gravity waves; the evolution of galaxies, supernovae, and the universe; political gerrymandering; tornadoes and severe weather; climate change and space weather; earthquakes and volcanoes; understanding photosynthesis and crop yields; satellite-based tree census; mapping and measuring the Earth; and artificial intelligence.”

The retirement of the venerable Blue Waters system also ushers in the dawn of a new era in supercomputing as NCSA launches its new Delta supercomputer. The system balances cutting-edge GPU and CPU architectures that will shape the future of advanced research computing. Funded by NSF’s Advanced Computing Systems & Services Program, at its launch Delta will be the most performant GPU-computing resource in NSF’s portfolio.
In a period largely marked by a global pandemic, NCSA stepped up, applying its resources and expertise to address this unique grand challenge. “Our work developing a laboratory information management system (LIMS) for the university’s COVID-19 SHIELD Test program, as well as supporting the modeling and analysis, helped keep the campus and state safe,” Gropp says. “It was a perfect example of our focus on that last word in our name – Applications.”

Indeed, NCSA has always been about more than its hardware, with a software portfolio of nearly 30 applications, including IN-CORE, which explores how different communities might recover from the impact of natural disasters.

Today, Gropp acknowledges how different NCSA is from its beginnings: “NCSA is a key partner and leader in a much wider range of research areas in advanced computing, with funding from more agencies and industries.” From a single NSF grant to dozens from numerous funding agencies, and with a premier Industry program that has grown up along with NCSA, the center is a unique organization ready to take on the next 35 years.