User reflections: Michael Norman


Preparations for observing NCSA’s 25th anniversary this year revealed that several people who computed at NCSA in 1986 are still active users today. We invited them to share their thoughts on advances in high-performance computing.

Michael Norman, University of California, San Diego

[Editor’s note: Michael Norman, director of the San Diego Supercomputer Center, was, until 2000, an astronomy professor at the University of Illinois and an associate director and senior research scientist at NCSA.]

Q. What were you studying when you used NCSA’s first supercomputer, the Cray X-MP? What new results came from its use? What other NCSA capabilities helped you advance your science?

A. I was studying the structure and dynamics of extragalactic jets using 2D simulations of supersonic gas dynamics. Using the X-MP, I was able to simulate considerably larger domains at high resolution and examine non-axisymmetric (firehose) instabilities. I worked with Donna Cox’s visualization group to make striking animations of the instabilities, which elucidated the dynamics in a beautiful and informative way.

Q. In what ways has access to supercomputers at NCSA advanced your science since then?

A. Countless ways. Between 1986 and 2000, when I left NCSA, I used everything NCSA put on the floor to increase the physics fidelity of my extragalactic jet simulations. The Cray-2 and Connection Machine-2 allowed me to move to 3D simulations and eventually to incorporate magnetic fields. The Connection Machine-5 deployed in 1992 launched my research on cosmological structure formation, which continues to this day. The SGI clusters in the mid-to-late 1990s provided an ideal platform for developing the Enzo adaptive mesh refinement code for hydrodynamic cosmology, which is my main research tool today. Since 2000 I have used many NCSA resources to study galaxy clusters, the intergalactic medium, the formation of the first stars, and interstellar turbulence.

Q. In what areas do you envision advances in supercomputing capability having the most impact in your research during the coming 5-10 years?

A. Blue Waters will take my numerical cosmology research to the next level. Multiphysics AMR simulations of star and galaxy formation are very compute intensive; the speed and number of POWER7 processors that will be available will enable high-fidelity simulations of the formation of the first galaxies and reionization. We are already drowning in the data volumes that current-generation supercomputers produce. Data-intensive computer architectures with large amounts of shared memory and flash SSDs will be ideal platforms for data analytics. HPC systems with more productive parallel programming languages and tools would also have a big impact.

Q. Is there anything supercomputers enabled you to do 20-25 years ago that you thought was really ‘wow’ and cutting-edge at the time, but that you now look back on and smile at how ‘high tech’ you thought you were?

A. Yes. The transition from NCSA’s Cray X-MP to the Cray-2 was a “wow” moment for me. Its large memory (128 MW!) allowed me to move from 2D to 3D simulations of extragalactic radio jets: from cartoonish flatland models to astrophysically realistic-looking models. The Cray-2 stimulated the development of the ZEUS-3D code in my lab from 1989 to 1992. ZEUS-3D went on to be the first open-source community code in astrophysics, and it is still in use around the world. Around 1988, I remember being quoted in the local paper saying that trying to fit a 3D simulation into the X-MP’s memory was like fitting an elephant into a tutu. That quote made it into the paper’s quotes of the week, which makes me smile when I think of it.
