User reflections: John Hawley | National Center for Supercomputing Applications at the University of Illinois

09.09.11

Q. What were you studying when you used NCSA's first supercomputer, the Cray X-MP? What new results came from its use? What other NCSA capabilities helped you advance your science?

A. At the time I was working on modeling astrophysical jets and the behavior of orbiting rings of gas that were subject to a certain type of global instability. We needed to understand the consequences when the amplitude of that instability grew large, something that could only be addressed with simulations. More generally, I was interested in extending my work on mass accretion into black holes. I had previously been using a Cray X-MP at Digital Productions in Culver City, California, a company the National Science Foundation contracted with to obtain supercomputing time prior to the establishment of the centers. During the early years of NCSA I spent a good deal of time pushing forward with algorithm development for compressible magnetohydrodynamic simulations. This laid the groundwork for later work uncovering the fundamentally magnetic nature of black hole accretion processes. It also prepared us to go to fully three-dimensional simulations when the hardware advanced far enough.

Q. In what ways has access to supercomputers at NCSA advanced your science since then?

A. It has been an organic, continuous process, best described simply by following the articles listed in a literature search over the last 25 years. Each new platform brought with it new capabilities and new ways to advance algorithms and simulations. The Cray 2, with its astonishingly large amount of memory (an entire gigaword as I recall), led to my first experiments with fully three-dimensional time-dependent simulations. The Connection Machines really got us thinking about large-scale parallel computing, and many of the lessons learned in programming them have come back into vogue with the rise of new technologies such as GPU computing.

Q. In what areas do you envision advances in supercomputing capability having the most impact in your research during the coming 5-10 years?

A. I have always felt that, among astrophysicists working on accretion research, our ultimate goal was to incorporate more and more of the physics that is present in nature but not yet in the simulations. We started with simple ideal gas hydrodynamics in one or two spatial dimensions. We added magnetic fields. We went to 3D. We grew the sizes of the physical domains simulated and added non-ideal effects and radiation transport. Advances in capability mean we can advance the realism in our models.

Q. Is there anything supercomputers enabled you to do 20-25 years ago that you thought was really 'wow' and cutting-edge at the time, but that you look back at now and smile at the memory of how 'high tech' you thought you were at the time?

A. One of the first things I worked on at NCSA, even prior to it officially opening for business (I think I was user ID 10000, the first user ID), was to produce raster color graphics of my (for then) large-scale simulations. Because of bandwidth limitations (remember 1,200 baud modems?) I would compute the grid zone color from the data on the Cray, encode it in ASCII hex, transmit that along with grid information, and then assemble an image using graphics libraries available on the Sun workstation. It was a lot of work, but "wow," we got a (low-resolution) color image out of our simulation. We could even make a dozen or so images and flip through them, like an animation. To preserve the images I set up a camera in front of the monitor and took pictures (remember film?). Any of you try to tell the young people that today....
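The workflow described above — compute a color index per grid zone, encode it as printable hex characters, send it over a slow line, and decode it on the workstation — can be sketched in a few lines of modern Python. This is purely illustrative: the function names and the assumption of one 8-bit palette index per zone are mine, not the original Cray/Sun code.

```python
def encode_ascii_hex(colors):
    """Pack one color index per grid zone (0-255) into printable hex pairs,
    safe to transmit over a text-only modem link."""
    return "".join(f"{c:02x}" for c in colors)

def decode_ascii_hex(text):
    """Recover the color indices on the receiving workstation."""
    return [int(text[i:i + 2], 16) for i in range(0, len(text), 2)]

# Example: a tiny four-zone scanline of palette indices.
scanline = [0, 15, 128, 255]
wire = encode_ascii_hex(scanline)        # "000f80ff" — all printable ASCII
assert decode_ascii_hex(wire) == scanline
```

At two ASCII characters per zone, a 1,200 baud line moves on the order of 60 zones per second, which gives a feel for why even a low-resolution image was a slow, deliberate affair.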