
User reflections: Al Valocchi


What were you studying when you used NCSA’s first supercomputer, the Cray X-MP? What new results came from its use? What other NCSA capabilities helped you advance your science?

My general field of study is the same now as it was 25 years ago—numerical simulation of fluid flow and contaminant transport in the subsurface. Advanced scientific computing allows us to consider more complex and more realistic systems—for example, we can study mixtures of contaminants that participate in chemical and biological reactions, and we can consider more complex geology. With the Cray X-MP, we were investigating transport of a single contaminant reacting with the soil, with flow through a 'layer-cake' geological system. Back then, this represented the leading edge of complexity.

Using the Cray X-MP allowed us to solve larger problems on finer grids. The numerical results revealed some emergent patterns at large times, and this led to the development of new analytical mathematical results for predicting this large-time behavior.

In what ways has access to supercomputers at NCSA advanced your science since then?

In general, access to supercomputers (at NCSA and elsewhere) has permitted the study of more complex and realistic problems. As we learn more about the geology, physics, chemistry, and microbiology of the subsurface, we develop more rigorous and sophisticated mathematical models. Numerical solution of these models is a computational 'grand challenge' that requires supercomputers. Also, due to the inherent uncertainty about subsurface geology, we need to simulate flow and transport repeatedly across many alternative, equally likely models of the subsurface geology.
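To make that "many equally likely geologies" idea concrete, here is a minimal sketch (not the actual research workflow described above) of an ensemble calculation: it generates random log-conductivity fields, runs the same simple one-dimensional flow calculation on each, and reports the spread in a predicted travel time. All names and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_realizations = 100      # ensemble size (assumed)
n_cells = 200             # 1D grid cells (assumed)
dx = 5.0                  # cell size [m]
porosity = 0.3
head_gradient = 0.01      # imposed hydraulic gradient [-]

travel_times = []
for _ in range(n_realizations):
    # Each realization: a random log-normal hydraulic conductivity field [m/d].
    # (A real study would use a geostatistical simulator with spatial correlation.)
    ln_K = rng.normal(loc=0.0, scale=1.0, size=n_cells)
    K = np.exp(ln_K)

    # Darcy flux in each cell under the imposed gradient, then pore (seepage) velocity.
    darcy_q = K * head_gradient
    pore_velocity = darcy_q / porosity

    # Advective travel time across the domain = sum of cell residence times.
    travel_times.append(np.sum(dx / pore_velocity))

travel_times = np.array(travel_times)
print(f"mean travel time: {travel_times.mean():.1f} d, std: {travel_times.std():.1f} d")
```

Each realization is an independent forward simulation, which is why ensemble studies of realistic three-dimensional models quickly demand supercomputing resources.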

Over the years, my students and I have used a variety of machines and resources at NCSA (including the Connection Machine 5, grid computing, and lately the Lincoln and Abe machines), as well as DOE supercomputers.

In what areas do you envision advances in supercomputing capability having the most impact in your research during the coming 5-10 years?

There will be advances on two fronts. First, as mentioned already, we will consider more complex physical problems. As an example, consider the greenhouse gas mitigation strategy known as 'carbon capture and storage,' in which supercritical carbon dioxide is injected into deep saline aquifers. In order to study its feasibility and long-term safety, we need sophisticated three-dimensional models that include multiple fluid phases; density, temperature, and geomechanical effects; and many different chemical reactions. Moreover, the models must consider very large spatial domains (hundreds of kilometers) and long time scales (thousands of years). This is a very challenging computational problem.

The second area of advancement connected to supercomputing will be new methods for uncertainty quantification and parameter estimation. Due to the difficulty of sampling and observing the subsurface, stochastic models are preferred to deterministic models. New Bayesian methods have been proposed that seek to estimate the entire probability distribution of the key quantities, but these methods are computationally expensive.
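The expense comes from the fact that sampling a posterior distribution requires a forward model run for every proposed parameter set. The following is a minimal, hypothetical sketch of that idea using a random-walk Metropolis sampler for a single log-conductivity parameter; the toy forward model, synthetic data, and prior are all illustrative assumptions, standing in for a full flow-and-transport simulation.

```python
import numpy as np

rng = np.random.default_rng(1)

true_ln_K = 1.0
obs_noise = 0.05

def forward_model(ln_K):
    # Stand-in for an expensive simulator: predicted heads at 5 observation wells.
    x = np.linspace(0.1, 0.5, 5)
    return np.exp(-x * np.exp(-ln_K))

# Synthetic "observations" generated from the true parameter plus noise.
observations = forward_model(true_ln_K) + rng.normal(0.0, obs_noise, size=5)

def log_posterior(ln_K):
    # Gaussian likelihood plus a broad Gaussian prior on ln K.
    residual = observations - forward_model(ln_K)
    log_like = -0.5 * np.sum((residual / obs_noise) ** 2)
    log_prior = -0.5 * (ln_K / 10.0) ** 2
    return log_like + log_prior

# Random-walk Metropolis: thousands of forward-model evaluations for one parameter.
n_steps = 5000
samples = np.empty(n_steps)
current = 0.0
current_lp = log_posterior(current)
for i in range(n_steps):
    proposal = current + rng.normal(0.0, 0.2)
    proposal_lp = log_posterior(proposal)
    if np.log(rng.uniform()) < proposal_lp - current_lp:
        current, current_lp = proposal, proposal_lp
    samples[i] = current

posterior = samples[1000:]  # discard burn-in
print(f"posterior mean ln K: {posterior.mean():.2f} +/- {posterior.std():.2f}")
```

When the forward model is a realistic three-dimensional subsurface simulation, each of those thousands of evaluations is itself a large computation, which is why these Bayesian approaches depend on advances in supercomputing.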

Is there anything supercomputers enabled you to do 20-25 years ago that you thought was really ‘wow’ and cutting-edge at the time, but that you look back at now and smile at the memory of how ‘high tech’ you thought you were at the time?

I can remember doing some 2D visualizations of our simulations. We were proud of those visualizations, but looking back they were primitive compared to the software and hardware tools available today at a fraction of the cost.

Disclaimer: Due to changes in website systems, we've adjusted archived content to fit the present-day site and the articles will not appear in their original published format. Formatting, header information, photographs and other illustrations are not available in archived articles.
