Michael L. Norman

1999 Sidney Fernbach Award Recipient
____________________________________________



“For his leading edge research in applying parallel computing to challenge grand problems in astrophysics and cosmology”


Michael L. Norman
is a computational astrophysicist who has been a professor of astronomy and director of the Laboratory for Computational Astrophysics at the National Center for Supercomputing Applications at the University of Illinois, Urbana-Champaign since 1992.  He earned his bachelor’s degree in astronomy at Caltech, and master’s and doctoral degrees in engineering and applied science from UC Davis while a student employee at the Lawrence Livermore National Laboratory.

Norman develops numerical algorithms and application codes and applies them to model complex astrophysical systems and processes involving plasmas, magnetic fields, radiation, and gravitation on multiple scales.  In the 1980s Norman did pioneering work on simulations of astrophysical jets from young stars and active galaxies.  Since 1991 Norman’s work has focused almost exclusively on developing three-dimensional multiscale simulations of the formation of cosmological structure incorporating the physical complexities mentioned above.  These models are used to study the formation of the first generation of stars, galaxy formation and evolution, the structure and thermal history of the intergalactic medium, and the formation of X-ray clusters of galaxies and large-scale structure.

These models are used to understand the cosmogonic processes that give rise to the wealth of structures observed in our universe at low, intermediate, and high redshifts, as well as to predict what will become observable with the next generation of astronomical observations and surveys.  Careful modeling of the physical processes occurring in the cosmic plasma (e.g., shocks, ionization, radiative cooling, etc.) is a requirement for making quantitative predictions that are testable by observational means.  Besides providing information on the origin and evolution of the astronomical objects themselves, the statistical properties of large ensembles of galaxies and clusters can be used to measure the fundamental cosmological parameters describing the global expansion rate, mass content, and fate of the universe as a whole.

Among Norman’s significant scientific results are: (1) the elucidation of the physical nature of the Lyman-alpha forest – a thicket of optical absorption lines seen in the spectra of high-redshift quasars which results from the complex yet predictable structure of the intervening intergalactic plasma; (2) the first self-consistent 3D simulations of the formation history and mass of the first stars in the universe; and (3) the computational prediction of strong turbulent motions in X-ray clusters of galaxies.

Norman has made several noteworthy contributions to computational astrophysics and high-performance computing beyond his applications research.  The first is the development, with student Jim Stone, of new multidimensional finite-difference algorithms for magnetohydrodynamics (the MOC-CT algorithm) and radiation hydrodynamics (the VTEF algorithm) and their implementation into a suite of multiphysics application codes.  The shared-memory codes ZEUS-2D and ZEUS-3D are in broad use by the international astrophysics community, while the distributed-memory massively parallel version ZEUS-MP is about to be released.

The second is the development, with student Greg Bryan, of a higher-order Godunov scheme for cosmological hydrodynamics based on the piecewise parabolic method (PPM).  This scheme is designed to handle the fantastic compressions and Mach numbers developed by gravitational clustering in an expanding universe.  This algorithm is incorporated in the 3D hydrodynamic cosmology code KRONOS, which is a hybrid code that also includes a standard particle-mesh algorithm for collisionless dark matter and an FFT-based Poisson solver.  A scalable data-parallel implementation of KRONOS for the Connection Machine-5 massively parallel computer was used to carry out in 1994 the largest cosmological simulation ever attempted, only recently surpassed in size by a German group in 1998.
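The FFT-based Poisson solve at the heart of a particle-mesh gravity step can be illustrated compactly.  The sketch below solves ∇²φ = 4πGρ on a periodic 3D mesh by dividing by −k² in Fourier space; it is a minimal illustration of the general technique, with illustrative units and function names, not KRONOS’s actual implementation.

```python
import numpy as np

def poisson_fft(density, box_size=1.0, G=1.0):
    """Solve the Poisson equation on a periodic cubic mesh via FFT.

    In k-space, grad^2 phi = 4 pi G rho becomes -k^2 phi_k = 4 pi G rho_k,
    so each Fourier mode is solved by a single division.
    """
    n = density.shape[0]
    # Only density fluctuations gravitate in a periodic box, so
    # subtract the mean (equivalently, zero the k=0 mode).
    rho_k = np.fft.fftn(density - density.mean())
    k = 2.0 * np.pi * np.fft.fftfreq(n, d=box_size / n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0          # avoid 0/0; this mode is zeroed below
    phi_k = -4.0 * np.pi * G * rho_k / k2
    phi_k[0, 0, 0] = 0.0
    return np.real(np.fft.ifftn(phi_k))
```

For a single-mode density ρ = cos(2πx/L), the returned potential matches the analytic answer −4πGρ/k² to machine precision, which makes the method easy to validate before coupling it to a particle-mesh force interpolation.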

The third is the development, also with student Greg Bryan, of a parallel adaptive mesh refinement (AMR) code for hydrodynamic cosmological simulations in 3D.  The code is based on an adaptation of the structured AMR algorithm of Berger and Colella for shock hydrodynamics, combined with a particle-hierarchical mesh (PHM) scheme for gravitational N-body dynamics.  The object-oriented ENZO code is the only one of its kind in existence, and is unique in its ability to simulate a large range of spatial scales in a dynamic 3D computation.  A recent calculation of the formation of the first stars in the universe achieved a spatial dynamic range per dimension of over 30,000,000 using a 19-level fully adaptive hierarchy of grids.
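The core structured-AMR idea – flag cells that exceed a refinement criterion, then cover them with a finer child patch interpolated from the parent – can be sketched in one dimension.  This is a schematic illustration of the general Berger–Colella-style refinement step under assumed names and a simple mass-per-cell criterion; it is not ENZO’s actual interface or criteria.

```python
def flag_for_refinement(density, dx, mass_threshold):
    """Return indices of cells whose mass exceeds the threshold.

    A mass-per-cell criterion is one common refinement trigger in
    cosmological AMR: collapsing regions gain mass and get refined.
    """
    return [i for i, rho in enumerate(density) if rho * dx > mass_threshold]

def refine(parent, flags, ratio=2):
    """Cover the flagged region with a child patch at `ratio`x resolution.

    Child cell values are set by linear interpolation between
    neighboring parent cells (a schematic prolongation step).
    Returns (lo, hi, child) where [lo, hi) is the covered parent range.
    """
    if not flags:
        return None
    lo, hi = min(flags), max(flags) + 1   # smallest covering patch
    child = []
    for i in range(lo, hi):
        left = parent[i]
        right = parent[i + 1] if i + 1 < len(parent) else parent[i]
        for j in range(ratio):
            frac = (j + 0.5) / ratio      # child-cell center in parent cell
            child.append(left + frac * (right - left))
    return lo, hi, child
```

In a full AMR hierarchy this step is applied recursively: each child grid is itself flagged and refined, producing the deeply nested grid hierarchies (19 levels in the calculation described above) that give the method its enormous dynamic range.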

Norman has also been at the leading edge of applying distributed heterogeneous computing to the solution of astrophysical problems.  In 1995, he led a team of astrophysicists and computer scientists to carry out a transcontinental simulation of colliding galaxies using three different supercomputers connected on the I-WAY experimental wide-area high-speed network.  For this work his team won an HPC Challenge award at Supercomputing 95 for the entry “Galaxies Collide on the I-WAY: An Example of Wide Area Collaborative Computing.”  For Supercomputing 99 he is leading a new team competing in the HPC Games contest to use the emerging GLOBUS computational grid to carry out a large parametric survey of AMR cosmological simulations, each of which is a parallel, distributed job.

Norman’s research is not confined to astrophysics and cosmology.  In collaboration with researchers at the Lawrence Livermore National Laboratory, he is developing scalable parallel algorithms for high-fidelity 3D radiation hydrodynamics simulations.  He is also a fluid dynamics co-team leader in the University of Illinois’ Center for the Simulation of Advanced Rockets, where he is developing a 3D radiative heat transfer module for the core flow of aluminized solid rocket propellants.

“The best is yet to come,” says Norman, referring to the coming capability to carry out multiscale simulations of realistic complexity on terascale computing platforms.  “With the continual advances in hardware, software and algorithms, we will soon be able to create a ‘cosmos in a computer’ and sail through it in space and time to explore how our fantastic universe of stars and galaxies came into being and see where it is going.”