Vol. 2, No. 2, March/April 2000, pp. 14-16
Published by the IEEE Computer Society
In a recent essay,1 John Rice listed a set of obvious and less-than-obvious challenges facing computational science and engineering researchers in the 21st century, given the rapid growth in raw computing power. Obvious challenges included dimensionality extension, finer scales, better mathematical models, parallel computing, and algorithms. Less-than-obvious challenges included multiphysics and multiscale phenomena, software, model and software validation, computational intelligence, and a language for computational science. In this issue, we report on a set of five advanced simulation research efforts, each of which requires that all—or almost all—of these challenges be addressed.
The US Department of Energy initiated the university research efforts reported herein as part of its Accelerated Strategic Computing Initiative (ASCI)2 Academic Strategic Alliance Program in September 1997, with five-year contracts each averaging about $4 million per year. The program's goals are to solve science and engineering problems of national importance through large-scale, multidisciplinary modeling and simulation, and to establish and validate such modeling and simulation as a viable scientific methodology for applications requiring coupled multidisciplinary, multiscale, complex simulation sequences.
The purpose of supporting this strategically chosen research in US universities is to effect a major acceleration in the development of modeling and simulation capabilities and of computer and computational technologies, as well as in the education and training of individuals who will become future leaders in simulation science. This of course supports the long-term research and training needs of the DoE's ASCI and Science-Based Stockpile Stewardship Program.
The five advanced simulation research efforts are instantiated as Research Centers of Excellence at five universities.
Researchers at the Center for Simulating the Dynamic Response of Materials at the California Institute of Technology are investigating the effect of shock waves induced by high explosives on various materials in different phases. The results of this work will prove beneficial in a number of civilian uses of high explosives. The work will also enable advances in material design and find application in other areas such as geophysics.
At the University of Chicago, the Center for Astrophysical Thermonuclear Flashes aims to solve the long-standing problem of astrophysical thermonuclear flashes through simulation and analysis. Its efforts are adding to the body of scientific knowledge of how the universe and, in particular, the Earth were formed. Research results will provide further understanding of the physical problems of nuclear ignition, detonation, and turbulent mixing of complex multicomponent fluids and other materials.
The focus of the Center for Simulation of Advanced Rockets at the University of Illinois at Urbana-Champaign is on detailed, whole-system simulation of solid propellant rocket motors under both normal and abnormal operating conditions. This requires expertise in diverse subdisciplines including propellant ignition and combustion, fluid dynamics of interior flow, structural response of solid components, and analysis of various potential failure modes. These problems are characterized by very high energy densities; extremely diverse length and time scales; complex and dynamically changing geometries and interfaces; and turbulent, reactive, and multiphase flows.
Stanford University's Center for Integrated Turbulence Simulations is developing simulation technology suitable for the design of gas turbine engines, which propel airplanes, drive locomotives, power ships, and deliver power for many other applications. This new design paradigm promises to shorten the design cycle, reduce expensive testing, and improve reliability. Other benefits include improved understanding of compressible flow computations, turbulence, and transport modeling.
Work at the University of Utah's Center for Simulation of Accidental Fires and Explosions is providing a set of state-of-the-art, science-based tools for numerical simulation of accidental fires and explosions, especially in the context of handling and storing highly flammable materials. The simulation study is contributing to an improved understanding of fire safety and accident scenarios. Anticipated benefits include reduced risk, increased safety, and potential remedies in situations such as industrial chemical fires, the handling and transportation of highly flammable materials, car crashes, and terrorist attacks.
It is clear from these brief descriptions that researchers working in each of the centers are addressing extremely complex multidisciplinary and multiscale applications. New and improved 3D mathematical models that include the necessary physics and chemistry are required. Researchers must implement these models with numerical algorithms and techniques that address the widely varying scales in both time and space. Resulting component codes must be coupled across disciplinary boundaries and ultimately integrated into an overarching code representing the simulation of the application. This requires special attention to the design and implementation of software frameworks for code coupling and integration, including the languages for expressing frameworks that hide implementation details from the applications researchers (a minimal sketch of such a coupling loop follows Figure 1). The centers have access to 10% of the ASCI computing resources on which to conduct the required simulations. Figure 1 shows how dramatically these resources are scheduled to increase in the future.


Figure 1. Projected growth plan for ASCI computing resources.

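To make the code-coupling idea concrete, here is a minimal Python sketch of an explicit, loosely coupled multiphysics driver. It is entirely illustrative: the Component interface and the toy fluid-structure pair are hypothetical stand-ins, not any center's actual framework.

```python
# Illustrative sketch of an explicit, loosely coupled multiphysics loop.
# The Component interface and the toy fluid/structure models below are
# hypothetical; real ASCI-center frameworks are far more elaborate.

class Component:
    """A single-discipline solver behind a common coupling interface."""
    def advance(self, dt): raise NotImplementedError
    def export_data(self): raise NotImplementedError
    def import_data(self, source, data): raise NotImplementedError

class Fluid(Component):
    def __init__(self):
        self.pressure, self.wall_velocity = 1.0, 0.0
    def advance(self, dt):
        # Toy model: wall motion does work on the fluid.
        self.pressure += dt * self.wall_velocity
    def export_data(self):
        return {"pressure": self.pressure}
    def import_data(self, source, data):
        if "wall_velocity" in data:
            self.wall_velocity = data["wall_velocity"]

class Structure(Component):
    def __init__(self):
        self.velocity, self.load = 0.0, 0.0
    def advance(self, dt):
        # Toy model: fluid pressure accelerates the wall.
        self.velocity += dt * self.load
    def export_data(self):
        return {"wall_velocity": self.velocity}
    def import_data(self, source, data):
        if "pressure" in data:
            self.load = data["pressure"]

def couple(components, dt, n_steps):
    """Advance all components in lockstep, exchanging interface
    data between steps (a simple explicit coupling scheme)."""
    for _ in range(n_steps):
        exports = {name: c.export_data() for name, c in components.items()}
        for name, c in components.items():
            for other, data in exports.items():
                if other != name:
                    c.import_data(other, data)
        for c in components.values():
            c.advance(dt)

fluid, structure = Fluid(), Structure()
couple({"fluid": fluid, "structure": structure}, dt=1e-3, n_steps=100)
print(fluid.pressure, structure.velocity)
```

The essential design point is that each discipline code hides its internals behind a narrow import/export interface, so the overarching driver can orchestrate the data exchange without knowing anything about the physics on either side.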
The scope and scale of these applications require hundreds of hours on the highest-performance computing systems available. During the present year, the centers estimate that they will require between six and seven million node hours (a node hour is one hour on one node of a parallel system). For example, the University of Illinois center plans to simulate a 24-second rocket burn using 30 million cells and 450,000 node hours on 1,024 processors. Details on the other centers' simulation plans can be found in the individual articles.
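To put such figures in perspective, a few lines of Python convert the budget into wall-clock time. This is a back-of-the-envelope calculation using the Illinois numbers above; it assumes all 1,024 nodes run continuously, with no queue waits or restarts.

```python
# Back-of-the-envelope: node hours -> wall-clock time for the
# Illinois rocket-burn example (assumes continuous running on
# all nodes, ignoring queue waits, restarts, and I/O stalls).

node_hours = 450_000   # total budgeted node hours
nodes = 1_024          # nodes running concurrently

hours = node_hours / nodes
print(f"{hours:.0f} wall-clock hours, about {hours / 24:.0f} days")
# prints: 439 wall-clock hours, about 18 days
```

Even on a machine of this size, a single production run occupies weeks of wall-clock time, which is why algorithmic efficiency and checkpoint-restart capability matter so much at this scale.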
To accomplish simulations at these levels, researchers must exploit the highest levels of parallelism available, which in turn means facing unprecedented algorithmic and software challenges. Given the variety of choices that must be made before and during these complex simulations, intelligence must be built into the software to steer the simulations toward accurate approximate solutions. The sheer volume of simulation results that must be visualized and analyzed requires special efforts in data handling and visualization. Software and model validation efforts are essential for confidence in the resulting simulations.
Because of the scope, scale, and complexity of the applications, and the necessity for researchers from several disciplines and departments to work closely together, the research conducted in these five centers, and its overall management, is atypical of research conducted in most universities. More atypical still is the measure of success that will be applied to their efforts: the development of an overarching simulation that leads to results, validated to the extent possible, that capture complex system behavior. Just two years into these up-to-10-year efforts, the articles in this issue report on the scientific challenges and the progress the centers have made to date toward their long-term goals.
In addition to the five theme articles, we include an interview with Gil Weigand, Acting Deputy Assistant Secretary for Research, Development, and Simulation in the DoE's Defense Programs, and Paul Messina, the program's current director. Weigand is the ASCI program's chief architect; the ASCI Academic Alliances program was part of his vision for ASCI. We also include an article by Dona Crawford, Donald McCoy, and David Nowak, the Sandia, LANL, and LLNL ASCI project managers, describing the benefits that ASCI will derive from the Alliances program.
We hope that these introductory remarks, the interview, and the theme articles together give you a clearer perspective on the challenges simulation scientists and engineers face. The benefits that can result from advancing these science and engineering frontiers are of significant importance to the US.
We thank the ASCI Academic Strategic Alliances Strategy Team, including Beverly Berger (DoE Headquarters), Charles Hartwig (Sandia), Warner Miller (LANL), and Richard Watson (LLNL).

References

Merrell Patrick is a computer and computational scientist who is a special assistant to the vice president for research at the University of Utah and is also a consultant to LANL. He recently retired from the National Science Foundation, where he served as chief science and technology officer for the Directorate for Computer Information Science and Engineering. He received a Distinguished Service Award from the Computing Research Association in 1998, and he is a member of the ACM, the IEEE Computer Society, and SIAM. Contact him at mpatr@concentric.net.
Robert Voigt is the director of the Computational Science Cluster at the College of William & Mary, and a consultant to the Department of Energy's ASCI program through the Oak Ridge Institute for Science and Education. His research interests are in high-performance computing and parallel algorithms. He received a BA from Wabash College, an MS from Purdue University, and a PhD in mathematics from the University of Maryland. He is a member of the IEEE, the ACM, and SIAM. Contact him at the Computational Science Cluster, Computer Science Dept., College of William & Mary, Williamsburg, VA 23089; rvoigt@compsci.wm.edu.