1521-9615/09/$31.00 © 2009 IEEE
Published by the IEEE Computer Society
The Promise of Science-Based Computational Engineering
The field of engineering is poised to enter a new and exciting era. The exponential growth in computing capability, from one floating-point operation per second (flops) in 1945 to 10¹⁵ in 2008, is helping us replace the standard engineering process of iterated empirical design–build–test cycles with an iterated design–mesh–analyze paradigm based on physics-based computational tools. This shift makes engineers more productive and helps manufacturers reduce time-to-market and design costs, respond better to changing market conditions, and increase the productivity of their technical workforce and testing facilities.
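The growth figures above imply a remarkably steady rate of improvement. A rough back-of-the-envelope check (the 1945 and 2008 endpoints are taken from the text; the doubling-time framing is an illustration, not a figure from the article):

```python
import math

# Growth from 1 flops (1945) to 1e15 flops (2008).
years = 2008 - 1945              # 63 years of growth
doublings = math.log2(1e15)      # ~49.8 doublings over that span
doubling_time = years / doublings

print(f"{doubling_time:.2f} years per doubling")  # ~1.26 years
```

That works out to a doubling roughly every 15 months, consistent with the sustained exponential trend the editorial describes.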
One real-world example illustrates what we can now achieve with this paradigm. In the early 1990s, Goodyear Tire faced intense international competition. Its rivals had more engineering design resources, greater testing capacity, and lower production costs—Goodyear was rapidly falling behind. To respond and develop a competitive advantage, it replaced the traditional engineering process (design, build, test, and repeat) that had served it well for more than 100 years with physics-based computational engineering tools to design, mesh, and analyze new products. Engineers built and tested just the final, optimized designs, thereby reducing Goodyear's time to market from three years to less than a year. The company started producing several new designs a year instead of one or two every few years. Goodyear is now the largest US tire manufacturer and is competitive in the world market. Whirlpool, Procter & Gamble, Boeing, Ping Golf, and Pratt & Whitney, to name a few, have also adopted this new paradigm with similar success.
This revolution is urgently needed. Today's US dominance of advanced technology isn't guaranteed tomorrow, and the rest of the world is rapidly catching up. US industry continues to lose market share at home and abroad, and every spring, the Government Accountability Office's report on 40 major US weapon system procurements concludes that most are behind schedule, over budget, and fail to meet performance goals.
To survive, the US must improve the way it develops and delivers advanced technologies. The design–build–test paradigm requires large engineering staffs and extensive test facilities, takes a long time, and is costly. New designs are typically based on empirical extrapolations, often resulting in products with major flaws and without sufficient innovation to compete. In contrast, physics-based computational engineering tools for iterated design, meshing, and analysis of "virtual prototypes" can result in innovative products that work. The laws of physics are simply better than empirical "rules of thumb" for designing products based on new materials and concepts: engineers can develop optimized and tested designs more quickly, make less use of test facilities, and enhance their productivity.
But, just as in every past technology revolution, successful adoption of this paradigm faces many challenges. The major bottlenecks are the time and resources required to develop, deploy, and support these tools, which use complex computers to integrate many complex physics and engineering effects to solve complex problems. Generally, it takes multidisciplinary teams of 20 or more engineers, programmers, and computer scientists five to 10 years to develop and deploy such tools, which must include the right physics and engineering to provide accurate, reliable answers. Moreover,
• the tools must be verified and validated;
• problems must be easy to set up and run;
• geometry and mesh generation must be quick and easy;
• the tools must run efficiently on highly complex, massively parallel computers; and
• the results must be timely.
Additionally, each tool is highly specialized to each application, so a computational tire design tool probably isn't very useful for airplane design. The complexity of the tools results in a failure rate for their development that ranges from 25 to 50 percent.
Most companies don't have the time, resources, or expertise to develop their own computational tools, so they tend to rely on commercial or other externally developed tools. But most independent software vendors can't make a business case for the large, long-term investment needed to produce such tools. These tools also need a high level of support, including continual upgrades, maintenance, and porting to new platforms, all of which results in their slow development and adoption.
It's not all doom and gloom, though—a growing community is working to overcome these challenges, and the number of industries willing to invest in such tools is increasing. The US Department of Defense has launched a program to develop computational engineering tools as a key element of its effort to improve the acquisition of major weapons systems, and the National Science Foundation, NASA, and the Department of Energy are investing in computational science and engineering. The use of computational tools such as MATLAB is becoming a standard part of the university engineering curriculum, and the power of workstations continues to improve in lockstep with supercomputers, meaning the computing power in today's supercomputers will be available in workstations in five to 10 years.
The history of technology shows that all major engineering paradigm shifts have been fraught with challenges and difficulties. Overcoming them often takes several generations, if not a century or two. These challenges provide the opportunity for today's engineers to be leaders in effecting a major engineering paradigm shift and for the next generation of engineers to have an exciting and productive time exploiting this new paradigm.
Douglass Post is the chief scientist at the Department of Defense High Performance Computing Modernization Program and is an associate editor in chief of this magazine. Contact him at email@example.com.