Issue No. 4, July/August 2008 (vol. 10), pp. 60–70
Mohamed Sayeed , Purdue University
Hansang Bae , Purdue University
Yili Zheng , Purdue University
Brian Armstrong , Purdue University
Rudolf Eigenmann , Purdue University
Faisal Saied , Purdue University
The authors discuss the important questions that benchmarking must answer and the degree to which existing kernel benchmarks, as opposed to real application benchmarks, can answer them. They describe the state of the art and the challenges that must be met to base needed performance measurements on real applications. Finally, they quantify their claims by measuring and comparing several real applications and kernel benchmarks. An important finding is that each of the three measured computer platforms performed both best and worst across the selected applications: which platform performed best depended significantly on the problem being solved, calling into question the value of computer rankings based on simplistic metrics.
high-performance computing, performance analysis, performance evaluation, performance modeling, kernel benchmarks, real application benchmarks
Mohamed Sayeed, Hansang Bae, Yili Zheng, Brian Armstrong, Rudolf Eigenmann, Faisal Saied, "Measuring High-Performance Computing with Real Applications", Computing in Science & Engineering, vol.10, no. 4, pp. 60-70, July/August 2008, doi:10.1109/MCSE.2008.98
1. A.T. Wong et al., "ESP: A System Utilization Benchmark," Proc. IEEE/ACM SC2000 Conf., IEEE CS Press, 2000.
2. S.C. Woo et al., "The SPLASH-2 Programs: Characterization and Methodological Considerations," Proc. 22nd Int'l Symp. Computer Architecture, ACM Press, 1995, pp. 24–36.
3. M. Berry et al., "The Perfect Club Benchmarks: Effective Performance Evaluation of Supercomputers," Int'l J. Supercomputer Applications, vol. 3, no. 3, 1989, pp. 5–40.
4. M. Berry, G. Cybenko, and J. Larson, "Scientific Benchmark Characterizations," Parallel Computing, vol. 17, Dec. 1991, pp. 1173–1194.
5. R.W. Hockney and M. Berry, "Parkbench Report: Public International Benchmarking for Parallel Computers," Scientific Programming, vol. 3, no. 2, 1994, pp. 101–146.
6. R. Eigenmann and S. Hassanzadeh, "Benchmarking with Real Industrial Applications: The SPEC High-Performance Group," IEEE Computational Science & Eng., vol. 3, no. 1, 1996, pp. 18–23.
7. R. Eigenmann et al., "SPEC HPG Benchmarks: Performance Evaluation with Large-Scale Science and Engineering Applications," Performance Evaluation and Benchmarking with Realistic Applications, MIT Press, 2001, pp. 40–48.
8. M.S. Mueller et al., "SPEC HPG Benchmarks for High Performance Systems," Int'l J. High-Performance Computing and Networking, vol. 2, no. 1, 2004.
9. V. Aslot et al., "SPEComp: A New Benchmark Suite for Measuring Parallel Computer Performance," Proc. Workshop OpenMP Applications and Tools, LNCS 2104, Springer-Verlag, 2001, pp. 1–10.
10. D. Bailey et al., The NAS Parallel Benchmarks 2.0, tech. report NAS-95-020, NASA Ames Research Center, Dec. 1995.
11. B. Armstrong and R. Eigenmann, "A Methodology for Scientific Benchmarking with Large-Scale Applications," Performance Evaluation and Benchmarking with Realistic Applications, MIT Press, 2001, pp. 109–127.
12. D.H. Bailey and A. Snavely, "Performance Modeling: Understanding the Past and Predicting the Future," Proc. 11th Int'l Euro-Par Conf., LNCS 3648, Springer-Verlag, 2005, pp. 761–770.
13. B. Armstrong and R. Eigenmann, "Performance Forecasting: A Methodology for Characterizing Large Computational Applications," Proc. Int'l Conf. Parallel Processing, IEEE Press, 1998, pp. 518–526.
14. L. Carrington, A. Snavely, and N. Wolter, "A Performance Prediction Framework for Scientific Applications," Future Generation Computer Systems, vol. 22, no. 3, 2006, pp. 336–346.
15. M. Mahinthakumar et al., "Performance Evaluation and Modeling of a Parallel Astrophysics Application," Proc. High Performance Computing Symp., Soc. for Computer Simulation Int'l, 2004; www.scs.org/docInfo.cfm?get=1681.