Issue No. 11 - November 2005 (Vol. 54)
pp. 1360-1373
ABSTRACT
Due to cost, time, and flexibility constraints, computer architects use simulators to explore the design space when developing new processors and to evaluate the performance of potential enhancements. However, despite this dependence on simulators, statistically rigorous simulation methodologies are typically not used in computer architecture research. A formal methodology can provide a sound basis for drawing conclusions from simulation results by adding statistical rigor and, consequently, can increase the architect's confidence in those results. This paper demonstrates the application of a rigorous statistical technique to the setup and analysis phases of the simulation process. Specifically, we apply a Plackett and Burman design to: 1) identify key processor parameters, 2) classify benchmarks based on how they affect the processor, and 3) analyze the effect of processor enhancements. Our results showed that, of the 41 user-configurable parameters in SimpleScalar, only 10 had a significant effect on the execution time. Of those 10, the number of reorder buffer entries and the L2 cache latency were by far the two most significant. Our results also showed that Instruction Precomputation, a value reuse-like microarchitectural technique, primarily improves the processor's performance by relieving integer ALU contention.
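The Plackett-Burman approach the abstract describes screens many two-level factors in few simulation runs, then ranks each factor by the magnitude of its estimated main effect. The following is a minimal sketch in Python/NumPy of that idea for an 8-run, 7-factor design; the response values `y` are made-up numbers purely for illustration (the paper's 41-parameter study would use a correspondingly larger design, but the construction and effect estimation work the same way):

```python
import numpy as np

# Standard Plackett-Burman generator row for an 8-run design
# (+1 = factor at its "high" setting, -1 = "low").
g = np.array([1, 1, 1, -1, 1, -1, -1])

# Rows 1-7 are cyclic shifts of the generator; row 8 is all -1.
X = np.vstack([np.roll(g, i) for i in range(7)] + [-np.ones(7, dtype=int)])

# Columns are balanced and mutually orthogonal: X^T X = 8 * I,
# which is what lets main effects be estimated independently.
assert np.array_equal(X.T @ X, 8 * np.eye(7, dtype=int))

# Hypothetical response: simulated execution times for the 8 configurations
# (invented values, not data from the paper).
y = np.array([102.0, 95.0, 110.0, 98.0, 120.0, 97.0, 105.0, 118.0])

# Main effect of factor j: mean response at +1 minus mean response at -1.
# With 4 runs at each level, this is (column j dot y) / 4.
effects = (X.T @ y) / (len(y) / 2)

# Rank factors by effect magnitude; a screening study keeps the largest few.
ranking = np.argsort(-np.abs(effects))
print("effect estimates:", np.round(effects, 2))
print("factors ranked by |effect|:", ranking)
```

A screening study would then treat only the top-ranked factors (here, the first entries of `ranking`) as significant and fix the rest at nominal values in subsequent experiments.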
INDEX TERMS
Performance analysis and design aids, measurement techniques, simulation output analysis.
CITATION
Joshua J. Yi, David J. Lilja, Douglas M. Hawkins, "Improving Computer Architecture Simulation Methodology by Adding Statistical Rigor", IEEE Transactions on Computers, vol. 54, no. 11, pp. 1360-1373, November 2005, doi:10.1109/TC.2005.184
