Issue No. 08 - August (1999 vol. 32)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/2.781630
<p>MIPS (million instructions per second) has been the most popular measure of computer speed since its inception, and people use it as the primary basis for comparing the speed of different computers. However, the author contends that using the MIPS year to measure the effort needed to break and compare cryptographic keys has led to some very inaccurate estimates. The MIPS year measurement has four basic problems. First, not all instructions are equal, so how do you really know the MIPS rating of a machine? Second, not all programs use the same instructions, so different programs can give different measures of speed even on the same machine. Third, the way you measure the MIPS years expended on a single problem by many machines can introduce two possible errors: an inaccurate estimate of machine speed and an inaccurate measurement of time. Fourth, if you consider only the number of instructions executed, you cannot accurately measure the difficulty of hard problems.</p>
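As a rough illustration (not taken from the article itself), a sketch of the arithmetic behind a MIPS-year tally shows how the third problem, an inaccurate estimate of machine speed, propagates directly into the claimed total; the machine counts and ratings below are hypothetical:

```python
# Illustrative sketch: how MIPS years are tallied for a distributed
# computation, and how a per-machine speed error skews the total.

def mips_years(machines, mips_rating, days):
    """Total MIPS years expended by `machines` computers, each
    estimated at `mips_rating` MIPS, running for `days` days."""
    machine_years = machines * days / 365
    return machine_years * mips_rating

# Hypothetical effort: 1,000 machines, each estimated at 50 MIPS,
# running for 90 days.
estimate = mips_years(1000, 50.0, 90)

# If the true rating were only 40 MIPS (speed overestimated by 25%),
# the effort actually expended is proportionally smaller.
actual = mips_years(1000, 40.0, 90)

print(round(estimate, 2))  # 12328.77 MIPS years claimed
print(round(actual, 2))    # 9863.01 MIPS years actually spent
```

The gap between the two figures never appears in the published tally, since only the claimed rating is reported; this is one way MIPS-year estimates drift from reality.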
R. D. Silverman, "Exposing the Mythical MIPS Year," in Computer, vol. 32, no. 8, pp. 22-26, Aug. 1999.