Exposing the Mythical MIPS Year
August 1999 (vol. 32 no. 8)
pp. 22-26

MIPS (million instructions per second) has been the most popular measure of computer speed since its inception, and people use it as the primary basis for comparing the speed of different computers. However, the author contends that using the MIPS year to measure and compare the effort needed to break cryptographic keys has led to some very inaccurate estimates. The MIPS-year measurement has four basic problems:

1. Not all instructions are equal, so how do you really know the MIPS rating of a machine?
2. Not all programs use the same instructions, so different programs can give different measures of speed even for the same machine.
3. The way you measure the MIPS years expended on a single problem by many machines can introduce two possible errors: an inaccurate estimate of machine speed and an inaccurate measurement of time.
4. If you consider only the number of instructions executed, you cannot accurately measure the difficulty of hard problems.
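To make the first error source concrete, here is a minimal sketch of the underlying arithmetic. A MIPS year is the number of instructions a 1-MIPS machine executes in one year, so an effort measured in instructions is converted by dividing by the assumed rating. The instruction count and ratings below are hypothetical, chosen only to show how an inaccurate rating skews the result:

```python
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000 seconds

def mips_years(instructions, mips_rating):
    """Convert a total instruction count into MIPS years
    at a given machine rating (millions of instructions/sec)."""
    return instructions / (mips_rating * 1e6 * SECONDS_PER_YEAR)

total = 3.2e17  # hypothetical instruction count for a key-breaking effort

# The same computation, reported at two assumed ratings:
print(mips_years(total, 100))  # rated at 100 MIPS
print(mips_years(total, 50))   # rated at 50 MIPS: the figure doubles
```

Since the rating appears directly in the denominator, any error in estimating a machine's MIPS rating translates linearly into the reported effort, before the other three problems above are even considered.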

Citation:
Robert D. Silverman, "Exposing the Mythical MIPS Year," Computer, vol. 32, no. 8, pp. 22-26, Aug. 1999, doi:10.1109/2.781630