PC Software Performance Tuning
Computer, August 1996 (vol. 29, no. 8), pp. 47-54

PC hardware doubles in processing power every two years, or roughly with each new processor generation, at an approximately constant price. Software, however, has not kept pace. Sixteen-bit code developed in the 1980s or early '90s may run 2 to 20 times slower than it could because of I/O bottlenecks such as VGA graphics, artificial data dependencies, poor memory use, obsolete compilers and libraries, and a host of other factors. Software can be designed to scale more readily with growing hardware power, but programmers typically do not profile their code unless it runs "too slowly." Modern compilers produce excellent executables, but developers must choose the best compiler switch settings and make effective use of profilers and optimized runtime libraries. They must also understand the intricacies and idiosyncrasies of the target hardware, in this case the Intel 486, Pentium, and Pentium Pro processors and the new MMX technology, and consider which types of algorithms lend themselves to optimization and which code optimization techniques are most effective. We consider each of these issues before describing a profiling tool called VTune.
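
To make one of these bottlenecks concrete, the C sketch below (ours, not taken from the article; the function names are invented for illustration) shows an artificial data dependency and one common way to relax it. A reduction that funnels every addition through a single accumulator serializes the loop, because each add must wait for the previous result; splitting the work across two independent accumulators lets a pipelined, superscalar processor such as the Pentium overlap the additions. (Since floating-point addition is not associative, the two versions may differ in the last bits of the result.)

    #include <stddef.h>

    /* Serial chain: each add must wait for the previous one to complete. */
    double sum_serial(const double *a, size_t n)
    {
        double s = 0.0;
        size_t i;
        for (i = 0; i < n; i++)
            s += a[i];
        return s;
    }

    /* Two independent accumulators: the adds in one chain can issue while
       the other chain's adds are still in flight, so a superscalar CPU can
       overlap them. */
    double sum_two_chains(const double *a, size_t n)
    {
        double s0 = 0.0, s1 = 0.0;
        size_t i;
        for (i = 0; i + 1 < n; i += 2) {
            s0 += a[i];
            s1 += a[i + 1];
        }
        if (i < n)              /* odd element count: pick up the last one */
            s0 += a[i];
        return s0 + s1;
    }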

Citation:
Mark Atkins, Ramesh Subramaniam, "PC Software Performance Tuning," Computer, vol. 29, no. 8, pp. 47-54, Aug. 1996, doi:10.1109/2.532045