Issue No. 04, April 1998 (vol. 31)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/2.666844
Mission-critical systems must use reliable software. However, assuring software reliability often entails costly and time-consuming development processes. Software quality models can mitigate these costs by predicting module reliability early on, which lets developers focus improvement efforts on the modules that need the most attention. Many software quality models use only product metrics, such as lines of code or McCabe cyclomatic complexity. This product focus assumes that all modules have a similar process history. For systems that evolve, this assumption does not hold: modules with similar product measurements may differ in quality because of different development histories. For example, a reused module with many changes is likely to have more faults than a similar module with few changes. The authors have developed a quality model based solely on process-history variables. Their study posits that a module's history prior to integration can help predict the likelihood of fault discovery during integration and test. Such module reliability predictions can be used to focus review, integration, and testing resources on high-risk areas of a system. They report their findings in a case study involving the Joint Surveillance Target Attack Radar System, an embedded, real-time military system developed by Northrop Grumman for the US Air Force in support of the US Army.
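To make the idea concrete, the following is a minimal sketch (not the authors' model) of how process-history variables could feed a module-level risk classifier. The module names, features (changes before integration, distinct developers, reuse flag), and logistic coefficients are all hypothetical, chosen only for illustration; a real model would fit such coefficients to historical fault data.

```python
import math

# Hypothetical process-history data per module (invented for illustration):
# (changes_before_integration, distinct_developers, reused_from_prior_release)
modules = {
    "nav_filter":   (14, 5, 1),
    "msg_router":   (2, 1, 1),
    "display_core": (7, 3, 0),
}

# Assumed logistic-regression coefficients; in practice these would be
# estimated from past releases, not hand-picked.
WEIGHTS = (0.25, 0.40, -0.50)
BIAS = -3.0

def fault_risk(features):
    """Estimated probability of fault discovery during integration/test."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))

def high_risk(mods, threshold=0.5):
    """Names of modules whose predicted risk exceeds the threshold."""
    return sorted(name for name, f in mods.items() if fault_risk(f) > threshold)

print(high_risk(modules))  # the heavily-changed module is flagged as high risk
```

The point of the sketch is the inputs, not the classifier: every predictor is a process-history variable, so two modules with identical size and complexity can still receive very different risk estimates, which is exactly the gap the article argues product-only models leave open.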
Taghi M. Khoshgoftaar, Edward B. Allen, Robert Halstead, Gary P. Trio, Ronald M. Flass, "Using Process History to Predict Software Quality", Computer, vol.31, no. 4, pp. 66-72, April 1998, doi:10.1109/2.666844