Issue No. 4 - July/August 2009 (vol. 35)
pp. 534-550
Chris F. Kemerer , University of Pittsburgh, Pittsburgh
This research investigates the effect of review rate on defect removal effectiveness and the quality of software products, while controlling for a number of potential confounding factors. Two data sets of 371 and 246 programs, respectively, collected under the Personal Software Process (PSP) were analyzed using both regression and mixed models. Review activities in the PSP are the steps an individual developer performs that correspond to a traditional inspection process. The results show that the PSP review rate is a significant factor affecting defect removal effectiveness, even after accounting for developer ability and other significant process variables. The recommended review rate of 200 LOC/hour or less was found to be effective for individual reviews, identifying nearly two-thirds of the defects in design reviews and more than half of the defects in code reviews.
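The two quantities the abstract relies on, review rate and defect removal effectiveness, are simple ratios. A minimal sketch with hypothetical function names and example data, assuming review rate is LOC reviewed per review hour and effectiveness is the fraction of defects present at review entry that the review removes:

```python
def review_rate(loc_reviewed: float, review_hours: float) -> float:
    """Review rate in LOC/hour; the paper recommends 200 or less."""
    return loc_reviewed / review_hours

def defect_removal_effectiveness(defects_found: int, defects_present: int) -> float:
    """Fraction of the defects present at review entry that the review removed."""
    return defects_found / defects_present

# Hypothetical example: a 150-LOC program reviewed for one hour,
# finding 6 of the 9 defects present at the start of the review.
rate = review_rate(150, 1.0)                  # 150 LOC/hour, within the recommended limit
eff = defect_removal_effectiveness(6, 9)      # about two-thirds, the design-review figure
print(rate, round(eff, 2))
```

The 200 LOC/hour threshold then becomes a simple check (`rate <= 200`) when screening individual reviews.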
Code reviews, design reviews, inspections, software process, software quality, defects, software measurement, mixed models, personal software process (PSP).
Chris F. Kemerer, "The Impact of Design and Code Reviews on Software Quality: An Empirical Study Based on PSP Data," IEEE Transactions on Software Engineering, vol. 35, no. 4, pp. 534-550, July/August 2009, doi:10.1109/TSE.2009.27