Identifying High Performance ERP Projects
May 2003 (vol. 29, no. 5)
pp. 398-416

Abstract—Learning from high performance projects is crucial for software process improvement, so we need to identify outstanding projects that may serve as role models. Productivity is commonly measured as an indicator of performance, and it is vital that productivity measurements deal correctly with variable returns to scale and with multivariate data: software projects generally exhibit variable returns to scale, and the output from ERP projects is multivariate. We propose using Data Envelopment Analysis with Variable Returns to Scale (DEA VRS) to measure the productivity of software projects; DEA VRS fulfills both requirements. The results from this empirical study of 30 ERP projects, extracted from a benchmarking database in Accenture, identified six projects as potential role models. These projects deserve to be studied, and probably copied, as part of a software process improvement initiative. The results also suggest that there is, on average, a 50 percent potential for productivity improvement, and they support the assumption of variable returns to scale in ERP projects. We recommend that DEA VRS be used as the default technique for appropriate productivity comparisons of individual software projects. Used together with methods for hypothesis testing, DEA VRS is also a useful technique for assessing the effect of alleged process improvements.
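The DEA VRS measurement the abstract proposes reduces to one small linear program per project: the input-oriented VRS (BCC) model finds the smallest factor theta by which a project's inputs could be scaled down while a convex combination of the observed projects still matches its outputs. A project with theta = 1 lies on the efficient frontier (a candidate role model). The sketch below, under stated assumptions, uses SciPy's LP solver; the project data are hypothetical illustrations, not taken from the paper's Accenture database.

```python
# Minimal sketch of the input-oriented DEA VRS (BCC) model, solved with
# SciPy's linear-programming routine.  Project data are hypothetical.
import numpy as np
from scipy.optimize import linprog

def dea_vrs_scores(inputs, outputs):
    """Input-oriented VRS efficiency score (theta) for each project.

    inputs:  (n_projects, n_inputs)  e.g. effort in person-hours
    outputs: (n_projects, n_outputs) e.g. function points, number of users
    """
    X = np.asarray(inputs, dtype=float)
    Y = np.asarray(outputs, dtype=float)
    n = X.shape[0]
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1 .. lambda_n]
        c = np.r_[1.0, np.zeros(n)]                     # minimise theta
        # Inputs:  sum_j lambda_j * x_j <= theta * x_o
        A_in = np.hstack([-X[[o]].T, X.T])
        b_in = np.zeros(X.shape[1])
        # Outputs: sum_j lambda_j * y_j >= y_o  (as <= after negation)
        A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
        b_out = -Y[o]
        # VRS convexity constraint: sum_j lambda_j = 1
        A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)
        res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[b_in, b_out],
                      A_eq=A_eq, b_eq=[1.0],
                      bounds=[(None, None)] + [(0, None)] * n,
                      method="highs")
        scores.append(res.x[0])
    return scores

# Hypothetical ERP projects: one input (effort), two outputs
# (function points, number of users served).
effort  = [[100], [200], [300], [400], [150]]
outputs = [[50, 10], [150, 30], [200, 40], [220, 45], [60, 12]]
print([round(s, 3) for s in dea_vrs_scores(effort, outputs)])
```

Frontier projects score 1.0; an inefficient project's score is the fraction of its effort that a best-practice convex combination of peers would have needed to deliver the same outputs.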

[1] S.N. Afriat, “Efficiency Estimation of Production Functions,” Int'l Economic Rev., vol. 13, pp. 568-598, 1972.
[2] A.J. Albrecht and J.E. Gaffney Jr., “Software Function, Source Lines of Code, and Development Effort Prediction: A Software Science Validation,” IEEE Trans. Software Eng., vol. 9, no. 6, pp. 639-648, 1983.
[3] P. Andersen and N.C. Petersen, “A Procedure for Ranking Efficient Units in Data Envelopment Analysis,” Management Science, vol. 39, no. 10, pp. 1261-1264, 1993.
[4] R.D. Banker, “Maximum Likelihood, Consistency and Data Envelopment Analysis: A Statistical Foundation,” Management Science, vol. 39, no. 10, pp. 1265-1273, 1993.
[5] R.D. Banker, H. Chang, and C.F. Kemerer, “Evidence on Economies of Scale in Software Development,” Information and Software Technology, vol. 36, no. 5, pp. 275-282, 1994.
[6] R.D. Banker, A. Charnes, and W.W. Cooper, “Some Models for Estimating Technical and Scale Inefficiencies in Data Envelopment Analysis,” Management Science, vol. 30, no. 9, pp. 1078-1092, 1984.
[7] R.D. Banker, S.M. Datar, and C.F. Kemerer, “A Model to Evaluate Variables Impacting the Productivity of Software Maintenance Projects,” Management Science, vol. 37, pp. 1-18, Jan. 1991.
[8] R.D. Banker and C.F. Kemerer, “Scale Economies in New Software Development,” IEEE Trans. Software Eng., vol. 15, no. 10, pp. 1199-1205, 1989.
[9] V. Basili, L. Briand, and W. Melo, “Measuring the Impact of Reuse on Quality and Productivity in Object-Oriented Systems,” Comm. ACM, vol. 39, no. 10, 1996.
[10] C.A. Behrens, “Measuring the Productivity of Computer Systems Development Activities with Function Points,” IEEE Trans. Software Eng., vol. 9, no. 6, pp. 648-652, 1983.
[11] M.L. Berenson and D.M. Levine, Basic Business Statistics—Concepts and Applications, pp. 889-890. Prentice-Hall, 1999.
[12] J.D. Blackburn, G.D. Scudder, and L.N. Van Wassenhove, “Improving Speed and Productivity of Software Development: A Global Survey of Software Developers,” IEEE Trans. Software Eng., vol. 22, no. 12, pp. 875-885, Dec. 1996.
[13] B. Boehm, Software Engineering Economics, Prentice Hall, Upper Saddle River, N.J., 1981, pp. 533-535.
[14] B. Boehm, B. Clark, E. Horowitz, C. Westland, R. Madachy, and R. Selby, “The COCOMO 2.0 Software Cost Estimation Model: A Status Report,” American Programmer, pp. 2-17, 1996.
[15] L.C. Briand, K. El-Emam, and F. Bomarius, “COBRA: A Hybrid Method for Software Cost Estimation, Benchmarking and Risk Assessment,” Proc. 20th Int'l Conf. Software Eng., pp. 390-399, 1998.
[16] F.P. Brooks Jr., The Mythical Man-Month (20th Anniversary Edition). Reading, Mass.: Addison-Wesley, 1995.
[17] G. Schneider and J.P. Winters, Applying Use Cases: A Practical Guide, Addison Wesley Longman, Reading, Mass., 1998.
[18] A. Charnes, W.W. Cooper, and E. Rhodes, “Measuring the Efficiency of Decision Making Units,” European J. Operational Research, vol. 2, pp. 429-444, 1978.
[19] J. Doyle and R. Green, “Strategic Choice and Data Envelopment Analysis: Comparing Computers across Many Attributes,” J. Information Technology, vol. 9, no. 1, pp. 61-69, 1994.
[20] D.M. Fisher and D.B. Sun, “LAN-Based E-Mail: Software Evaluation,” J. Computer Information Systems, vol. 36, no. 1, pp. 21-25, 1995-1996.
[21] F.R. Førsund and L. Hjalmarson, “Generalised Farrell Measures of Efficiency: An Application to Milk Processing in Swedish Dairy Plants,” The Economic J., vol. 89, pp. 294-315, 1979.
[22] R. Jeffery, M. Ruhe, and I. Wieczorek, “Using Public Domain Metrics to Estimate Software Development Effort,” Proc. Seventh IEEE Int'l Metrics Symp., 2001.
[23] C.F. Kemerer, “Reliability of Function Points Measurement: A Field Experiment,” Comm. ACM, vol. 36, no. 2, pp. 85-97, Feb. 1993.
[24] M.A. Mahmood, “Evaluating Organisational Efficiency Resulting from Information Technology Investment: an Application of Data Envelopment Analysis,” Information Systems J., vol. 4, no. 2, pp. 93-115, 1994.
[25] Minitab Statistical Software, release 13, 2000.
[26] S. Moser and O. Nierstrasz, “The Effect of Object-Oriented Frameworks on Developer Productivity,” Computer, pp. 45-51, Sept. 1996.
[27] I. Myrtveit and E. Stensrud, “Benchmarking COTS Projects Using Data Envelopment Analysis,” Proc. METRICS'99, pp. 269-278, 1999.
[28] C. Parkan, K. Lam, and G. Hang, “Operational Competitiveness Analysis on Software Development,” J. Operational Research Soc., vol. 48, no. 9, pp. 892-905, 1997.
[29] C.H. Sackman, W.J. Erikson, and E.E. Grant, "Exploratory Experimental Studies Comparing Online and Offline Programming Performance," Comm. ACM, vol. 11, no. 1, Jan. 1968, pp. 3-11.
[30] M.J. Shepperd and C. Schofield, “Estimating Software Project Effort Using Analogies,” IEEE Trans. Software Eng., vol. 23, pp. 736-743, 1997.
[31] E. Stensrud, T. Foss, B. Kitchenham, and I. Myrtveit, “An Empirical Validation of the Relationship between the Magnitude of Relative Error and Project Size,” Proc. METRICS'02, pp. 3-12, 2002.
[32] E. Stensrud and I. Myrtveit, “The Added Value of Estimation by Analogy—An Industrial Experiment,” Proc. FESMA'98, pp. 549-556, 1998.
[33] E. Stensrud and I. Myrtveit, “Human Performance Estimating with Analogy and Regression Models: An Empirical Validation,” Proc. METRICS'98, pp. 205-213, 1998.
[34] C. Stolp, “Strengths and Weaknesses of Data Envelopment Analysis: An Urban and Regional Perspective,” Computers, Environments and Urban Systems, vol. 14, pp. 103-116, 1990.
[35] S. Thore, F. Phillips, T.W. Ruefli, and P. Yue, “DEA and the Management of the Product Cycle: The U.S. Computer Industry,” Computers and Operational Research, vol. 23, no. 4, pp. 341-356, 1996.
[36] J. Tobin, “Estimation of Relationships for Limited Dependent Variables,” Econometrica, vol. 26, pp. 24-36, 1958.
[37] A.M. Torgersen, F.R. Førsund, and S.A.C. Kittelsen, “Slack Adjusted Efficiency Measures and Ranking of Efficient Units,” J. Productivity Analysis, vol. 7, pp. 379-398, 1996.
[38] G.M. Weinberg, The Psychology of Computer Programming, Van Nostrand Reinhold, New York, 1971.

Index Terms:
Software process improvement, benchmarking, best practice identification, software project management, multivariate productivity measurements, data envelopment analysis (DEA), software development, enterprise resource planning (ERP), software metrics, economies of scale, variable returns to scale.
Erik Stensrud, Ingunn Myrtveit, "Identifying High Performance ERP Projects," IEEE Transactions on Software Engineering, vol. 29, no. 5, pp. 398-416, May 2003, doi:10.1109/TSE.2003.1199070