Issue No. 4 - July/August 2008 (vol. 34), pp. 471-484
Jacky Wai Keung, National ICT Australia Ltd. and The University of New South Wales, Sydney
Barbara A. Kitchenham, National ICT Australia, Sydney
David Ross Jeffery, National ICT Australia Ltd. and The University of New South Wales, Sydney
ABSTRACT
Data-intensive analogy has been proposed as a means of software cost estimation and as an alternative to other data-intensive methods such as linear regression. Unfortunately, the method has drawbacks: there is no mechanism to assess its appropriateness for a specific dataset, and heuristic algorithms are needed to select the best set of variables and to identify abnormal project cases. We introduce Analogy-X, a solution to these problems based on the Mantel correlation randomization test. Analogy-X uses the strength of correlation between the distance matrix of the project features and the distance matrix of the known effort values in the dataset. The method is demonstrated using the Desharnais dataset and two random datasets, showing (1) the use of Mantel's correlation to identify whether analogy is appropriate, (2) a stepwise procedure for feature selection, and (3) the use of a leverage statistic for sensitivity analysis that detects abnormal data points. Analogy-X thus provides a sound statistical basis for analogy, removes the need for heuristic search, and greatly improves its algorithmic performance.
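To make the core idea concrete, the sketch below shows a minimal Mantel randomization test applied to a project-feature distance matrix and an effort distance matrix, the statistic on which Analogy-X is built. This is not the authors' implementation (per the references, the paper's analysis used R with the ADE4 and vegan packages); the function name mantel_test, the Euclidean metric, the permutation count, and the random example data are illustrative assumptions. The random example data echoes, very loosely, the paper's use of random datasets as a negative check.

import numpy as np
from scipy.spatial.distance import pdist, squareform

def mantel_test(features, effort, n_perm=999, seed=None):
    """Correlate feature distances with effort distances and estimate a
    one-sided permutation p-value. A significant positive correlation
    suggests that similar projects have similar effort, i.e. that
    analogy-based estimation is appropriate for the dataset."""
    rng = np.random.default_rng(seed)
    d_x = squareform(pdist(np.asarray(features), metric="euclidean"))  # feature distances
    d_y = squareform(pdist(np.asarray(effort).reshape(-1, 1)))         # effort distances
    n = d_x.shape[0]
    tri = np.triu_indices(n, k=1)                    # off-diagonal entries only

    r_obs = np.corrcoef(d_x[tri], d_y[tri])[0, 1]    # observed Mantel correlation

    exceed = 0
    for _ in range(n_perm):
        perm = rng.permutation(n)
        d_perm = d_y[np.ix_(perm, perm)]             # permute rows and columns together
        if np.corrcoef(d_x[tri], d_perm[tri])[0, 1] >= r_obs:
            exceed += 1
    p_value = (exceed + 1) / (n_perm + 1)            # randomization p-value
    return r_obs, p_value

# Example with random data, where no relationship is expected
# and the test should not report significance:
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))        # 30 projects, 4 features
y = rng.lognormal(size=30)          # effort values
r, p = mantel_test(X, y, n_perm=999, seed=1)
print(f"Mantel r = {r:.3f}, p = {p:.3f}")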
INDEX TERMS
Cost estimation, Management, Statistical methods, Software Engineering
CITATION
Jacky Wai Keung, Barbara A. Kitchenham, David Ross Jeffery, "Analogy-X: Providing Statistical Inference to Analogy-Based Software Cost Estimation", IEEE Transactions on Software Engineering, vol. 34, no. 4, pp. 471-484, July/August 2008, doi:10.1109/TSE.2008.34
REFERENCES
[1] C. Schofield, “Software Support for Cost Estimation by Analogy,” Proc. Sixth European Software Cost Modeling Conf., 1995.
[2] C. Schofield and M.J. Shepperd, “Effort Estimation by Analogy: A Case Study,” Proc. Seventh European Software Control and Metrics Conf., 1996.
[3] M.J. Shepperd, C. Schofield, and B. Kitchenham, “Effort Estimation Using Analogy,” Proc. 18th Int'l Conf. Software Eng., 1996.
[4] B.W. Boehm, “Software Engineering Economics,” IEEE Trans. Software Eng., vol. 10, pp. 4-21, 1984.
[5] M.J. Shepperd and C. Schofield, “Estimating Software Project Effort Using Analogies,” IEEE Trans. Software Eng., vol. 23, pp. 736-743, 1997.
[6] T. Mukhopadhyay, S. Vicinanza, and M.J. Prietula, “Estimating the Feasibility of a Case-Based Reasoning Model for Software Effort Estimation,” MIS Quarterly, vol. 16, pp. 155-171, 1992.
[7] K. Atkinson and M.J. Shepperd, “The Use of Function Points to Find Cost Analogies,” Proc. European Software Cost Modeling Meeting, 1994.
[8] E. Stensrud and I. Myrtveit, “The Added Value of Estimation by Analogy: An Industrial Experiment,” Proc. Ninth European Software Control and Metrics Conf., 1998.
[9] F. Walkerden, “An Empirical Study of Analogy-Based Software Effort Estimation,” Empirical Software Eng., vol. 4, pp. 135-158, 1999.
[10] L. Angelis and I. Stamelos, “A Simulation Tool for Efficient Analogy Based Cost Estimation,” Empirical Software Eng., vol. 5, pp. 35-68, 2000.
[11] E. Mendes, N. Mosley, and S. Counsell, “Early Web Size Measures and Effort Prediction for Web Costimation,” Proc. Ninth Int'l Software Metrics Symp., pp. 18-39, 2003.
[12] E. Mendes and B. Kitchenham, “Further Comparison of Cross-Company and Within-Company Effort Estimation Models for Web Applications,” Proc. 10th Int'l Software Metrics Symp., pp. 348-357, 2004.
[13] I. Myrtveit and E. Stensrud, “A Controlled Experiment to Assess the Benefits of Estimating with Analogy and Regression Models,” IEEE Trans. Software Eng., vol. 25, pp. 510-525, 1999.
[14] L.C. Briand, K. El Emam, D. Surmann, I. Wieczorek, and K.D. Maxwell, “An Assessment and Comparison of Common Software Cost Estimation Modeling Techniques,” Proc. 21st Int'l Conf. Software Eng., pp. 313-323, 1999.
[15] R. Jeffery, M. Ruhe, and I. Wieczorek, “Using Public Domain Metrics to Estimate Software Development Effort,” Proc. Seventh Int'l Software Metrics Symp., 2001.
[16] C. Mair and M. Shepperd, “The Consistency of Empirical Comparisons of Regression and Analogy-Based Software Project Cost Prediction,” Proc. Fourth Int'l Symp. Empirical Software Eng., pp. 491-500, 2005.
[17] M.J. Shepperd and G. Kadoda, “Using Simulation to Evaluate Prediction Techniques,” Proc. Seventh Int'l Software Metrics Symp., 2001.
[18] M.J. Shepperd and G. Kadoda, “Comparing Software Prediction Techniques Using Simulation,” IEEE Trans. Software Eng., vol. 27, no. 11, pp. 1014-1022, Nov. 2001.
[19] M. Jorgensen and D. Sjoberg, “Expert Estimation of Software Development Work,” Software Evolution and Feedback: Theory and Practice. Wiley, 2006.
[20] M. Jorgensen, “A Review of Studies on Expert Estimation of Software Development Effort,” J. Systems and Software, vol. 70, pp. 37-60, 2004.
[21] M. Jorgensen, “Practical Guidelines for Expert-Judgement-Based Software Effort Estimation,” IEEE Software, vol. 22, pp. 57-63, 2005.
[22] M. Jorgensen, “Estimation of Software Development Work Effort: Evidence on Expert Judgement and Formal Models,” Int'l J. Forecasting, 2007.
[23] C. Kirsopp, M. Shepperd, and J. Hart, “Search Heuristics, Case-Based Reasoning and Software Project Effort Prediction,” Proc. Genetic and Evolutionary Computation Conf., pp. 1367-1374, 2002.
[24] J. Li, G. Ruhe, A. Al-Emran, and M.M. Richter, “A Flexible Method for Software Effort Estimation by Analogy,” Empirical Software Eng., http://www.kluweronline.com/issn1382-3256, Apr. 2006.
[25] T. Foss, E. Stensrud, B. Kitchenham, and I. Myrtveit, “A Simulation Study of the Model Evaluation Criterion MMRE,” IEEE Trans. Software Eng., vol. 29, pp. 985-995, 2003.
[26] G.F. Kadoda, M. Cartwright, and M.J. Shepperd, “Issues on the Effective Use of CBR Technology for Software Project Prediction,” Proc. Fourth Int'l Conf. Case-Based Reasoning: Case-Based Reasoning Research and Development, pp. 276-290, 2001.
[27] N. Mantel, “The Detection of Disease Clustering and a Generalized Regression Approach,” Cancer Research, vol. 27, pp. 209-220, 1967.
[28] B.F.J. Manly, Randomization, Bootstrap and Monte Carlo Methods in Biology, second ed. Chapman & Hall/CRC, 1997.
[29] P. Legendre and L. Legendre, Numerical Ecology, second ed. Elsevier, 1998.
[30] B.F.J. Manly, Multivariate Statistical Methods—A Primer, second ed. Chapman & Hall/CRC, 1998.
[31] F.H.C. Marriott, “Barnard's Monte Carlo Tests: How Many Simulations?” Applied Statistics, vol. 28, 1979.
[32] R-Project, “The R Project for Statistical Computing,” http://www.r-project.org, 2005.
[33] ADE4, “Ecological Data Analysis (ADE4) Package for R,” http://pbil.univ-lyon1.fr/ADE-4/, 2004.
[34] VEGAN, “Vegan: R Functions for Community Ecology,” http://cc.oulu.fi/~jarioksa/softhelp/vegan.html, 2004.
[35] J.W. Tukey, “Accurate Confidence Interval for the Ratio of Specific Occurrence/Exposure Rates in Risk and Survival Analysis,” Biometrical J., vol. 37, p. 611, 1958.
[36] B. Efron and G. Gong, “A Leisurely Look at the Bootstrap, the Jackknife, and Cross-Validation,” The Am. Statistician, vol. 37, pp. 36-48, 1983.
[37] N.R. Draper and H. Smith, Applied Regression Analysis, third ed. John Wiley and Sons, 1998.