Issue No. 01 - January/February (2010 vol. 27)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MS.2009.154
Chris Verhoef , VU University, Amsterdam
J. Laurenz Eveleens , VU University, Amsterdam
In 1994, the Standish Group published the Chaos report, which showed a shocking 16 percent project success rate. This figure, along with Standish's updated figures, is often cited as evidence that project management of application software development is in trouble. However, Standish's definitions have four major problems. First, they're misleading because they're based solely on the estimation accuracy of cost, time, and functionality. Second, their estimation-accuracy measure is one-sided, leading to unrealistic success rates. Third, steering on their definitions perverts good estimation practice. Fourth, the resulting figures are meaningless because they average numbers with an unknown bias, introduced by different underlying estimation processes. The authors of this article applied Standish's definitions to their own extensive data set of 5,457 forecasts for 1,211 real-world projects, totaling hundreds of millions of euros. The Standish figures didn't reflect the reality of the case studies at all.
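To see why a one-sided accuracy measure is problematic, consider the following minimal sketch. It is an illustrative simplification, not Standish's exact rules: it assumes a project counts as "successful" only when actual cost and time do not exceed the forecast and full functionality is delivered. Under such a one-sided test, a heavily padded forecast always "succeeds", even though the estimate itself was poor.

```python
# Illustrative sketch of a one-sided success test (assumed simplification,
# not Standish's published definitions).

def one_sided_success(forecast_cost, actual_cost,
                      forecast_time, actual_time,
                      functionality_delivered=1.0):
    """True if there is no cost or time overrun and full functionality."""
    return (actual_cost <= forecast_cost
            and actual_time <= forecast_time
            and functionality_delivered >= 1.0)

# A forecast padded by a factor 2.5 still passes the one-sided test:
padded = one_sided_success(forecast_cost=1000, actual_cost=400,
                           forecast_time=12, actual_time=5)

# A symmetric forecast-to-actual ratio f/a would expose the padding,
# since values far above 1.0 signal overestimation just as values far
# below 1.0 signal overruns:
def accuracy_ratio(forecast, actual):
    return forecast / actual

ratio = accuracy_ratio(1000, 400)
print(padded, ratio)
```

The design point is that `one_sided_success` rewards underruns, so steering on it encourages inflated estimates, whereas a two-sided ratio penalizes deviation from the ideal value of 1.0 in either direction.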
Chaos report, Standish Group, forecasting, project success, software, software engineering
Chris Verhoef, J. Laurenz Eveleens, "The Rise and Fall of the Chaos Report Figures", IEEE Software, vol. 27, no. 1, pp. 30-36, January/February 2010, doi:10.1109/MS.2009.154