<p>The author analyzes and models the software development process and presents field experience from large distributed systems. Defect removal is shown to be the bottleneck in achieving the required quality level before a system is deployed in the field. The laws governing defect removal are characterized by constants such as the time to defect detection, the defect repair time, and a factor reflecting the introduction of new defects through imperfect repair. Test coverage serves as a measure of defect removal effectiveness. A birth-death mathematical model based on these constants is developed and used to model field failure report data, and it is contrasted with a more classical decreasing exponential model. Both models indicate that defect removal is not a cost-effective way to achieve quality. Because software defects can remain latent in a system for a long time, defect prevention is argued to be a far more practical route to quality than defect removal.</p>
reliability analysis; defect data modeling; software development; large distributed systems; bottleneck; defect removal; birth-death mathematical model; field failure report data; quality; distributed processing; large-scale systems; program testing; software reliability.
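To illustrate the contrast the abstract draws, the following is a minimal sketch (not the author's actual model) of a discrete-time birth-death defect process, in which imperfect repair reintroduces defects, alongside the classical decreasing exponential model. All function names, rates, and probabilities here are illustrative assumptions.

```python
import math
import random

def simulate_birth_death(n0, detect_rate, bad_fix_prob, steps, seed=0):
    """Toy birth-death defect process: at each step, every latent defect
    is detected with probability `detect_rate` (a "death"), and every
    repair reintroduces a new defect with probability `bad_fix_prob`
    (a "birth" from imperfect repair). Returns the remaining-defect
    count at each step."""
    rng = random.Random(seed)
    remaining = n0
    history = [remaining]
    for _ in range(steps):
        detected = sum(1 for _ in range(remaining)
                       if rng.random() < detect_rate)
        reintroduced = sum(1 for _ in range(detected)
                           if rng.random() < bad_fix_prob)
        remaining = remaining - detected + reintroduced
        history.append(remaining)
    return history

def exponential_model(n0, lam, steps):
    """Classical decreasing exponential: N(t) = N0 * exp(-lam * t)."""
    return [n0 * math.exp(-lam * t) for t in range(steps + 1)]
```

Because imperfect repair feeds defects back into the system, the birth-death trajectory decays more slowly than the pure exponential with the same detection rate, which is one way to see why defect removal alone is slow to reach a given quality level.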

Y. Levendel, "Reliability Analysis of Large Software Systems: Defect Data Modeling," IEEE Transactions on Software Engineering, vol. 16, pp. 141-152, 1990.