<p>Applying equal testing and verification effort to all parts of a software system is inefficient, especially when resources are tight. One therefore needs to identify low- and high-fault-frequency components so that testing and verification effort can be concentrated where it is needed most. Such a strategy is expected to detect more faults and thus improve the reliability of the overall system. The authors present the Optimized Set Reduction approach for constructing such models, which is intended to fulfill specific software engineering needs. The approach is to measure the software system and build multivariate stochastic models for predicting high-risk system components. Experimental results obtained by classifying Ada components into two classes (is, or is not, likely to generate faults during system and acceptance testing) are presented. The accuracy of the model and the insights it provides into the error-making process are evaluated.</p>
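The classification task the abstract describes (flagging components likely to be fault-prone from measurement data) can be illustrated with a minimal sketch. Note this is not the paper's Optimized Set Reduction technique, which extracts logical patterns from measurement data; the sketch below stands in a generic multivariate classifier, and all metric names, weights, and thresholds are hypothetical.

```python
# Illustrative sketch only: flag likely high-risk components from
# component metrics with a hand-rolled logistic model. This is NOT
# the paper's Optimized Set Reduction method; metric names, weights,
# and the decision threshold are hypothetical.
import math

def predict_high_risk(metrics, weights, bias, threshold=0.5):
    """Return True if the component is classified as high-risk."""
    z = bias + sum(weights[k] * metrics[k] for k in weights)
    p = 1.0 / (1.0 + math.exp(-z))  # estimated probability of fault-proneness
    return p >= threshold

# Hypothetical per-component measurements (size, cyclomatic complexity).
weights = {"loc": 0.004, "cyclomatic": 0.15}
bias = -2.0

small = {"loc": 100, "cyclomatic": 3}   # small, simple component
large = {"loc": 900, "cyclomatic": 25}  # large, complex component

print(predict_high_risk(small, weights, bias))  # → False (low risk)
print(predict_high_risk(large, weights, bias))  # → True (high risk)
```

With a model of this kind, components predicted high-risk would receive a larger share of the testing and verification budget, which is the resource-allocation strategy the abstract motivates.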
high-risk software components; testing effort; verification effort; optimized set reduction approach; multivariate stochastic model; classifying Ada components; error-making process; program testing; program verification; software reliability

V. Basili, C. Hetmanski and L. Briand, "Developing Interpretable Models with Optimized Set Reduction for Identifying High-Risk Software Components," in IEEE Transactions on Software Engineering, vol. 19, pp. 1028-1044, 1993.