2015 IEEE/ACM 12th Working Conference on Mining Software Repositories (MSR)
May 16, 2015 to May 17, 2015
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MSR.2015.31
Ripon K. Saha, Univ. of Texas at Austin, Austin, TX, USA
Julia Lawall, LIP6, Sorbonne Univ., Paris, France
Sarfraz Khurshid, Univ. of Texas at Austin, Austin, TX, USA
Dewayne E. Perry, Univ. of Texas at Austin, Austin, TX, USA
Understanding the severity of reported bugs is important in both research and practice. In particular, a number of recently proposed mining-based software engineering techniques predict bug severity, bug report quality, and bug-fix time based on this information. Many bug tracking systems provide a "severity" field offering options such as "severe", "normal", and "minor", with "normal" as the default. However, there is a widespread perception that for many bug reports the label "normal" does not reflect the actual severity, because reporters may overlook setting the severity or may not feel confident enough to do so. In many cases, researchers simply ignore "normal" bug reports, thereby discarding a large percentage of the available reports; on the other hand, treating them all together risks mixing reports with very diverse properties. In this study, we investigate the extent to which "normal" bug reports actually have the "normal" severity. We find that many bug reports labeled "normal" are in practice not normal. Furthermore, this misclassification can have a significant impact on the accuracy of mining-based tools and studies that rely on bug report severity information.
Computer bugs, Software, Software engineering, Data mining, Training, Accuracy, Noise
R. K. Saha, J. Lawall, S. Khurshid and D. E. Perry, "Are These Bugs Really "Normal"?," 2015 IEEE/ACM 12th Working Conference on Mining Software Repositories (MSR), Florence, Italy, 2015, pp. 258-268.