IEEE Symposium on Visual Languages and Human-Centric Computing (2005)
Sept. 20–24, 2005
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/VLHCC.2005.44
Joseph Lawrance, Microsoft Corporation and Oregon State University
Steven Clarke, Microsoft Corporation
Margaret Burnett, Oregon State University
Gregg Rothermel, University of Nebraska-Lincoln
Despite the long availability of testing tools, professional software developers still lack adequate support for determining how effective their tests are. Without improvements in this area, inadequate testing of software is likely to remain a major problem. To address this problem, industry and researchers have proposed systems that visualize "testedness" for both end-user programmers and professional developers. Empirical studies of such systems for end-user programmers have begun to show success at helping end users write more effective tests. Encouraged by this research, we examined the effect that code coverage visualizations have on the effectiveness of test cases written by professional software developers. This paper presents the results of an empirical study conducted using the code coverage visualizations found in a commercially available programming environment. Our results reveal how this kind of code coverage visualization impacts test effectiveness, and they provide insights into the strategies developers use to test code.
M. Burnett, J. Lawrance, G. Rothermel and S. Clarke, "How Well Do Professional Developers Test with Code Coverage Visualizations? An Empirical Study," Proceedings of the 2005 IEEE Symposium on Visual Languages and Human-Centric Computing (VLHCC), Dallas, TX, USA, 2005, pp. 53-60.