Studying the Fault-Detection Effectiveness of GUI Test Cases for Rapidly Evolving Software
October 2005 (vol. 31 no. 10)
pp. 884-896
Atif M. Memon, IEEE Computer Society
Qing Xie, IEEE
Software is increasingly developed and maintained by multiple, often geographically distributed developers working concurrently. Consequently, rapid-feedback quality assurance mechanisms such as daily builds and smoke regression tests, which help detect and eliminate defects early during software development and maintenance, have become important. This paper addresses a major weakness of current smoke regression testing techniques: their inability to automatically (re)test graphical user interfaces (GUIs). Several contributions are made to the area of GUI smoke testing. First, the requirements for GUI smoke testing are identified and a GUI smoke test is formally defined as a specialized sequence of events. Second, a GUI smoke regression testing process called Daily Automated Regression Tester (DART), which automates GUI smoke testing, is presented. Third, the interplay among several characteristics of GUI smoke test suites, including their size, fault-detection ability, and test oracles, is studied empirically. The results show that 1) the entire smoke testing process is feasible in terms of execution time, storage space, and manual effort; 2) smoke tests cannot cover certain parts of the application code; 3) comprehensive test oracles can compensate for short smoke test cases; and 4) certain oracles can compensate for small smoke test suites.
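The abstract's central definition, a GUI smoke test as a specialized sequence of events checked against a test oracle, can be sketched as follows. This is a minimal illustrative model, not the paper's actual DART implementation; the `Event`, `SmokeTest`, and `run_smoke_test` names, and the representation of the oracle as an expected widget-state mapping, are assumptions made for the example.

```python
# Hypothetical sketch: a GUI smoke test modeled as an ordered sequence of
# events, with a test oracle given as expected widget states after replay.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Event:
    widget: str   # e.g. "FileMenu"
    action: str   # e.g. "click"

@dataclass
class SmokeTest:
    events: list                                # ordered event sequence
    oracle: dict = field(default_factory=dict)  # expected state after replay

def run_smoke_test(test, execute, observe):
    """Replay each event on the GUI, then compare the observed state
    to the oracle. Returns a dict of mismatches; empty means pass."""
    for event in test.events:
        execute(event)                          # drive the GUI
    actual = observe()                          # capture resulting state
    return {k: (expected, actual.get(k))
            for k, expected in test.oracle.items()
            if actual.get(k) != expected}
```

A richer oracle (more widget properties checked per event) detects more faults per test case, which is the trade-off the paper's empirical study quantifies.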

Index Terms:
Smoke testing, GUI testing, test oracles, empirical studies, regression testing.
Citation:
Atif M. Memon, Qing Xie, "Studying the Fault-Detection Effectiveness of GUI Test Cases for Rapidly Evolving Software," IEEE Transactions on Software Engineering, vol. 31, no. 10, pp. 884-896, Oct. 2005, doi:10.1109/TSE.2005.117