Applying Concept Analysis to User-Session-Based Testing of Web Applications
October 2007 (vol. 33 no. 10)
pp. 643-658
Sreedevi Sampath, IEEE Computer Society
Sara Sprenkle
Emily Gibson
Lori Pollock, IEEE Computer Society
Amie Souter Greenwald, IEEE Computer Society
The continuous use of the web for daily operations by businesses, consumers, and the government has created a great demand for reliable web applications. One promising approach to testing the functionality of web applications leverages user-session data collected by web servers. User-session-based testing automatically generates test cases from real user profiles. The key contributions of this paper are the application of concept analysis for clustering user sessions and a set of heuristics for test case selection. Existing incremental concept analysis algorithms are exploited to avoid collecting and maintaining large user-session data sets, thus providing scalability. We have completely automated the process, from user-session collection and test suite reduction through test case replay. Our incremental test suite update algorithm, coupled with our experimental study, indicates that concept analysis provides a promising means for incrementally updating reduced test suites in response to newly captured user sessions, with little loss in fault detection capability and program coverage.
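The clustering idea in the abstract can be sketched informally: treat each user session as the set of URLs it requests, and drop any session whose footprint is strictly contained in another session's, since it adds no new coverage. This containment check is only a rough approximation of the paper's concept-lattice construction and selection heuristics, and all session names and URLs below are invented for illustration.

```python
# Toy sketch of user-session-based test suite reduction.
# Assumption: a "session" is modeled as just its set of requested URLs;
# the paper's actual technique builds a concept lattice over sessions
# and URLs and applies selection heuristics, which this only approximates.

sessions = {
    "s1": {"/login", "/browse"},
    "s2": {"/login", "/browse", "/buy"},
    "s3": {"/login", "/search"},
}

def reduce_suite(sessions):
    """Keep sessions whose URL set is not strictly contained in another
    session's URL set (subsumed sessions add no new URL coverage)."""
    return sorted(
        name for name, urls in sessions.items()
        if not any(urls < other
                   for o, other in sessions.items() if o != name)
    )

def add_session(sessions, name, urls):
    """Incremental update: register one newly captured session and
    recompute the reduced suite, mirroring the paper's goal of updating
    the suite without reprocessing the whole session archive."""
    sessions[name] = set(urls)
    return reduce_suite(sessions)

print(reduce_suite(sessions))  # -> ['s2', 's3'] (s1 is subsumed by s2)
```

A newly captured session that is itself subsumed (say, one requesting only `/login`) leaves the reduced suite unchanged, which is the behavior the incremental update aims to exploit.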


Index Terms:
Software testing, Web applications, User-session-based testing, Test suite reduction, Concept analysis, Incremental test suite reduction
Citation:
Sreedevi Sampath, Sara Sprenkle, Emily Gibson, Lori Pollock, Amie Souter Greenwald, "Applying Concept Analysis to User-Session-Based Testing of Web Applications," IEEE Transactions on Software Engineering, vol. 33, no. 10, pp. 643-658, Oct. 2007, doi:10.1109/TSE.2007.70723