Quality Software, International Conference on (2009)
Aug. 24, 2009 to Aug. 25, 2009
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/QSIC.2009.45
This paper evaluates the ability of novice analysts to understand models specified using a RUP extension for modeling requirements. The evaluation is guided by a theoretical model for IS design methods, the Method Evaluation Model (MEM). In this work, we present the empirical testing of the MEM in the evaluation of a RUP extension for modeling requirements. The testing was conducted through an experiment with 39 novice users. The evaluation's primary goal was to test the users' ability to understand requirements models. The results provide a strong indication that our RUP extension is both easy to use and useful, and that users intend to use the method in the future.
Requirements Engineering, Method Evaluation, Empirical Software Engineering
Silvia Abrahão, Emilio Insfran, Mario Piattini, Marcela Genero, José A. Carsí, "Evaluating the Ability of Novice Analysts to Understand Requirements Models", International Conference on Quality Software (QSIC), pp. 290-295, 2009, doi:10.1109/QSIC.2009.45