IEEE Computer Graphics and Applications, vol. 19, no. 6 (November/December 1999), pp. 51-59
The ever-increasing power of computers and hardware rendering systems has, to date, primarily motivated the creation of visually rich and perceptually realistic virtual environment (VE) applications. Comparatively little effort has been devoted to the user interaction components of VEs. As a result, VE user interfaces are often poorly designed and rarely evaluated with users. Although usability engineering is an emerging facet of VE development, user-centered design and evaluation in VEs still lags far behind what is needed.

In this article we present a structured, iterative methodology for the user-centered design and evaluation of VE user interaction. Figure 1 illustrates our basic approach: we recommend performing (1) a user task analysis, followed by (2) expert guidelines-based evaluation, (3) formative user-centered evaluation, and finally (4) summative comparative evaluation. We first give some motivation and background for our methodology and then describe each of these techniques in detail. We then discuss how we applied the techniques to a real-world battlefield visualization virtual environment, and finally explain why this approach provides a cost-effective strategy for assessing and iteratively improving user interaction in VEs.
Joseph L. Gabbard, Deborah Hix, J. Edward Swan, "User-Centered Design and Evaluation of Virtual Environments," IEEE Computer Graphics and Applications, vol. 19, no. 6, pp. 51-59, November/December 1999, doi:10.1109/38.799740