eSeeTrack—Visualizing Sequential Fixation Patterns
November/December 2010 (vol. 16, no. 6)
pp. 953–962
Hoi Ying Tsang, University of Victoria
Melanie Tory, University of Victoria
Colin Swindells, University of Victoria, Locarna Systems
We introduce eSeeTrack, an eye-tracking visualization prototype that facilitates exploration and comparison of sequential gaze orderings in a static or a dynamic scene. It extends current eye-tracking data visualizations by extracting patterns of sequential gaze orderings, displaying these patterns in a way that does not depend on the number of fixations on a scene, and enabling users to compare patterns from two or more sets of eye-gaze data. Extracting such patterns was very difficult with previous visualization techniques. eSeeTrack combines a timeline and a tree-structured visual representation to embody three aspects of eye-tracking data that users are interested in: duration, frequency and orderings of fixations. We demonstrate the usefulness of eSeeTrack via two case studies on surgical simulation and retail store chain data. We found that eSeeTrack allows ordering of fixations to be rapidly queried, explored and compared. Furthermore, our tool provides an effective and efficient mechanism to determine pattern outliers. This approach can be effective for behavior analysis in a variety of domains that are described at the end of this paper.

[1] A. Aula, P. Majaranta, and K.-J. Räihä, "Eye-Tracking Reveals the Personal Style for Search Result Evaluation", Human-Computer Interaction – INTERACT 2005, M. F. Costabile, and F. Paternò eds., Springer, Berlin, pp. 1058–1061, 2005.
[2] J. Blaas, C.P. Botha, and R.S. Laramee, "Smooth Graphs for Visual Exploration of High-Order State Transitions", IEEE Trans. Visualization and Computer Graphics, 15 (6): 969–976, Nov.-Dec. 2009.
[3] ColorBrewer, http://colorbrewer2.org, 2010.
[4] A.T. Duchowski, "A Breadth-First Survey of Eye-Tracking Applications", Behavior Research Methods, Instruments, and Computers, 34 (4): 455–470, Nov. 2002.
[5] A.T. Duchowski, J. Driver, S. Jolaoso, B.N. Ramey, and A. Robbins, "Scanpath Comparison Revisited", Proc. Eye Tracking Research & Applications, pp. 219–226, 2010.
[6] Y. Egusa, M. Takaku, H. Terai, H. Saito, N. Kando, and M. Miwa, "Visualization of User Eye Movements for Search Result Pages", Proc. 2nd Intl. Workshop on Evaluating Information Access, pp. 42–46, Dec. 2008.
[7] M. Feusner and B. Lukoff, "Testing for statistically significant differences between groups of scan patterns", Proc. Eye Tracking Research & Applications, pp. 43–46, 2008.
[8] J.H. Goldberg and J.I. Helfman, "Scanpath Clustering and Aggregation", Proc. Eye Tracking Research & Applications, pp. 227–234, 2010.
[9] J.H. Goldberg and J.I. Helfman, "Visual Scanpath Representation", Proc. Eye Tracking Research & Applications, pp. 203–210, 2010.
[10] J.H. Goldberg and X.P. Kotval, "Computer interface evaluation using eye movements: methods and constructs", Intl. Journal of Industrial Ergonomics, 24 (6): 631–645, Oct. 1999.
[11] R.I. Hammoud and J.B. Mulligan, "Introduction to Eye Monitoring", Passive Eye Monitoring, Springer, Berlin, pp. 1–19, 2008.
[12] M. Hayhoe and D. Ballard, "Eye movements in natural behavior", Trends in Cognitive Sciences, 9 (4): 188–194, Apr. 2005.
[13] H. Hembrooke, M. Feusner, and G. Gay, "Averaging Scan Patterns and What They Can Tell Us", Proc. Eye Tracking Research & Applications, p. 41, 2006.
[14] K. Itoh, J.P. Hansen, and F.R. Nielsen, "Cognitive Modeling of Ship Navigation Based on Protocol and Eye-Movement Analysis", Le Travail Humain, 61 (2): 99–127, 1998.
[15] R.J.K. Jacob and K.S. Karn, "Eye Tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises", The Mind's Eye: Cognitive and Applied Aspects of Eye-Movement Research, J. Hyönä, R. Radach, and H. Deubel eds., Elsevier, Amsterdam, pp. 573–605, 2003.
[16] H. Jarodzka, and K. Holmqvist, "A Vector-based, Multidimensional Scanpath Similarity Measure", Proc. Eye Tracking Research & Applications, pp. 211–218, 2010.
[17] C. Lankford, "GazeTracker: Software Designed to Facilitate Eye Movement Analysis", Proc. Eye Tracking Research & Applications, pp. 51–55, 2000.
[18] B. Law, M.S. Atkins, A.E. Kirkpatrick, and A.J. Lomax, "Eye gaze patterns differentiate novice and experts in a virtual laparoscopic surgery training environment", Proc. Eye Tracking Research & Applications, pp. 41–48, 2004.
[19] J. Nielsen and K. Pernice, Eyetracking Web Usability, New Riders Press, Berkeley, pp. 9–12, 2010.
[20] M. Pomplun, H. Ritter, and B. Velichkovsky, "Disambiguating Complex Visual Information: Towards Communication of Personal Views of a Scene", Perception, 25: 931–948, 1996.
[21] V. Ponsoda, D. Scott, and J.M. Findlay, "A probability vector and transition matrix analysis of eye movements during visual search", Acta Psychologica, 88 (2): 167–185, 1995.
[22] C.M. Privitera and L. Stark, "Algorithms for Defining Visual Regions of Interest: Comparison with Eye-Fixations", IEEE Trans. Pattern Analysis and Machine Intelligence, 22 (9): 970–982, 2000.
[23] K.-J. Räihä, A. Aula, P. Majaranta, H. Rantala, and K. Koivunen, "Static Visualization of Temporal Eye-Tracking Data", Human-Computer Interaction – INTERACT 2005, M. F. Costabile, and F. Paternò eds., Springer, Berlin, pp. 946–949, 2005.
[24] D.C. Richardson and M.J. Spivey, "Eye Tracking: Characteristics and Methods", Encyclopedia of Biomaterials and Biomedical Engineering, G. Wnek and G.L. Bowlin eds., Informa Healthcare Publisher, New York, pp. 1028–1032, 2004.
[25] O. Špakov and D. Miniotas, "Visualization of Eye Gaze Data using Heat Maps", Electronics and Electrical Engineering, 2: 55–58, 2007.
[26] G. Tien, M.S. Atkins, B. Zheng, and C. Swindells, "Measuring Situation Awareness of Surgeons in Laparoscopic Training", Proc. Eye Tracking Research & Applications, pp. 149–152, 2010.
[27] M. Tory, M.S. Atkins, A.E. Kirkpatrick, M. Nicolaou, and G.-Z. Yang, "Eyegaze Area-of-Interest Analysis of 2D and 3D Combination Displays", Proc. IEEE Visualization, pp. 519–526, Oct. 2005.
[28] E. Vatikiotis-Bateson, I.-M. Eigsti, S. Yano, and K.G. Munhall, "Eye movement of perceivers during audiovisual speech perception", Perception & Psychophysics, 60 (6): 926–940, Aug. 1998.
[29] K. Vrotsou, J. Johansson, and M. Cooper, "ActiviTree: Interactive Visual Exploration of Sequences in Event-Based Data Using Graph Similarity", IEEE Trans. Visualization and Computer Graphics, 15 (6): 945–952, Nov.-Dec. 2009.
[30] C. Ware, Information Visualization: Perception for Design (Morgan Kaufmann Integrative Technologies Series), 2nd edition, Morgan Kaufmann Publishers, California, pp. 123–126, 2004.
[31] M. Wattenberg and F.B. Viégas, "The Word Tree, an Interactive Visual Concordance", IEEE Trans. Visualization and Computer Graphics, 14 (6): 1221–1228, Nov.-Dec. 2008.
[32] J.M. West, A.R. Haake, E.P. Rozanski, and K.S. Karn, "eyePatterns: Software for Identifying Patterns and Similarities Across Fixation Sequences", Proc. Eye Tracking Research & Applications, pp. 149–154, 2006.
[33] D.S. Wooding, "Fixation Maps: Quantifying Eye-movement Traces", Proc. Eye Tracking Research & Applications, pp. 31–36, 2002.

Index Terms:
eye-tracking, fixation pattern, timeline, tree-structured visualization
Hoi Ying Tsang, Melanie Tory, Colin Swindells, "eSeeTrack—Visualizing Sequential Fixation Patterns," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 6, pp. 953-962, Nov.-Dec. 2010, doi:10.1109/TVCG.2010.149