Issue No. 05 - May 2011 (vol. 17)
pp. 570-583
Youn-ah Kang , Georgia Institute of Technology, Atlanta
Carsten Görg , Georgia Institute of Technology, Atlanta
John Stasko , Georgia Institute of Technology, Atlanta
Despite the growing number of systems providing visual analytic support for investigative analysis, few empirical studies of the potential benefits of such systems have been conducted, particularly controlled, comparative evaluations. Determining how such systems foster insight and sensemaking is nonetheless important for their continued growth and study. Furthermore, studies that identify how people use such systems, and why they benefit (or do not), can help inform the design of new systems in this area. We conducted an evaluation of the visual analytics system Jigsaw in a small investigative sensemaking exercise and compared its use to three more traditional methods of analysis. Sixteen participants performed a simulated intelligence analysis task under one of the four conditions. Experimental results suggest that Jigsaw helped participants analyze the data and identify an embedded threat. We describe the different analysis strategies that study participants used and how computational support (or the lack thereof) influenced those strategies. We then illustrate several characteristics of the sensemaking process identified in the study and, based on them, provide design implications for investigative analysis tools. We conclude with recommendations on metrics and techniques for evaluating visual analytics systems for investigative analysis.
Visual analytics, information visualization, evaluation, investigative analysis, user study.
Youn-ah Kang, Carsten Görg, John Stasko, "How Can Visual Analytics Assist Investigative Analysis? Design Implications from an Evaluation," IEEE Transactions on Visualization and Computer Graphics, vol. 17, no. 5, pp. 570-583, May 2011, doi:10.1109/TVCG.2010.84.