Issue No. 12 - Dec. 2011 (vol. 17)
pp. 2498-2507
David Lloyd , giCentre, City University London
Jason Dykes , giCentre, City University London
ABSTRACT
Working with three domain specialists, we investigate human-centered (HC) approaches to geovisualization following an ISO 13407 taxonomy covering context of use, requirements and early stages of design. Our case study, undertaken over three years, draws attention to repeating trends: that generic approaches fail to elicit adequate requirements for geovis application design; that the use of real data is key to understanding needs and possibilities; that trust and knowledge must be built and developed with collaborators. These processes take time, but modified human-centered approaches can be effective. A scenario developed through contextual inquiry but supplemented with domain data and graphics is useful to geovis designers. Wireframe, paper and digital prototypes enable successful communication between specialist and geovis domains when incorporating real and interesting data, prompting exploratory behaviour and eliciting previously unconsidered requirements. Paper prototypes are particularly successful at eliciting suggestions, especially for novel visualization. Enabling specialists to explore their data freely with a digital prototype is as effective as using a structured task protocol and is easier to administer. Autoethnography has potential for framing the design process. We conclude that a common understanding of context of use, domain data and visualization possibilities is essential to successful geovis design and develops as design progresses. HC approaches can make a significant contribution here. However, modified approaches, applied with flexibility, are most promising. We advise early, collaborative engagement with data – through simple, transient visual artefacts supported by data sketches and existing designs – before moving to successively more sophisticated data wireframes and data prototypes.
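To make the closing advice concrete, the lines below are a minimal, hypothetical sketch (not from the paper) of the kind of simple, transient "data sketch" the abstract advocates. Python with pandas and matplotlib is used here purely for convenience; the file name, column names and values are illustrative assumptions, and the paper itself cites toolkits such as Processing [50] and Protovis [7] for this kind of rapid work.

# A minimal, hypothetical "data sketch": one quick, disposable chart made
# with the specialist's real data before any wireframe or prototype.
# File name, column names and values below are illustrative assumptions.
import pandas as pd
import matplotlib.pyplot as plt

# In practice, load the real domain data, e.g.:
# df = pd.read_csv("indicator_by_ward.csv")
df = pd.DataFrame({
    "ward": ["A", "B", "C", "D", "E"],
    "indicator": [12.4, 9.1, 15.7, 7.3, 11.0],
})

# One transient, unpolished view for discussion with the domain specialist.
df.sort_values("indicator").plot.barh(x="ward", y="indicator", legend=False)
plt.xlabel("indicator value (units as supplied by the specialist)")
plt.title("Data sketch: indicator by ward (illustrative values)")
plt.tight_layout()
plt.show()

The value of such a sketch lies in its disposability: it exists only to prompt discussion of needs and possibilities around real data before more sophisticated data wireframes and data prototypes are built.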
INDEX TERMS
Evaluation, geovisualization, context of use, requirements, field study, prototypes, sketching, design.
CITATION
David Lloyd, Jason Dykes, "Human-Centered Approaches in Geovisualization Design: Investigating Multiple Methods Through a Long-Term Case Study", IEEE Transactions on Visualization & Computer Graphics, vol. 17, no. 12, pp. 2498-2507, Dec. 2011, doi:10.1109/TVCG.2011.209
REFERENCES
[1] P.A. Alexander, J.M. Kulikowich, and S.K. Schulze, How subject-matter knowledge affects recall and interest. Am. Ed. Res. J., 31 (2): 313–337, 1994.
[2] G. Andrienko, N. Andrienko, U. Demšar, D. Dransch, J. Dykes, S. Fabrikant, M. Jern, M.-J. Kraak, H. Schumann, and C. Tominski, Space, Time, and Visual Analytics. Int. J. GIS, 24 (10): 1577–1600, 2010.
[3] J. Arnowitz, M. Arent, and N. Berger, Effective Prototyping for Software Makers. Morgan Kaufmann, Amsterdam, Netherlands, 2007.
[4] E. Bertini, Visuale: The neglected role of interaction in information visualization. http://visuale.bertini.me?m=200705, 2007.
[5] H. Beyer and K. Holtzblatt, Contextual Design: Defining Customer-Centered Systems. Morgan Kaufmann, San Francisco, CA, 1997.
[6] C. M. Bird, How I Stopped Dreading and Learned to Love Transcription. Qualitative Inquiry 11 (2): 226–248, 2005.
[7] M. Bostock and J. Heer, Protovis: A Graphical Toolkit for Visualization. IEEE Trans.Vis & Comp.Graph., 15 (6): 1121–1128, 2009.
[8] S. Carpendale, Evaluating information visualizations. In A. Kerren, J. Stasko, J.-D. Fekete, and C. North, editors, Information Visualization, volume 4950 of LNCS, pages 19–45. Springer Berlin/Heidelberg, 2008.
[9] J. M. Carroll, Five reasons for scenario-based design. Interacting with Computers 13 (1): 43–60, 2000.
[10] M. Catani and D. Biers, Usability evaluation and prototype fidelity: Users and usability professionals. Proc. Human Factors and Ergonomics Society 42 (19): 1331–1335, 1998.
[11] W. Chao, T. Munzner, and M. van de Panne, Poster: Rapid Pen-Centric Authoring of Improvisational Visualizations with NapkinVis. In Proc. IEEE Info Vis 2010. IEEE, 2010.
[12] B. Craft and P. Cairns, Using Sketching to Aid the Collaborative Design of Information Visualisation Software - A Case Study. In Human Work Interaction Design Madeira, Portugal, 2006.
[13] N. Dahlback, A. Jonsson, and L. Ahrenberg, Wizard of Oz Studies – Why and How. Knowledge-Based Systems 6 (4): 258–266, 1993.
[14] J. S. Dumas and J. C. Redish, A practical guide to Usability Testing. intellect, Exeter, UK, 1999.
[15] M. Duncan, Autoethnography: Critical Appreciation of an Emerging Art. International Journal of Qualitative Methods 3 (4): 28–39, 2004.
[16] J. Dykes, Facilitating interaction for geovisualization. In J. Dykes, A. MacEachren, and M.-J. Kraak, editors, Exploring Geovisualization, pages 265–291. Elsevier, 2005.
[17] J. Dykes, A. MacEachren, and M.-J. Kraak, Exploring Geovisualization. Elsevier, 2005.
[18] J. Dykes, J. Wood, and A. Slingsby, Rethinking Map Legends with Visualization. IEEE Trans.Vis & Comp.Graph., 16 (6): 890–899, 2010.
[19] C. Ellis, The Ethnographic I: A Methodological Novel About Autoethnography. Rowman Altamira, Lanham, MD, 2004.
[20] G. Ellis and A. Dix, An explorative analysis of user evaluation studies in information visualisation. In BELIV '06, Venice, Italy, 2006.
[21] K. A. Ericsson and H. A. Simon, Protocol Analysis: Verbal Reports as Data. MIT Press, Cambridge, MA, 1984.
[22] J. Gerring, What Is A Case Study and What Is It Good for? Am.Pol.Sc.Rev., 98 (2): 341–354, 2004.
[23] S. Greenberg and B. Buxton, Usability evaluation considered harmful (some of the time). In CHI 2008, Florence, Italy, 2008.
[24] J. Greene, V. Caracelli, and W. Graham, Toward a Conceptual Framework for Mixed-Method Evaluation Designs. Ed. Eval & Pol.An., 11 (3): 255–274, 1989.
[25] S. Hannah, Sorting out card sorting. PhD thesis, Univ. Oregon, 2005.
[26] P. Isenberg, T. Zuk, C. Collins, and S. Carpendale, Grounded evaluation of information visualizations. In BELIV '08, pages 56–63, New York, NY, 2008. ACM.
[27] ISO. ISO 13407:1999 Human-centred design processes for interactive systems, 1999.
[28] M.C. Jones, I.R. Floyd, and M.B. Twidale, Patchwork Prototyping with Open-Source Software. In K. St. Amant and B. Still, editors, The Handbook of Research on Open Source Software: Technological, Economic, and Social Perspectives. IGI Global, Hershey, PA, 2007.
[29] A. Kerren, J.T. Stasko, and J.A. Dykes, Teaching Information Visualization. LNCS, 4950: 65–91, 2008.
[30] R. Kosara, F. Drury, L.E. Holmquist, and D.H. Laidlaw, Visualization Criticism. IEEE Comp.Graph., 28 (3): 13–15, 2008.
[31] K. Krippendorff, Content Analysis: An Introduction to its Methodology. Sage Publications, Thousand Oaks, CA, 2003.
[32] J. Krygier, C. Reeves, D. DiBiase, and J. Cupp, Design, implementation and evaluation of multimedia resources for geography and earth science education. J.Geog. in Higher Ed., 21 (1): 17–39, 1997.
[33] J. C. Lapadat and A.C. Lindsay, Transcription in Research and Practice: From Standardization of Technique to Interpretive Positionings. Qualitative Inquiry, 5 (1): 64–86, 1999.
[34] A. Lewins and C. Silver, Using Software in Qualitative Research: A Step-by-Step Guide. Sage Publications Ltd., London, UK, 2007.
[35] Y. Lim, A. Pangam, S. Periyasami, and S. Aneja, Comparative analysis of high- and low-fidelity prototypes for more valid usability evaluations of mobile devices. In 4th Nordic Conf. on HCI, pages 291–300, Oslo, Norway, 2006.
[36] D. Lloyd, Evaluating human-centered approaches for geovisualization. PhD thesis, City University London, 2009.
[37] D. Lloyd, J. Dykes, and R. Radburn, Understanding geovisualization users and their requirements – a user-centred approach. In A. Winstanley, editor, GISRUK 15, Maynooth, Ireland, 2007.
[38] D. Lloyd, J. Dykes, and R. Radburn, Mediating geovisualization to potential users and prototyping a geovisualization application. In D. Lambrick, editor, GISRUK 16, Manchester, UK, 2008.
[39] D. Lloyd, J. Dykes, and R. Radburn, Using the Analytic Hierarchy Process to prioritise candidate improvements to a geovisualization application. In D. Fairbairn, editor, GISRUK 17, Durham, UK, 2009.
[40] H. P. Luhn, Keyword-in-context index for technical literature. Am.Doc., 11 (4): 288–295, 1960.
[41] M. Maguire, Methods to support human-centred design. Int.J. HCStud., 55: 587–634, 2001.
[42] N. Maiden, A. Gizikis, and S. Robertson, Provoking creativity: imagine what your requirements could be like. IEEE Software, 21 (5): 68–75, 2004.
[43] M. B. Miles and A.M. Huberman, Qualitative Data Analysis: An Expanded Sourcebook. Sage, Thousand Oaks, CA, 2nd edition, 1994.
[44] J. Nielsen and D. Sano, SunWeb: user interface design for Sun Microsystem's internal Web. Comp.Net. & ISDN Sys., 28 (1-2): 179–188, 1995.
[45] C. North, Toward Measuring Visualization Insight. IEEE Comp.Graph. & Apps, 26 (3): 6–9, 2006.
[46] J.F. Nunamaker, A.R. Dennis, J.S. Valacich, D. Vogel, and J.F. George, Electronic meeting systems. Comm. ACM, 34 (7): 40–61, 1991.
[47] C. Plaisant, The Challenge of Information Visualization Evaluation. In AVI 04, Gallipoli, Italy, 2004.
[48] J. Preece, Y. Rogers, and H. Sharp, Interaction Design: beyond human-computer interaction. Wiley, Chichester, UK, 2002.
[49] R. Radburn, J. Dykes, and J. Wood, vizLib: using the seven stages of visualization to explore population trends and processes in local authority research. In GISRUK 18, page 10, London, UK, 2010.
[50] C. Reas and B. Fry, Processing: a learning environment for creating interactive Web graphics. In ACM SIGGRAPH 2003 Web Graphics, San Diego, CA, page 1, New York, NY, 2003. ACM.
[51] M. Rettig, Prototyping for tiny fingers. Comm. ACM, 37 (4): 21–27, 1994.
[52] S. Robertson, Requirements trawling: techniques for discovering requirements. Int.J. of HCStud., 55 (4): 405–422, 2001.
[53] S. Robertson, Scenarios in Requirements Discovery. In I. Alexander and N. Maiden, editors, Scenarios, Stories, Use Cases: Through the Systems Development Life-cycle. Wiley, Chichester, UK, 2004.
[54] S. Robertson and J. Robertson, Mastering the requirements process. ACM Press/Addison-Wesley, New York, NY, 2006.
[55] A. Robinson, A design framework for exploratory geovisualization in epidemiology. Information Visualization, 6 (3): 197–214, 2007.
[56] A. Robinson, J. Chen, E. Lengerich, H. Meyer, and A. MacEachren, Combining usability techniques to design geovisualization tools for epidemiology. Cart. & Geog.Inf.Science, 32 (4): 243–255, 2005.
[57] R. Roth, B. Finch, J. Blanford, A. Klippel, A. Robinson, and A. MacEachren, Card Sorting for Cartographic Research and Practice. Cart. & Geog.Inf.Science, 38 (2), 2011.
[58] R.E. Roth, K.S. Ross, B.G. Finch, W. Luo, and A. MacEachren, A user-centered approach for designing and developing spatiotemporal crime analysis tools. In GIScience 2010, Zurich, Switzerland, 2010.
[59] T. L. Saaty, A scaling method for priorities in hierarchical structures. J.Math. Psych., 15 (3): 234–281, 1977.
[60] B. Shneiderman and C. Plaisant, Strategies for evaluating information visualization tools: multi-dimensional in-depth long-term case studies. In BELIV '06, pages 1–7, Venice, Italy, 2006.
[61] A. Skupin and S. Fabrikant, Spatialization methods: A cartographic research agenda for non-geographic information visualization. Cart. & Geog.Inf.Science, 30 (2): 95–115, 2003.
[62] T. Slocum, C. Blok, B. Jiang, A. Koussoulakou, D. Montello, S. Fuhrmann, and N. Hedley, Cognitive and Usability Issues in Geovisualization. Cart. & Geog.Inf.Science, 28 (1): 61–75, 2001.
[63] C. Snyder, Paper Prototyping: The Fast and Easy Way to Design and Refine User Interfaces. Morgan Kaufmann, San Francisco, CA, 2003.
[64] J. Tidwell, Designing Interfaces. O'Reilly, Sebastopol, CA, 2005.
[65] M. Tohidi, W. Buxton, R. Baecker, and A. Sellen, User sketches: a quick, inexpensive, and effective way to elicit more reflective user feedback. In 4th Nordic Conf. on HCI, pages 105–114, Oslo, Norway, 2006.
[66] T. S. Tullis, A method for evaluating Web page design concepts. In Human Factors in Computing Systems CHI 98, pages 323–324, Los Angeles, CA, 1998.
[67] J. J. van Wijk, Bridging the gaps. IEEE Comp.Graph. & Apps, 26 (6): 6–9, 2006.
[68] R.A. Virzi, J.L. Sokolov, and D. Karis, Usability problem identification using both low- and high-fidelity prototypes. In SIGCHI Conf. on Human Factors in Computing Systems, pages 236–243, Vancouver, BC, 1996.
[69] M. Walker, L. Takayama, and J. A. Landay, High-fidelity or low-fidelity, paper or computer? Choosing attributes when testing web prototypes. In Human Factors and Ergonomics Society 46, pages 661–665, Santa Monica, CA, 2002.
[70] E. Wasil and B. Golden, Celebrating 25 years of AHP-based decision making. Comp. & Op.Res., 30 (10): 1419–1420, 2003.