Issue No. 6 - November/December 2010 (vol. 16)
pp. 1613-1622
Lingyun Yu, University of Groningen
Pjotr Svetachov, University of Groningen
Petra Isenberg
Maarten H. Everts, University of Groningen
Tobias Isenberg, University of Groningen
We present the design and evaluation of FI3D, a direct-touch data exploration technique for 3D visualization spaces. The exploration of three-dimensional data is core to many tasks and domains involving scientific visualization. Effective data navigation techniques are therefore essential to enable comprehension, understanding, and analysis of the information space. While evidence exists that touch can provide higher-bandwidth input, somesthetic information that is valuable when interacting with virtual worlds, and awareness when working in collaboration, scientific data exploration in 3D poses unique challenges to the development of effective data manipulations. We present a technique that provides touch interaction with 3D scientific data spaces in seven degrees of freedom (7 DOF). This interaction does not require the presence of dedicated objects to constrain the mapping, a design decision important for many scientific datasets such as particle simulations in astronomy or physics. We report on an evaluation that compares the technique to conventional mouse-based interaction. Our results show that touch interaction is competitive in interaction speed for translation and integrated interaction, is easy to learn and use, and is preferred for exploration and wayfinding tasks. To further explore the applicability of our basic technique to other types of scientific visualization, we present a second case study, adjusting the interaction to the illustrative visualization of fiber tracts of the brain and to the manipulation of cutting planes in this context.
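The abstract describes touch control of a 3D visualization space in 7 DOF (three translations, three rotations, and uniform zoom) without requiring a touched object to constrain the mapping. The paper's actual FI3D mapping is not reproduced in this excerpt; the following is a loose, hypothetical sketch of how such a gesture-to-DOF decomposition might be organized. The class name, gesture assignments, and gain values are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch (NOT the paper's FI3D code): route touch gestures to
# the 7 DOF mentioned in the abstract -- 3 translations, 3 rotations, and
# uniform zoom -- without needing a dedicated scene object under the finger.

class Camera7DOF:
    def __init__(self):
        self.translation = [0.0, 0.0, 0.0]  # x, y, z in world units
        self.rotation = [0.0, 0.0, 0.0]     # Euler angles in radians (kept simple here)
        self.zoom = 1.0                      # uniform scale factor

    def pan(self, dx_px, dy_px, px_to_world=0.01):
        """One-finger drag inside the view: translate in the view plane."""
        self.translation[0] += dx_px * px_to_world
        self.translation[1] -= dy_px * px_to_world  # screen y grows downward

    def frame_drag(self, dx_px, dy_px, gain=0.005):
        """Drag starting on a surrounding frame: rotate about in-plane axes."""
        self.rotation[0] += dy_px * gain  # vertical motion -> pitch (x axis)
        self.rotation[1] += dx_px * gain  # horizontal motion -> yaw (y axis)

    def twist(self, dangle_rad):
        """Two-finger twist: rotate about the view axis (roll)."""
        self.rotation[2] += dangle_rad

    def pinch(self, dist_prev_px, dist_now_px):
        """Two-finger pinch: uniform zoom by the ratio of finger distances."""
        self.zoom *= dist_now_px / dist_prev_px
```

For example, a pinch that doubles the distance between two fingers (`cam.pinch(100.0, 200.0)`) doubles the zoom factor, while a drag that begins on the frame changes orientation rather than position; this separation of regions is one way to expose all 7 DOF without mode switches.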
Direct-touch interaction, wall displays, 3D navigation and exploration, evaluation, illustrative visualization
Lingyun Yu, Pjotr Svetachov, Petra Isenberg, Maarten H. Everts, Tobias Isenberg, "FI3D: Direct-Touch Interaction for the Exploration of 3D Scientific Visualization Spaces", IEEE Transactions on Visualization & Computer Graphics, vol. 16, no. 6, pp. 1613-1622, November/December 2010, doi:10.1109/TVCG.2010.157