Issue No. 1 - January/February 2010 (vol. 16), pp. 4-16
Steven Henderson , Columbia University, New York
Opportunistic Controls are a class of user interaction techniques that we have developed for augmented reality (AR) applications to support gesturing on, and receiving feedback from, otherwise unused affordances already present in the domain environment. By leveraging the physical characteristics of these affordances to provide passive haptics that ease gesture input, Opportunistic Controls simplify gesture recognition and provide tangible feedback to the user. In this approach, 3D widgets are tightly coupled with affordances to provide visual feedback and hints about the functionality of the control. For example, a set of buttons can be mapped to existing tactile features on domain objects. We describe examples of Opportunistic Controls that we have designed and implemented using optical marker tracking combined with appearance-based gesture recognition. We present the results of two user studies. In the first, participants performed a simulated maintenance inspection of an aircraft engine using a set of virtual buttons implemented both as Opportunistic Controls and with simpler passive haptics. Opportunistic Controls allowed participants to complete their tasks significantly faster and were preferred over the baseline technique. In the second, participants proposed and demonstrated user interfaces incorporating Opportunistic Controls for two domains, allowing us to gain additional insights into how user interfaces featuring Opportunistic Controls might be designed.
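The appearance-based gesture recognition mentioned above can be illustrated with a minimal sketch: classify pixels as skin-colored, then treat a virtual button as "pressed" when enough of its screen-space region is occluded by skin (the user's fingertip resting on the tactile feature). The color thresholds, function names, and occlusion ratio below are illustrative assumptions, not the paper's actual recognizer.

```python
import numpy as np

def skin_mask(rgb):
    """Classify pixels as skin using a simple normalized-rg rule.
    A stand-in for the paper's appearance-based model; the bounds
    here are assumed, illustrative values."""
    rgb = rgb.astype(np.float32)
    total = rgb.sum(axis=-1) + 1e-6        # avoid divide-by-zero
    r = rgb[..., 0] / total                # normalized red channel
    g = rgb[..., 1] / total                # normalized green channel
    return (r > 0.35) & (r < 0.55) & (g > 0.25) & (g < 0.40)

def button_pressed(rgb, roi, occlusion_thresh=0.5):
    """Fire a virtual button when a large fraction of its region of
    interest (y0, y1, x0, x1) is covered by skin-colored pixels."""
    y0, y1, x0, x1 = roi
    patch = skin_mask(rgb[y0:y1, x0:x1])
    return bool(patch.mean() > occlusion_thresh)

# Synthetic example: gray background, skin-colored patch over one button.
img = np.full((40, 40, 3), 100, dtype=np.uint8)
img[10:20, 10:20] = (200, 120, 90)         # "fingertip" over button 1
print(button_pressed(img, (10, 20, 10, 20)))  # button 1: covered
print(button_pressed(img, (25, 35, 25, 35)))  # button 2: uncovered
```

In practice the ROI would come from the optical marker tracking that registers each 3D widget to its physical affordance, so the occlusion test runs in a stable, object-anchored coordinate frame.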
Haptic I/O, interaction styles, user interfaces, virtual and augmented reality.
Steven Henderson, "Opportunistic Tangible User Interfaces for Augmented Reality," IEEE Transactions on Visualization and Computer Graphics, vol. 16, no. 1, pp. 4-16, January/February 2010, doi:10.1109/TVCG.2009.91