A Practical Paradigm and Platform for Video-Based Human-Computer Interaction
May 2008 (vol. 41 no. 5)
pp. 48-55
Jason J. Corso, Johns Hopkins University
Guangqi Ye, Johns Hopkins University
Darius Burschka, Johns Hopkins University
Gregory D. Hager, Johns Hopkins University
New technologies that use multimodal input, human experience, and modern hardware's full computational power could mitigate current limitations in human-computer interaction. The 4D Touchpad, a video-based interaction platform, makes robust, natural interaction between humans and computers possible.

Index Terms:
human-computer interaction, 4D Touchpad, video-based interaction platforms
Citation:
Jason J. Corso, Guangqi Ye, Darius Burschka, Gregory D. Hager, "A Practical Paradigm and Platform for Video-Based Human-Computer Interaction," Computer, vol. 41, no. 5, pp. 48-55, May 2008, doi:10.1109/MC.2008.141