Proceedings of the Second International Conference on Automatic Face and Gesture Recognition (1996)
October 14-16, 1996
R. Kjeldsen, IBM Thomas J. Watson Res. Center, Yorktown Heights, NY, USA
J. Kender, IBM Thomas J. Watson Res. Center, Yorktown Heights, NY, USA
This work describes the design of a functioning user interface based on visual recognition of hand gestures, and details its performance. In the interface, gesture replaces the mouse for many actions, including selecting, moving, and resizing windows. A camera below the screen observes the user. The hand is segmented from the background using color. Features of the hand's motion are extracted from the sequence of segmented images, and, when needed, the hand's pose is classified using a neural net. This information is parsed by a task-specific grammar. The system runs in real time on standard PC hardware and has demonstrated its abilities with various users in several different office environments. Having experimented with a functioning gestural interface, the authors discuss the practicality and best applications of this technology.
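The abstract's first processing step, segmenting the hand from the background using color, can be sketched as a per-pixel color test. The hue and saturation thresholds below are hypothetical skin-tone values chosen for illustration; the paper's actual color model is not given in this record.

```python
# Minimal sketch of color-based hand segmentation: label each pixel as
# hand (1) or background (0) by a color test in HSV space.
# The hue/saturation thresholds are illustrative assumptions, not the
# paper's actual parameters.
import colorsys

def segment_hand(image, hue_lo=0.0, hue_hi=0.1, sat_min=0.25):
    """Return a binary mask for an image given as a list of rows of
    (r, g, b) tuples with components in [0, 1]."""
    mask = []
    for row in image:
        mask_row = []
        for (r, g, b) in row:
            h, s, v = colorsys.rgb_to_hsv(r, g, b)
            is_skin = (hue_lo <= h <= hue_hi) and s >= sat_min
            mask_row.append(1 if is_skin else 0)
        mask.append(mask_row)
    return mask

# Tiny synthetic frame: one skin-toned pixel beside one gray background pixel.
frame = [[(0.9, 0.6, 0.5), (0.5, 0.5, 0.5)]]
print(segment_hand(frame))  # → [[1, 0]]
```

In a full pipeline like the one described, such a mask would feed the later stages: motion features extracted from the mask sequence, and pose classification by the neural net when needed.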
feature extraction; traditional user interfaces; gesture; visual recognition; hand gestures; neural net; segmented images; office environments; task-specific grammar
R. Kjeldsen and J. Kender, "Toward the use of gesture in traditional user interfaces," Proceedings of the Second International Conference on Automatic Face and Gesture Recognition (FG), Killington, Vermont, 1996, p. 151.