Gesture-Based Programming for Robotics: Human-Augmented Software Adaptation
November/December 1999 (vol. 14 no. 6)
pp. 22-29
Gesture-based programming is a paradigm for programming robots by human demonstration, in which the human demonstrator directs the self-adaptation of executable software. The goal is to provide a more natural environment for the user as programmer and to generate more complete and successful programs by focusing on task experts rather than programming experts. We call the paradigm "gesture-based" because we try to enable the system to capture, in real time, the intention behind the demonstrator's fleeting, context-dependent hand motions, contact conditions, finger poses, and even cryptic utterances, and to reconfigure itself accordingly. The system is self-adaptive in the sense that it retains knowledge of previously acquired skills (sensorimotor expertise); this knowledge facilitates the interpretation of gestures during training and then provides feedback control at runtime.
Citation:
Richard M. Voyles, J. Dan Morrow, Pradeep K. Khosla, "Gesture-Based Programming for Robotics: Human-Augmented Software Adaptation," IEEE Intelligent Systems, vol. 14, no. 6, pp. 22-29, Nov.-Dec. 1999, doi:10.1109/5254.809564