Salt Lake City, UT, USA
Mar. 18–20, 2009
Joel C. Huegel, Mechanical Engineering and Materials Science Department, Rice University, USA
Marcia K. O'Malley, Mechanical Engineering and Materials Science Department, Rice University, USA
The objective of this work is to demonstrate that progressive haptic guidance can accelerate and improve motor task training outcomes over visual or practice-only methods in a training virtual environment (TVE). To that end, we design haptic and visual guidance schemes based on detailed analyses of performance differences between expert and novice trainees performing a dynamic motor control task in a TVE. Research shows that TVEs that include haptic interfaces produce significant short-term performance gains over audiovisual TVEs. However, improved or accelerated long-term training outcomes for dynamic tasks have yet to be demonstrated, due at least in part to the trainee's increasing dependence on the assistance. To avoid this dependence, a progressive guidance controller can gradually remove the assistance as the trainee's performance improves. However, the inputs to the guidance controller must be based on measurements of performance in the necessary kinematic and dynamic components of the task. Prior work introduced two such quantitative performance measures. We implement similar measures, trajectory error and input frequency, to identify and classify performance levels, thereby providing valid and robust inputs to the guidance controller. We demonstrate that progressive haptic guidance schemes can improve training in a dynamic task.
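The progressive guidance idea described above can be sketched as a simple gain scheduler: combine the two performance measures (trajectory error and input frequency) into a score, then fade the haptic assistance as the score approaches expert level. The function names, expert baselines, and the linear fade below are illustrative assumptions, not the controller or values from the paper.

```python
# Hedged sketch of a progressive guidance controller: guidance gain is
# reduced as measured performance approaches a hypothetical expert baseline.
# All thresholds and the fading law are assumptions for illustration.

def performance_score(trajectory_error, input_frequency,
                      expert_error=0.05, expert_frequency=2.0):
    """Combine the two measures into one score in [0, 1].

    1.0 means expert-level performance. Normalizing each measure against
    an assumed expert baseline is an illustrative choice.
    """
    error_ratio = min(expert_error / max(trajectory_error, 1e-9), 1.0)
    freq_ratio = min(input_frequency / max(expert_frequency, 1e-9), 1.0)
    return 0.5 * (error_ratio + freq_ratio)


def guidance_gain(score, initial_gain=1.0):
    """Linearly fade haptic guidance: full assistance for a novice
    (score near 0), none once performance reaches expert level."""
    return initial_gain * (1.0 - score)


# A novice with large trajectory error and low input frequency keeps
# strong guidance; expert-level performance fades it out entirely.
novice_gain = guidance_gain(performance_score(0.5, 0.8))
expert_gain = guidance_gain(performance_score(0.05, 2.0))
```

In a real TVE loop, the resulting gain would scale the haptic guidance forces applied each control cycle, with the score updated from windowed measurements of the trainee's trajectory error and input frequency.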
Joel C. Huegel and Marcia K. O'Malley, "Visual versus haptic progressive guidance for training in a virtual dynamic task," in Proc. World Haptics Conference (WHC), 2009, pp. 399–400, doi:10.1109/WHC.2009.4810914