Issue No. 1, Jan.-March 2012 (vol. 5), pp. 66-76
Timothy Edmunds , University of British Columbia, Vancouver
Dinesh K. Pai , University of British Columbia, Vancouver
ABSTRACT
Training simulators have proven their worth in a variety of fields, from piloting to air-traffic control to nuclear power station monitoring. Designing surgical simulators, however, poses the challenge of creating trainers that effectively instill not only a high-level understanding of the steps to be taken in a given situation, but also the low-level "muscle memory" needed to perform delicate surgical procedures. It is often impossible to build an ideal simulator that perfectly mimics the haptic experience of a surgical procedure, but by focusing on the aspects of the experience that are perceptually salient we can build simulators that effectively instill learning. We propose a general method for the design of surgical simulators that augments the perceptually salient aspects of an interaction. Using this method, we can increase skill-transfer rates without requiring expensive improvements in the capability of the rendering hardware or increases in the computational complexity of the simulation. In this paper, we present our decomposition-based method for surgical simulator design, and describe a user study comparing the training effectiveness of a haptic-search-task simulator designed using our method against an unaugmented simulator. The results show that perception-based task decomposition can improve the design of surgical simulators, imparting skill more effectively by targeting the perceptually significant aspects of the interaction.
INDEX TERMS
Haptic I/O; artificial, augmented, and virtual realities; life and medical sciences; surgical simulation.
CITATION
Timothy Edmunds, Dinesh K. Pai, "Perceptually Augmented Simulator Design", IEEE Transactions on Haptics, vol.5, no. 1, pp. 66-76, Jan.-March 2012, doi:10.1109/TOH.2011.42
REFERENCES
[1] T. Edmunds and D.K. Pai, "Perceptual Rendering for Learning Haptic Skills," HAPTICS '08: Proc. Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp. 225-230, 2008.
[2] T. Edmunds and D.K. Pai, "Perceptually Augmented Simulator Design through Decomposition," WHC '09: Proc. World Haptics Conf., pp. 505-510, 2009.
[3] PHANTOM Premium 1.0, SensAble Technologies, Inc., Woburn, MA, USA, http://www.sensable.com/phantom-premium-1-0.htm , 2011.
[4] Maglev 200, Butterfly Haptics, LLC, Pittsburgh, PA, USA, http://butterflyhaptics.com/products/system/ , 2011.
[5] K.J. Kuchenbecker, J. Fiene, and G. Niemeyer, "Event-Based Haptics and Acceleration Matching: Portraying and Assessing the Realism of Contact," WHC '05: Proc. World Haptics Conf., pp. 381-387, 2005.
[6] D.C. Wightman and G. Lintern, "Part-Task Training for Tracking and Manual Control," Human Factors, vol. 27, no. 3, pp. 267-283, June 1985.
[7] R.L. Klatzky and S. Lederman, "Intelligent Exploration by the Human Hand," Dextrous Robot Hands, S.T. Venkataraman and T. Iberall, eds., pp. 66-81, Springer-Verlag, 1990.
[8] S.J. Lederman and R.L. Klatzky, "Relative Availability of Surface and Object Properties during Early Haptic Processing," J. Experimental Psychology: Human Perception and Performance, vol. 23, no. 6, pp. 1680-1707, 1997.
[9] G. Robles-De-La-Torre and V. Hayward, "Force Can Overcome Object Geometry in the Perception of Shape through Active Touch," Nature, vol. 412, pp. 445-448, 2001.
[10] D.K. Pai, K. van den Doel, D.L. James, J. Lang, J.E. Lloyd, J.L. Richmond, and S.H. Yau, "Scanning Physical Interaction Behavior of 3D Objects," Proc. SIGGRAPH, pp. 87-96, 2001.
[11] K.A. Purdy, S.J. Lederman, and R.L. Klatzky, "Haptic Processing of the Location of a Known Property: Does Knowing What You've Touched Tell You Where It Is?," Canadian J. Experimental Psychology, vol. 58, no. 1, pp. 32-45, 2004.
[12] K.E. Overvliet, J.B.J. Smeets, and E. Brenner, "Haptic Search with Finger Movements: Using More Fingers Does Not Necessarily Reduce Search Times," Experimental Brain Research, vol. 182, no. 3, pp. 427-434, Sept. 2007.
[13] A. Lacreuse and D.M. Fragaszy, "Hand Preferences for a Haptic Searching Task by Tufted Capuchins (Cebus apella)," Int'l J. Primatology, vol. 17, no. 4, pp. 613-632, Aug. 1996.
[14] E. Gobbetti, M. Tuveri, G. Zanetti, and A. Zorcolo, "Catheter Insertion Simulation with Co-Registered Direct Volume Rendering and Haptic Feedback," MMVR '00: Proc. Medicine Meets Virtual Reality, pp. 96-98, 2000.
[15] B.J. Unger, A. Nicolaidis, P.J. Berkelman, A. Thompson, R.L. Klatzky, and R.L. Hollis, "Comparison of 3-D Haptic Peg-in-Hole Tasks in Real and Virtual Environments," Proc. IEEE/RSJ Int'l Conf. Intelligent Robots and Systems, pp. 1751-1756, 2001.
[16] S.E. Salcudean and T.D. Vlaar, "On the Emulation of Stiff Walls and Static Friction with a Magnetically Levitated Input/Output Device," ASME J. Dynamic Systems, Measurement and Control, vol. 119, no. 1, pp. 127-132, 1997.
[17] D. Constantinescu, S.E. Salcudean, and E.A. Croft, "Haptic Rendering of Rigid Contacts Using Impulsive and Penalty Forces," IEEE Trans. Robotics, vol. 21, no. 3, pp. 309-323, June 2005.
[18] J.D. Hwang, M.D. Williams, and G. Niemeyer, "Toward Event-Based Haptics: Rendering Contact Using Open-Loop Force Pulses," HAPTICS '04: Proc. 12th Int'l Symp. Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp. 24-31, 2004.
[19] C.A. Baker, D.F. Morris, and W.C. Steedman, "Target Recognition on Complex Displays," Human Factors, vol. 2, pp. 51-61, 1960.
[20] G. Campion and V. Hayward, "On the Synthesis of Haptic Textures," IEEE Trans. Robotics, vol. 24, no. 3, pp. 527-536, June 2008.
[21] R. Klatzky and S. Lederman, "Perceiving Texture through a Probe," Touch in Virtual Environments, M.L. McLaughlin, J.P. Hespanha, and G.S. Sukhatme, eds., ch. 10, pp. 182-195, Prentice Hall PTR, 2002.
