A Proxy Method for Real-Time 3-DOF Haptic Rendering of Streaming Point Cloud Data
July-Sept. 2013 (vol. 6 no. 3)
pp. 257-267
Fredrik Ryden, University of Washington, Seattle
Howard Jay Chizeck, University of Washington, Seattle
This paper presents a new haptic rendering method for streaming point cloud data. It provides haptic rendering of moving physical objects using data obtained from RGB-D cameras. Thus, real-time haptic interaction with moving objects can be achieved using noncontact sensors. This method extends "virtual coupling"-based proxy methods in a way that does not require preprocessing of points and allows for spatial point cloud discontinuities. The key ideas of the algorithm are iterative motion of the proxy with respect to the points, and the use of a variable proxy step size that results in better accuracy for short proxy movements and faster convergence for longer movements. This method provides highly accurate haptic interaction for geometries in which the proxy can physically fit. Another advantage is a significant reduction in the risk of "pop through" during haptic interaction with dynamic point clouds, even in the presence of noise. This haptic rendering method is computationally efficient; it can run in real time on available personal computers without the need for downsampling of point clouds from commercially available depth cameras.
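To give a rough feel for the key ideas summarized above (iterative proxy motion toward the haptic interaction point and a variable step size within a virtual-coupling framework), the following is a minimal, hypothetical sketch rather than the authors' actual algorithm. The names (proxy_step, proxy_radius, k_coupling) and the specific step-halving rule are illustrative assumptions only.

```python
# Hypothetical sketch of a proxy-based 3-DOF haptic rendering step for a
# point cloud. NOT the paper's algorithm; assumptions:
#  - points: (N, 3) array from the current depth-camera frame
#  - hip:    3-vector position of the haptic interaction point (device tip)
#  - proxy:  3-vector proxy position carried over from the previous frame
#  - proxy_radius: radius within which points block proxy motion
import numpy as np

def proxy_step(proxy, hip, points, proxy_radius=0.01,
               k_coupling=500.0, max_iters=10):
    """Iteratively move the proxy toward the HIP with a shrinking step
    size near contact; return (new_proxy, coupling_force)."""
    step = np.linalg.norm(hip - proxy)            # start with a long step
    for _ in range(max_iters):
        direction = hip - proxy
        dist = np.linalg.norm(direction)
        if dist < 1e-9:
            break
        direction /= dist
        step = min(step, dist)                    # variable step size:
        candidate = proxy + step * direction      # long when free, short near points
        # Reject the move if any point would fall inside the proxy sphere.
        if np.min(np.linalg.norm(points - candidate, axis=1)) < proxy_radius:
            step *= 0.5                           # halve and retry for finer accuracy
            if step < 1e-5:
                break
        else:
            proxy = candidate
    # Virtual coupling: spring force between proxy and HIP rendered to the user.
    force = k_coupling * (proxy - hip)
    return proxy, force
```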
Index Terms:
Haptic interfaces, rendering (computer graphics), vectors, real-time systems, cameras, force, point cloud velocity estimation, haptic rendering, streaming point cloud data
Citation:
Fredrik Ryden, Howard Jay Chizeck, "A Proxy Method for Real-Time 3-DOF Haptic Rendering of Streaming Point Cloud Data," IEEE Transactions on Haptics, vol. 6, no. 3, pp. 257-267, July-Sept. 2013, doi:10.1109/TOH.2013.20