Issue No. 03 - July-Sept. (2013 vol. 6)
ISSN: 1939-1412
pp: 257-267
F. Ryden , Dept. of Electr. Eng., Univ. of Washington, Seattle, WA, USA
H. J. Chizeck , Dept. of Electr. Eng., Univ. of Washington, Seattle, WA, USA
This paper presents a new haptic rendering method for streaming point cloud data. It provides haptic rendering of moving physical objects using data obtained from RGB-D cameras. Thus, real-time haptic interaction with moving objects can be achieved using noncontact sensors. This method extends "virtual coupling"-based proxy methods in a way that does not require preprocessing of points and allows for spatial point cloud discontinuities. The key ideas of the algorithm are iterative motion of the proxy with respect to the points, and the use of a variable proxy step size that results in better accuracy for short proxy movements and faster convergence for longer movements. This method provides highly accurate haptic interaction for geometries in which the proxy can physically fit. Another advantage is a significant reduction in the risk of "pop through" during haptic interaction with dynamic point clouds, even in the presence of noise. This haptic rendering method is computationally efficient; it can run in real time on available personal computers without the need for downsampling of point clouds from commercially available depth cameras.
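The abstract's key ideas — iteratively stepping the proxy toward the haptic interaction point (HIP) with a variable step size that shrinks near the point cloud for accuracy and grows in free space for fast convergence — can be illustrated with a minimal sketch. This is not the authors' algorithm; the function name, the specific step-size rule, and all parameters (`proxy_radius`, `step_min`, `step_max`) are illustrative assumptions.

```python
import numpy as np

def proxy_step(proxy, hip, points, proxy_radius=0.02,
               step_min=1e-4, step_max=1e-2):
    """One iteration of a variable-step proxy update (illustrative sketch,
    not the published algorithm).

    proxy, hip : 3-vectors; points : (N, 3) point cloud.
    The step shrinks near the surface (accuracy for short movements)
    and grows in free space (faster convergence for long movements).
    """
    to_hip = hip - proxy
    dist_to_hip = np.linalg.norm(to_hip)
    if dist_to_hip < 1e-9:
        return proxy  # proxy has reached the haptic interaction point
    direction = to_hip / dist_to_hip

    # Distance from the proxy to the nearest point in the cloud.
    nearest = np.min(np.linalg.norm(points - proxy, axis=1))

    # Variable step: never overshoot the HIP; shrink near the surface.
    step = np.clip(min(dist_to_hip, nearest - proxy_radius),
                   step_min, step_max)

    candidate = proxy + step * direction
    # Block the move if it would push the proxy into the point cloud.
    if np.min(np.linalg.norm(points - candidate, axis=1)) < proxy_radius:
        return proxy
    return candidate
```

In free space the proxy converges to the HIP in a few large steps; when a point lies between the proxy and the HIP, the shrinking step lets the proxy settle roughly one proxy radius from the surface instead of popping through it.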
Haptic interfaces, Rendering (computer graphics), Hip, Vectors, Real-time systems, Cameras, Force

F. Ryden and H. J. Chizeck, "A Proxy Method for Real-Time 3-DOF Haptic Rendering of Streaming Point Cloud Data," in IEEE Transactions on Haptics, vol. 6, no. 3, pp. 257-267, 2013.