Issue No. 07 - July (2011 vol. 33)
ISSN: 0162-8828
pp: 1400-1414
Jiejie Zhu , University of Kentucky, Lexington
Liang Wang , University of Kentucky, Lexington
Ruigang Yang , University of Kentucky, Lexington
James E. Davis , University of California, Santa Cruz
Zhigeng Pan , Zhejiang University, Hangzhou
Time-of-flight range sensors have error characteristics that are complementary to those of passive stereo. They provide real-time depth estimates in conditions where passive stereo does not work well, such as on white walls. However, these sensors are noisy and often perform poorly on the textured scenes where stereo excels. We explore their complementary characteristics and introduce a method for combining the results from both modalities that achieves better accuracy than either alone. In our fusion framework, the depth probability distribution functions from each sensor modality are formulated and optimized. Robust and adaptive fusion is built on a pixel-wise reliability weighting function calculated for each method. In addition, since time-of-flight devices have primarily been used as individual sensors, they are typically poorly calibrated; we introduce a calibration method that substantially improves upon the manufacturer's. Extensive experimental results demonstrate that our proposed techniques improve both accuracy and robustness.
Time-of-Flight sensor, multisensor fusion, global optimization, stereo vision.
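The core idea in the abstract, fusing two depth estimates per pixel according to each sensor's reliability, can be illustrated with a minimal sketch. This is a simplified, hypothetical illustration, not the paper's actual method: the paper optimizes full per-pixel depth probability distributions, whereas here each sensor contributes a single depth value combined by a weighted average. All function and variable names are assumptions for illustration.

```python
import numpy as np

def fuse_depth(d_tof, d_stereo, w_tof, w_stereo):
    """Combine two depth maps using per-pixel reliability weights.

    d_tof, d_stereo : depth maps (same shape) from the two sensors.
    w_tof, w_stereo : per-pixel reliability weights in [0, 1].
    Where both weights are zero, the ToF depth is returned unchanged.
    """
    w_sum = w_tof + w_stereo
    # Guard against division by zero where both sensors are unreliable.
    safe_sum = np.where(w_sum > 0, w_sum, 1.0)
    fused = (w_tof * d_tof + w_stereo * d_stereo) / safe_sum
    return np.where(w_sum > 0, fused, d_tof)

# Toy example: ToF is trusted on a textureless region (left columns),
# stereo on a textured region (right columns).
d_tof = np.full((2, 4), 1.0)
d_stereo = np.full((2, 4), 2.0)
w_tof = np.array([[1.0, 1.0, 0.2, 0.0]] * 2)
w_stereo = np.array([[0.0, 0.2, 1.0, 1.0]] * 2)
fused = fuse_depth(d_tof, d_stereo, w_tof, w_stereo)
```

In this toy case, fully ToF-weighted pixels keep the ToF depth (1.0), fully stereo-weighted pixels keep the stereo depth (2.0), and mixed pixels land in between.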

L. Wang, J. E. Davis, R. Yang, J. Zhu and Z. Pan, "Reliability Fusion of Time-of-Flight Depth and Stereo Geometry for High Quality Depth Maps," in IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 7, pp. 1400-1414, July 2011.