2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission (2012)
Zurich, Switzerland
Oct. 13, 2012 to Oct. 15, 2012
This paper describes the computation of depth maps for a high-quality reference camera augmented by a set of satellite sensors. The satellite sensors include support cameras, a TOF (time-of-flight) sensor, and a thermal camera, all rigidly attached to the reference camera. There is extensive previous work on computing depth maps with stereo alone, and high-quality results have been achieved. However, it has proved difficult to achieve good results for cases such as textureless areas or similar foreground and background colors. We show that our proposed sensor fusion achieves high-quality results in such cases. The paper makes two contributions. The first is a method for combining TOF data with multi-camera data that includes reasoning about occlusions, producing an improved depth estimate near depth discontinuities. The second is to show the benefit of thermal sensing as a segmentation prior. Thermal cameras were formerly high-cost devices but are now available at a cost comparable to machine vision cameras. This work demonstrates their advantages, particularly for scenes including humans.
Sensor Fusion, Computer Vision, Image Processing
J. van Baar, P. Beardsley, M. Pollefeys and M. Gross, "Sensor Fusion for Depth Estimation, including TOF and Thermal Sensors," 2012 Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission (3DIMPVT), Zurich, Switzerland, 2012, pp. 472-478.