Issue No. 7 - July 2011 (vol. 33)
pp. 1400-1414
Liang Wang , University of Kentucky, Lexington
Ruigang Yang , University of Kentucky, Lexington
Jiejie Zhu , University of Kentucky, Lexington
Zhigeng Pan , Zhejiang University, Hangzhou
ABSTRACT
Time-of-flight range sensors have error characteristics that are complementary to passive stereo. They provide real-time depth estimates in conditions where passive stereo does not work well, such as on white walls. In contrast, these sensors are noisy and often perform poorly on the textured scenes where stereo excels. We explore their complementary characteristics and introduce a method for combining the results from both that achieves better accuracy than either alone. In our fusion framework, the depth probability distribution functions from the two sensor modalities are formulated and optimized. Robust and adaptive fusion is built on a pixel-wise reliability weighting function calculated for each method. In addition, since time-of-flight devices have primarily been used as individual sensors, they are typically poorly calibrated; we introduce a method that substantially improves upon the manufacturer's calibration. An extensive set of experiments demonstrates that the proposed techniques yield improved accuracy and robustness.
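To give a concrete sense of the reliability-weighted fusion idea, the sketch below combines per-pixel depth probability distributions from the two modalities using per-pixel reliability weights and selects the most likely depth hypothesis. This is a minimal, hypothetical illustration: the array names, the log-domain weighting form, and the winner-take-all decision are assumptions, whereas the paper's actual framework formulates the fused distributions and solves a global optimization rather than deciding each pixel independently.

```python
# Minimal sketch of per-pixel reliability-weighted depth fusion.
# All names below (p_tof, p_stereo, w_tof, w_stereo, depth_hypotheses)
# are hypothetical placeholders; the paper's method additionally performs
# a global optimization instead of this independent per-pixel decision.
import numpy as np

def fuse_depth(p_tof, p_stereo, w_tof, w_stereo, depth_hypotheses):
    """Fuse two per-pixel depth probability volumes.

    p_tof, p_stereo  : (H, W, D) arrays; each pixel stores a probability
                       distribution over D candidate depths.
    w_tof, w_stereo  : (H, W) per-pixel reliability weights in [0, 1].
    depth_hypotheses : (D,) candidate depth values.
    Returns an (H, W) fused depth map.
    """
    eps = 1e-8
    # Combine the two modalities in the negative-log (cost) domain,
    # scaling each term by its per-pixel reliability.
    cost = -(w_tof[..., None] * np.log(p_tof + eps)
             + w_stereo[..., None] * np.log(p_stereo + eps))
    best = np.argmin(cost, axis=2)  # most likely depth index per pixel
    return depth_hypotheses[best]
```

In the full method, such fused per-pixel distributions would serve as the data term of a global optimization, with the reliability weights adapting to scene texture and sensor noise at each pixel.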
INDEX TERMS
Time-of-Flight sensor, multisensor fusion, global optimization, stereo vision.
CITATION
Liang Wang, Ruigang Yang, Jiejie Zhu, Zhigeng Pan, "Reliability Fusion of Time-of-Flight Depth and Stereo Geometry for High Quality Depth Maps," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 33, no. 7, pp. 1400-1414, July 2011, doi:10.1109/TPAMI.2010.172