Pose Estimation by Fusing Noisy Data of Different Dimensions
February 1995 (vol. 17 no. 2)
pp. 195-201

Abstract— A method is proposed for fusing and integrating 2D and 3D measurements for pose estimation. The 2D measurements are treated as 3D data with infinite uncertainty in particular directions, so both kinds of data can be combined in a single framework. The method is implemented using Kalman filtering; it is robust and easily parallelizable.
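The abstract's central idea can be sketched numerically: a 2D image point carries no information along its viewing ray, so its information (inverse-covariance) matrix is rank deficient, and 2D and 3D measurements can then be combined in one information-form update (the batch equivalent of a Kalman-filter fusion). The following is a minimal illustration of that idea, not the paper's implementation; all function names and numbers are assumptions for the example.

```python
import numpy as np

def info_3d(cov):
    """Information matrix of an ordinary 3D measurement (illustrative)."""
    return np.linalg.inv(cov)

def info_2d(ray_dir, sigma_perp):
    """Information matrix of a back-projected image point:
    zero information along the viewing ray, 1/sigma^2 across it."""
    d = np.asarray(ray_dir, float)
    d = d / np.linalg.norm(d)
    perp = np.eye(3) - np.outer(d, d)   # projector onto plane perpendicular to the ray
    return perp / sigma_perp**2

def fuse(points, infos):
    """Information-form fusion: x_hat = (sum Lam_i)^-1 * sum Lam_i z_i."""
    lam = sum(infos)
    rhs = sum(L @ np.asarray(p, float) for L, p in zip(infos, points))
    return np.linalg.solve(lam, rhs)

# A noisy range (3D) measurement, plus a 2D point back-projected onto
# its viewing ray; its coordinate along the ray is arbitrary, since the
# information matrix assigns it zero weight in that direction.
z3 = [1.3, 2.0, 5.0]
z2 = [1.0, 2.0, 0.0]              # any point on the ray gives the same result
fused = fuse([z3, z2],
             [info_3d(0.5 * np.eye(3)),
              info_2d([0.0, 0.0, 1.0], 0.1)])
print(fused)  # depth comes only from the range data: fused[2] == 5.0
```

In this toy setup the accurate 2D measurement pulls the lateral coordinates toward the image observation, while the depth coordinate is determined entirely by the 3D range measurement, which is exactly the behavior the abstract describes.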

References
[1] B. Sabata and J.K. Aggarwal, “Estimation of Motion from a Pair of Range Images: A Review,” Computer Vision, Graphics, and Image Processing, vol. 54, no. 3, pp. 309-324, Nov. 1991.
[2] R.M. Haralick, H. Joo, C.-N. Lee, X. Zhuang, and M.B. Kim, “Pose Estimation from Corresponding Point Data,” IEEE Trans. Systems, Man, and Cybernetics, vol. 19, no. 6, p. 1426, 1989.
[3] R. Tsai, “A Versatile Camera Calibration Technique for High-Accuracy 3D Machine Vision Metrology Using Off-the-Shelf TV Cameras and Lenses,” IEEE J. Robotics and Automation, vol. 3, no. 4, pp. 323-344, Aug. 1987.
[4] J.S.C. Yuan, “A General Photogrammetric Solution for Determining Object Position and Orientation,” IEEE Trans. Robotics and Automation, vol. 5, no. 2, pp. 129-142, Apr. 1989.
[5] M.A. Fischler and R.C. Bolles, “Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography,” Comm. ACM, vol. 24, no. 6, pp. 381-395, June 1981.
[6] O.D. Faugeras and M. Hebert, “The Representation, Recognition, and Positioning of 3D Shapes from Range Data,” Techniques for 3D Machine Perception, A. Rosenfeld, ed., pp. 13-51, Elsevier Science, 1986.
[7] B.K.P. Horn, “Closed-Form Solution of Absolute Orientation Using Unit Quaternions,” J. Optical Soc. Am. A, vol. 4, no. 4, pp. 629-642, 1987.
[8] K.S. Arun, T.S. Huang, and S.D. Blostein, “Least-Squares Fitting of Two 3-D Point Sets,” IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 9, pp. 698-700, 1987.
[9] S.D. Blostein and T.S. Huang, “Estimating 3D Motion from Range Data,” Proc. Conf. Artificial Intelligence Applications, pp. 246-250, 1984.
[10] Z.C. Lin, T.S. Huang, S.D. Blostein, H. Lee, and E.A. Margerum, “Motion Estimation from 3D Point Sets with and without Correspondences,” Proc. Conf. Computer Vision and Pattern Recognition, pp. 194-201, 1986.
[11] Y. Liu, T.S. Huang, and O.D. Faugeras, “Determination of Camera Location from 2D to 3D Line and Point Correspondences,” Proc. Conf. Computer Vision and Pattern Recognition, pp. 82-88, 1988.
[12] W.E.L. Grimson and T. Lozano-Perez, “Model-Based Recognition and Localization from Sparse Range or Tactile Data,” Int’l J. Robotics Research, vol. 3, no. 3, pp. 3-35, 1984.
[13] R. Kumar, Model Dependent Inference of 3D Information from a Sequence of 2D Images, PhD thesis, Univ. of Massachusetts at Amherst, Feb. 1992.
[14] B.K.P. Horn, Robot Vision. Cambridge, Mass.: MIT Press, 1986.
[15] Y. Hel-Or, M. Shmuel, and M. Werman, “Active Feature Localization,” Active Perception and Robot Vision, Springer-Verlag, 1991.
[16] A.H. Jazwinski, Stochastic Processes and Filtering Theory, Academic Press, 1970.
[17] P.S. Maybeck, Stochastic Models, Estimation, and Control, vol. 1, Academic Press, 1979.
[18] O.D. Faugeras, N. Ayache, and B. Faverjon, “A Geometric Matcher for Recognizing and Positioning 3D Rigid Objects,” Proc. Conf. Artificial Intelligence Applications, pp. 218-224, 1984.
[19] Y. Hel-Or, Pose Estimation from Uncertain Sensory Data, PhD thesis, Inst. of Computer Science, Hebrew Univ. of Jerusalem, 1993.
[20] A. Shmuel and M. Werman, “Active Vision: 3D Depth from an Image Sequence,” Proc. Int’l Conf. Pattern Recognition, pp. 48-54, 1990.
[21] L. Matthies and T. Kanade, “The Cycle of Uncertainty and Constraint in Robot Perception,” Int’l J. Robotics Research, vol. 4, 1987.

Index Terms:
Sensor fusion, Kalman filter, pose estimation, model-based, object recognition.
Yacov Hel-Or, Michael Werman, "Pose Estimation by Fusing Noisy Data of Different Dimensions," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 17, no. 2, pp. 195-201, Feb. 1995, doi:10.1109/34.368169