A Kalman Filter Approach to Direct Depth Estimation Incorporating Surface Structure
June 1999 (vol. 21 no. 6)
pp. 570-575

Abstract—The problem of depth-from-motion using a monocular image sequence is considered. A pixel-based model is developed for direct depth estimation within a Kalman filtering framework. A method is proposed for incorporating local surface structure into the Kalman filter. Experimental results are provided to illustrate the effect of structural information on depth estimation.
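To make the abstract's approach concrete, the sketch below shows a per-pixel scalar Kalman update of an inverse-depth estimate, plus a crude neighborhood fusion step standing in for "local surface structure." This is an illustrative reconstruction under assumed details, not the paper's formulation: the measurement model, the inverse-depth parameterization, the constant-state prediction, and the 3x3 local-mean surface prior are all assumptions made here for the example; the function names are hypothetical.

```python
import numpy as np

def kalman_depth_update(d_est, p_est, d_meas, r_meas, q_process=1e-4):
    """One scalar Kalman update of a per-pixel inverse-depth estimate.

    d_est, p_est : prior inverse depth and its variance
    d_meas, r_meas : new measurement (e.g. derived from image gradients
                     and known camera motion) and its variance
    q_process : process noise added at prediction (assumed constant here)
    """
    # Predict: inverse depth is assumed locally constant between frames,
    # so only the uncertainty grows.
    p_pred = p_est + q_process
    # Update: standard Kalman gain for a scalar state.
    k = p_pred / (p_pred + r_meas)
    d_new = d_est + k * (d_meas - d_est)
    p_new = (1.0 - k) * p_pred
    return d_new, p_new

def fuse_local_structure(depth_map, var_map, q_struct=1e-3):
    """Fuse each pixel with its 3x3 neighborhood mean, treated as a
    pseudo-measurement -- a crude stand-in for a local surface model
    (an assumption for this sketch, not the paper's method)."""
    h, w = depth_map.shape
    padded = np.pad(depth_map, 1, mode="edge")
    # 3x3 local mean via shifted sums.
    mean = np.zeros_like(depth_map)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            mean += padded[1 + di:1 + di + h, 1 + dj:1 + dj + w]
    mean /= 9.0
    # Standard scalar fusion of estimate and pseudo-measurement.
    k = var_map / (var_map + q_struct)
    return depth_map + k * (mean - depth_map), (1.0 - k) * var_map
```

With no process noise, a prior of (1.0, variance 1.0) fused with a measurement of (2.0, variance 1.0) gives the expected midpoint 1.5 with variance 0.5; the structure step leaves a perfectly planar (here, constant) depth map unchanged while shrinking its variance.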

[1] G. Adiv, "Determining Three-Dimensional Motion and Structure From Optical Flow Generated by Several Moving Objects," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 7, no. 4, pp. 384-401, 1985.
[2] J.Y. Aloimonos, I. Weiss, and A. Bandyopadhyay, "Active Vision," Int'l J. Computer Vision, vol. 1, pp. 333-356, 1987.
[3] Y. Aloimonos and Z. Duric, "Estimating the Heading Direction Using Normal Flow," Int'l J. Computer Vision, pp. 33-56, 1994.
[4] P.R. Belanger, "Estimation of Noise Covariance Matrices for a Linear Time-Varying Stochastic Process," Automatica, vol. 10, pp. 267-274, 1974.
[5] J. Heel, "Direct Dynamic Motion Vision," Proc. IEEE Conf. Robotics and Automation, pp. 1,142-1,147, 1990.
[6] B.K.P. Horn and E.J. Weldon, "Direct Methods for Recovering Motion," Int'l J. Computer Vision, vol. 2, pp. 51-76, 1988.
[7] L. Huang and Y. Aloimonos, "How Normal Flow Constrains Relative Depth for an Active Observer," Image and Vision Computing, pp. 435-445, 1994.
[8] L. Matthies, T. Kanade, and R. Szeliski, "Kalman Filter-Based Algorithms for Estimating Depth From Image Sequences," Int'l J. Computer Vision, vol. 3, pp. 209-236, 1989.
[9] H.-H. Nagel and W. Enkelmann, "An Investigation of Smoothness Constraints for the Estimation of Displacement Vector Fields From Image Sequences," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 8, no. 5, pp. 565-593, 1986.
[10] S. Negahdaripour and B.K.P. Horn, "Direct Passive Navigation," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 9, no. 1, pp. 168-176, Jan. 1987.
[11] K. Prazdny, "Determining the Instantaneous Direction of Motion From Optical Flow Generated by a Curvilinearly Moving Observer," Computer Vision, Graphics and Image Processing, vol. 17, pp. 238-248, 1981.
[12] G. Sandini and M. Tistarelli, "Active Tracking Strategy for Monocular Depth Inference Over Multiple Frames," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 12, no. 1, pp. 13-27, 1990.
[13] D. Sinclair, A. Blake, and D. Murray, "Robust Estimation of Egomotion From Normal Flow," Int'l J. Computer Vision, pp. 57-69, 1994.
[14] R.Y. Tsai and T.S. Huang, "Uniqueness and Estimation of Three-Dimensional Motion Parameters of Rigid Objects With Curved Surfaces," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 6, no. 1, pp. 337-351, 1984.
[15] A.M. Waxman, B. Kamgar-Parsi, and M. Subbarao, "Closed-Form Solutions to Image Flow Equations for 3D Structure and Motion," Int'l J. Computer Vision, vol. 1, pp. 239-258, 1987.
[16] J. Weng, T.S. Huang, and N. Ahuja, Motion and Structure From Image Sequences, Springer Series in Information Sciences. Berlin: Springer-Verlag, 1993.
[17] Y. Xiong and S.A. Shafer, "Dense Structure From a Dense Optical Flow Sequence," Int'l Symp. Computer Vision, pp. 1-6, Coral Gables, Fla., 1995 (also Carnegie Mellon University Technical Report CMU-RI-TR-95-10).
[18] H. Zhuang, R. Sudhakar, and J. Shieh, "Depth Estimation From a Sequence of Monocular Images With Known Camera Motion," Robotics and Autonomous Systems, vol. 13, pp. 87-95, 1994.

Index Terms:
Depth-from-motion, Kalman filter, gradient method, surface structure, image sequence.
Citation:
Y.S. Hung and H.T. Ho, "A Kalman Filter Approach to Direct Depth Estimation Incorporating Surface Structure," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 21, no. 6, pp. 570-575, June 1999, doi:10.1109/34.771330