Direct Recovery of Motion and Shape in the General Case by Fixation
August 1992 (vol. 14 no. 8)
pp. 847-853

A direct method called fixation is introduced for solving the general motion vision problem: arbitrary motion relative to an arbitrary environment. The method yields a linear constraint equation that explicitly expresses the rotational velocity in terms of the translational velocity. Combining this fixation constraint with the brightness-change constraint equation solves the general motion vision problem. The motivation behind this direct method is to avoid correspondence and optical flow; instead, it uses image brightness information, such as temporal and spatial brightness gradients, directly. In contrast with previous direct methods, the fixation method places no severe restrictions on the motion or the environment. Moreover, it neither requires tracked images as input nor uses tracking to obtain fixated images. Instead, it introduces a pixel-shifting process that constructs fixated images for any arbitrary fixation point, entirely in software and without any use of camera motion for tracking.
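The brightness-change constraint equation underlying direct methods relates the spatial and temporal brightness gradients to the image motion (u, v) at each pixel: E_x u + E_y v + E_t = 0. As a minimal sketch (not the paper's implementation; function names and the gradient approximations are illustrative), the gradients can be estimated from two consecutive frames and the constraint residual evaluated for a hypothesized motion:

```python
import numpy as np

def brightness_gradients(frame0, frame1):
    """Estimate spatial and temporal brightness gradients from two frames.

    Simple finite-difference approximations; frames are 2-D float arrays
    of equal shape. (Illustrative sketch, not the paper's scheme.)
    """
    E = 0.5 * (frame0 + frame1)   # average frames to reduce noise
    Ey, Ex = np.gradient(E)       # spatial gradients (axis 0 = y, axis 1 = x)
    Et = frame1 - frame0          # temporal gradient (unit time step)
    return Ex, Ey, Et

def bcce_residual(Ex, Ey, Et, u, v):
    """Brightness-change constraint: Ex*u + Ey*v + Et should be near zero
    when (u, v) is the true image motion at each pixel."""
    return Ex * u + Ey * v + Et

# Synthetic check: a pattern shifted right by one pixel per frame,
# so the true motion is (u, v) = (1, 0).
x = np.arange(16, dtype=float)
frame0 = np.tile(np.sin(0.4 * x), (16, 1))
frame1 = np.tile(np.sin(0.4 * (x - 1.0)), (16, 1))
Ex, Ey, Et = brightness_gradients(frame0, frame1)
res = bcce_residual(Ex, Ey, Et, u=1.0, v=0.0)
```

For the correct motion hypothesis the residual is small everywhere (limited only by the discretization error of the gradients), while a wrong hypothesis leaves a large residual; the fixation method exploits this one equation per pixel, combined with its fixation constraint, without ever computing optical flow.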


Index Terms:
temporal brightness gradients; shape recovery; motion recovery; fixation; motion vision; linear constraint equation; rotational velocity; translational velocity; brightness-change constraint equation; image brightness; spatial brightness gradients; tracking; brightness; computer vision; pattern recognition
Citation:
M.A. Taalebinezhaad, "Direct Recovery of Motion and Shape in the General Case by Fixation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 8, pp. 847-853, Aug. 1992, doi:10.1109/34.149584