Analysis of Camera Behavior During Tracking
August 1995 (vol. 17, no. 8)
pp. 765-778

Abstract—A camera is mounted on a moving robot and can rotate, relative to the robot, about two axes. We show how the optical flow field can be used to control the camera’s motion so that a target is kept at the center of the camera’s field of view, and we show that this is not always possible when the target lies close to the plane defined by the camera’s two axes of rotation.
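
As a purely illustrative sketch, and not the control law derived in the paper, the following Python fragment shows the kind of fixation loop the abstract alludes to: the target's measured flow at the image centre is mapped to pan and tilt rate commands that cancel it. The focal length f, the pixel units, and the small-angle mapping are assumptions introduced here for the example.

# Illustrative sketch only -- not the control law derived in the paper.
# Assumes a pan/tilt camera with focal length f (in pixels) and that the
# target's optical flow (u, v), in pixels/s, has been measured at the
# principal point. "pan" and "tilt" are hypothetical names for the two
# rotation axes.

def fixation_rates(u, v, f):
    """Map the target's image flow at the principal point to pan/tilt
    angular rates (rad/s) intended to cancel that flow.

    For a target near the image centre, a displacement of one pixel
    corresponds to roughly 1/f radians, so each rate is the flow
    component scaled by 1/f and negated to drive the flow to zero.
    """
    pan_rate = -u / f    # counter the horizontal flow component
    tilt_rate = -v / f   # counter the vertical flow component
    return pan_rate, tilt_rate

# Example: a 10 pixel/s horizontal drift with f = 500 pixels calls for a
# pan rate of about -0.02 rad/s (roughly -1.15 deg/s).
print(fixation_rates(10.0, 0.0, 500.0))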

When the target is held at the center of the camera’s field of view, the magnitude of the camera’s angular velocity about one axis never exceeds the magnitude of the flow vector associated with the target, but the angular velocity about the other axis depends inversely on the distance of the target from that axis, and hence can become large as this distance becomes small. Situations in which the magnitudes of the camera’s angular velocity and acceleration become large are considered in the special case where the relative motion between the robot and its environment is purely translational. The tracking strategy is evaluated experimentally using computer-generated optical flow fields.
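
As a back-of-the-envelope illustration (an assumption made here, not a result taken from the paper), suppose the required rate about the second axis is roughly the target's flow magnitude divided by the target's distance d from that axis. The snippet below simply tabulates this ratio to show how the commanded rate grows without bound as d shrinks, which is the qualitative behavior the abstract describes near the singular configuration.

# Illustrative only: the 1/d scaling is a simplifying assumption, not the
# paper's exact expression.
flow_mag = 0.05                      # |flow| at the target (arbitrary units)
for d in (1.0, 0.1, 0.01, 0.001):    # target's distance from the second axis
    print(f"d = {d:6.3f}  ->  required rate ~ {flow_mag / d:10.1f}")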

Index Terms:
Optical flow, gaze control, active vision.
Citation:
Swarup Reddi, George Loizou, "Analysis of Camera Behavior During Tracking," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 17, no. 8, pp. 765-778, Aug. 1995, doi:10.1109/34.400566