Issue No. 05 - May 2012 (vol. 34)
pp. 1024-1031
Chris McCarthy, Australian National University, Canberra
Nick Barnes, Australian National University, Canberra
ABSTRACT
We present a new visual control input from optical flow divergence that enables the design of novel, unified control laws for docking and landing. While divergence-based time-to-contact estimation is well understood, the use of divergence in visual control currently assumes knowledge of surface orientation and/or egomotion. There exists no directly observable visual cue capable of supporting approaches to surfaces of arbitrary orientation under general motion. Central to our measure is the use of the maximum flow field divergence on the view sphere (max-div). We prove kinematic properties governing the location of max-div and show that max-div provides a temporal measure of proximity. From this, we contribute novel control laws for regulating both approach velocity and angle of approach toward planar surfaces of arbitrary orientation, without structure-from-motion recovery. The strategy is tested in simulation, over real image sequences, and in closed-loop control of docking/landing maneuvers on a mobile platform.
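To illustrate the general idea of divergence-based visual control summarized above, the following minimal Python sketch (not the authors' implementation) shows how flow-field divergence can be estimated numerically from a dense optical flow field and used in a simple proportional law that regulates approach speed toward a divergence set-point. The function names, the planar-image approximation, and the control gains are assumptions for illustration only; the paper's max-div measure on the view sphere is not reproduced here.

    import numpy as np

    def flow_divergence(u, v, spacing=1.0):
        # Numerical divergence of a dense optical flow field.
        # u, v: 2D arrays of horizontal/vertical flow components (pixels/frame).
        # For a fronto-parallel surface under pure translation along the
        # optical axis, divergence equals 2/tau, where tau is time-to-contact.
        du_dx = np.gradient(u, spacing, axis=1)
        dv_dy = np.gradient(v, spacing, axis=0)
        return du_dx + dv_dy

    def regulate_approach(v_cmd, div_measured, div_target, gain=0.5):
        # Proportional adjustment of the forward velocity command so that the
        # observed divergence tracks a constant set-point: if divergence grows
        # (the surface looms faster), the command is reduced, and vice versa.
        return v_cmd + gain * (div_target - div_measured)

Under these assumptions, holding divergence near a constant set-point makes the commanded velocity decay roughly in proportion to remaining distance, which is the qualitative slow-down behaviour exploited by divergence-based docking and landing strategies.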
INDEX TERMS
Robot vision, visuo-motor control, visual navigation, optical flow.
CITATION
Chris McCarthy, Nick Barnes, "A Unified Strategy for Landing and Docking Using Spherical Flow Divergence," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 34, no. 5, pp. 1024-1031, May 2012, doi: 10.1109/TPAMI.2012.27