Omni-Directional Stereo
February 1992 (vol. 14 no. 2)
pp. 257-262

Omnidirectional views of an indoor environment, taken at different locations, are integrated into a global map. A single camera swiveling about its vertical axis takes consecutive images and arranges them into a panoramic representation, which provides rich information around the observation point: a precise omnidirectional view of the environment and coarse ranges to objects in it. Using the coarse map, the system autonomously plans consecutive observations at the intersections of lines connecting object points, where the imaging directions can be estimated easily and precisely. From two panoramic views at the two planned locations, a modified binocular stereo method yields a local map that is more precise but has direction-dependent uncertainties. New observation points are selected to decrease this uncertainty, yielding another local map, which is then integrated with the adjacent local maps into a more reliable global representation of the world.
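The core geometric step described above, recovering an object's position from bearings measured in two panoramic views, amounts to intersecting two rays in the ground plane. A minimal sketch of that triangulation follows; the function and parameter names are hypothetical illustrations, not the authors' implementation, and the direction-dependent uncertainty arises where the rays become nearly parallel (the denominator below approaches zero).

```python
import math

def triangulate(p1, theta1, p2, theta2):
    """Intersect two bearing rays in the plane.

    p1, p2      -- (x, y) observation points (camera locations)
    theta1/2    -- azimuth bearings (radians) to the same object point,
                   as read off each panoramic representation
    Returns the (x, y) intersection, or None if the rays are
    (nearly) parallel, i.e. the range is unrecoverable.
    """
    # Unit direction vectors of the two viewing rays.
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))

    # 2-D cross product; near zero means the bearings are parallel
    # and the triangulation is ill-conditioned (large uncertainty).
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return None

    # Solve p1 + t*d1 = p2 + s*d2 for t by Cramer's rule.
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

# Example: cameras at (0,0) and (2,0) both sight a point at (1,1).
pt = triangulate((0.0, 0.0), math.pi / 4, (2.0, 0.0), 3 * math.pi / 4)
```

Observation planning in the paper can be read as choosing the second camera location so that, for the objects of interest, this denominator stays well away from zero.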

[1] J. Aloimonos, A. Bandopadhay, and I. Weiss, "Active vision," in Proc. 1st Int. Conf. Comput. Vision, 1987, pp. 35-54.
[2] D. H. Ballard, "Reference frame for animate vision," in Proc. Int. Joint Conf. Artificial Intell., 1989, pp. 1635-1641.
[3] G. Sandini and M. Tistarelli, "Active tracking strategy for monocular depth inference over multiple frames," IEEE Trans. Pattern Anal. Mach. Intell., vol. 12, no. 1, 1990.
[4] N. Ayache and O. D. Faugeras, "Building, registrating and fusing noisy visual maps," in Proc. 1st Int. Conf. Comput. Vision, 1987, pp. 73-82.
[5] R. C. Bolles et al., "Epipolar-plane image analysis: An approach to determining structure from motion," Int. J. Comput. Vision, vol. 1, no. 1, 1987, pp. 7-56.
[6] J. Y. Zheng and S. Tsuji, "Panoramic representation of scenes for route understanding," in Proc. 10th Int. Conf. Patt. Recogn., 1990, pp. 161-167.
[7] S. J. Oh and E. L. Hall, "Guidance of a mobile robot using an omnidirectional vision navigation system," in Proc. SPIE, vol. 852, 1987.
[8] T. Morita et al., "Measurement in three dimensions by motion stereo and spherical mapping," in Proc. Comput. Vision Patt. Recogn., 1989, pp. 422-428.
[9] R. A. Jarvis and J. C. Byrne, "An automated guided vehicle with map building and path finding capabilities," in Proc. 4th Int. Symp. Robotics Res., 1988, pp. 497-504.
[10] Y. Yagi and S. Kawato, "Panorama scene analysis with conic projection," in Proc. IEEE Int. Workshop Intell. Robots Syst., 1990, pp. 181-190.
[11] K. B. Sarachik, "Characterizing an indoor environment with a mobile robot and uncalibrated stereo," in Proc. IEEE Int. Conf. Robotics Automat., 1989, pp. 984-989.
[12] H. Ishiguro, M. Yamamoto, and S. Tsuji, "Acquiring precise range information from camera motion," in Proc. IEEE Int. Conf. Robotics Automat., 1991.
[13] Y. Liu and T. S. Huang, "Estimation of rigid body motion using straight line correspondences: Further results," in Proc. 8th Int. Conf. Patt. Recogn., 1986, pp. 306-307.
[14] A. Mitiche, S. Seida, and J. K. Aggarwal, "Interpretation of structure and motion using straight line correspondences," in Proc. 8th Int. Conf. Patt. Recogn., 1986, pp. 1110-1112.
[15] K. Sugihara, "Some location problems for robot navigation using a single camera," Comput. Vision Graphics Image Processing, vol. 42, pp. 112-129, 1988.

Index Terms:
active vision; computer vision; omnidirectional stereo vision; autonomous planning; global map; panoramic representation; coarse map; local map; computerised pattern recognition; planning (artificial intelligence)
Citation:
H. Ishiguro, M. Yamamoto, S. Tsuji, "Omni-Directional Stereo," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 2, pp. 257-262, Feb. 1992, doi:10.1109/34.121792