2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Autonomous navigation and mapping using monocular low-resolution grayscale vision
Anchorage, AK, USA
June 23-28, 2008
ISBN: 978-1-4244-2339-2
Vidya N. Murali, Electrical and Computer Engineering Department, Clemson University, SC 29634, USA
Stanley T. Birchfield, Electrical and Computer Engineering Department, Clemson University, SC 29634, USA
An algorithm is proposed to address the challenges of autonomous corridor navigation and mapping by a mobile robot equipped with a single forward-facing camera. Using a combination of corridor ceiling lights, visual homing, and entropy, the robot is able to perform straight-line navigation down the center of an unknown corridor. Turning at the end of a corridor is accomplished using Jeffrey divergence and time-to-collision, while deflection from dead ends and blank walls relies on a scalar entropy measure of the entire image. When combined, these metrics allow the robot to navigate in both textured and untextured environments. The robot can autonomously explore an unknown indoor environment, recovering from difficult situations such as corners, blank walls, and an initial heading toward a wall. While exploring, the algorithm constructs a Voronoi-based topo-geometric map with nodes representing distinctive places such as doors, water fountains, and other corridors. Because the algorithm is based entirely upon low-resolution (32 × 24) grayscale images, processing occurs at over 1000 frames per second.
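To give a concrete sense of two of the image measures named in the abstract, the sketch below computes a scalar entropy of a low-resolution grayscale frame and a Jeffrey divergence between two frames' intensity histograms. This is a minimal illustration, not the authors' implementation: the bin count, the histogram-based form of the divergence, and the example 32 × 24 random frames are assumptions chosen for clarity, since the abstract only names the measures.

```python
import numpy as np

def scalar_entropy(image, bins=64):
    """Shannon entropy of a grayscale image's intensity histogram.

    The bin count and the use of a plain intensity histogram are
    illustrative assumptions; the paper does not specify them here.
    """
    hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))

def jeffrey_divergence(img_a, img_b, bins=64, eps=1e-12):
    """Symmetric (Jeffrey) divergence between two intensity histograms.

    Uses the symmetrized form common in the image-comparison literature,
    D(p, q) = sum_i [ p_i log(p_i / m_i) + q_i log(q_i / m_i) ],
    with m_i = (p_i + q_i) / 2; an assumed form for illustration.
    """
    pa, _ = np.histogram(img_a, bins=bins, range=(0, 256))
    pb, _ = np.histogram(img_b, bins=bins, range=(0, 256))
    pa = pa / pa.sum() + eps
    pb = pb / pb.sum() + eps
    m = (pa + pb) / 2.0
    return np.sum(pa * np.log(pa / m) + pb * np.log(pb / m))

# Example on two random 32 x 24 "frames" standing in for camera images.
frame_now = np.random.randint(0, 256, (24, 32), dtype=np.uint8)
frame_prev = np.random.randint(0, 256, (24, 32), dtype=np.uint8)
print(scalar_entropy(frame_now))
print(jeffrey_divergence(frame_now, frame_prev))
```

At 32 × 24 pixels each frame holds only 768 values, which is why per-frame measures like these can be evaluated at the frame rates reported in the abstract.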
Citation:
Vidya N. Murali and Stanley T. Birchfield, "Autonomous navigation and mapping using monocular low-resolution grayscale vision," in 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 1-8, 2008.