2016 International Conference on Frontiers of Information Technology (FIT) (2016)
Islamabad, Pakistan
Dec. 19, 2016 to Dec. 21, 2016
ISBN: 978-1-5090-5300-1
pp: 34-38
Jonathan Chandra , School of Electrical Engineering and Informatics, Bandung Institute of Technology, Ganesha 10, Bandung 40132, Jawa Barat, Indonesia
Ary Setijadi Prihatmanto , School of Electrical Engineering and Informatics, Bandung Institute of Technology, Ganesha 10, Bandung 40132, Jawa Barat, Indonesia
ABSTRACT
Navigation is a fundamental problem related to localization and positioning for the humanoid robot Nao. Odometry is one of many techniques able to solve it. The Nao humanoid robot does not have a dedicated odometry module, and the only odometry sensor currently available on it is the IMU module. An odometry sensor that has attracted interest in recent times, is commonly fused with an IMU, is easy to mount, and is not hard to obtain is a visual sensor, i.e. a camera. In this paper we show how to implement and place stereo cameras on a Nao robot that has no mounting points for external cameras. By using a stereo rather than a monocular camera approach, we can obtain odometry measurements at actual (metric) scale. The software used in this paper is OpenCV, which handles the whole process from image processing to pose estimation. The stereo visual odometry system that we have designed and implemented on the humanoid robot Nao can now be used to measure simple movements and robot rotation. By extending this result, an odometry system can be designed and implemented on a larger infrastructure, such as the Lumen Robot Friend platform. This also opens the way to 3D scene reconstruction with a Nao robot and to other research based on localization.
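The abstract describes an OpenCV pipeline running from image processing to pose estimation, with stereo providing metric scale. The paper does not specify its exact feature detector or solver, so the sketch below is only a minimal illustration of one common stereo visual odometry step, assuming rectified stereo pairs, a known intrinsic matrix K, and a known baseline; all function and variable names (e.g. stereo_vo_step) are illustrative, not from the paper.

import numpy as np
import cv2

def stereo_vo_step(left_prev, right_prev, left_curr, K, baseline):
    """Estimate camera motion between two frames from one rectified stereo pair."""
    orb = cv2.ORB_create(1000)

    # Match features between the previous left and right images to recover
    # metric depth from the known baseline.
    kp_l, des_l = orb.detectAndCompute(left_prev, None)
    kp_r, des_r = orb.detectAndCompute(right_prev, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    stereo_matches = matcher.match(des_l, des_r)

    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    pts3d, pts_l = [], []
    for m in stereo_matches:
        ul, vl = kp_l[m.queryIdx].pt
        ur, _ = kp_r[m.trainIdx].pt
        disparity = ul - ur
        if disparity > 1.0:                    # discard near-zero disparity
            z = fx * baseline / disparity      # metric depth from stereo
            x = (ul - cx) * z / fx
            y = (vl - cy) * z / fy
            pts3d.append([x, y, z])
            pts_l.append([ul, vl])

    # Track the same left-image features into the current left frame.
    pts_l = np.float32(pts_l).reshape(-1, 1, 2)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(left_prev, left_curr, pts_l, None)
    good = status.ravel() == 1
    obj = np.float32(pts3d)[good]
    img = pts_curr[good].reshape(-1, 2)

    # PnP with RANSAC yields rotation and translation at actual (metric) scale.
    _, rvec, tvec, _ = cv2.solvePnPRansac(obj, img, K, None)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec

Chaining such per-frame motion estimates (accumulating R and tvec) gives the robot's trajectory; because depth comes from the stereo baseline, no external scale estimation is needed, which is the advantage over the monocular case noted above.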
INDEX TERMS
nao, navigation, visual, stereo, odometry, robot, humanoid
CITATION
Jonathan Chandra, Ary Setijadi Prihatmanto, "Stereo visual odometry system design on humanoid robot Nao", 2016 International Conference on Frontiers of Information Technology (FIT), pp. 34-38, 2016, doi:10.1109/FIT.2016.7857534