
Combined visual odometry and visual compass for off-road mobile robots localization

Published online by Cambridge University Press: 05 October 2011

Ramon Gonzalez* (Department of Languages and Computation, University of Almería, Almería, Spain)
Francisco Rodriguez (Department of Languages and Computation, University of Almería, Almería, Spain)
Jose Luis Guzman (Department of Languages and Computation, University of Almería, Almería, Spain)
Cedric Pradalier (Autonomous Systems Lab, ETH Zurich, Zurich, Switzerland)
Roland Siegwart (Autonomous Systems Lab, ETH Zurich, Zurich, Switzerland)

*Corresponding author. E-mail: [email protected].

Summary

In this paper, we present the application of a visual odometry approach to estimating the location of mobile robots operating in off-road conditions. The approach is based on template matching: the robot displacement is estimated by matching two consecutive images. Standard visual odometry is improved with a visual compass method for orientation estimation. For this purpose, two consumer-grade monocular cameras are employed: one points at the ground beneath the robot, and the other looks at the surrounding environment. Comparisons with popular localization approaches, through physical experiments in off-road conditions, show the satisfactory behavior of the proposed strategy.
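To make the two estimation steps concrete, below is a minimal sketch using OpenCV template matching: displacement from the ground-facing camera and heading change from the forward-facing camera. The function names, template size, strip width, and field-of-view value are illustrative assumptions, not the authors' implementation.

```python
import cv2

def estimate_displacement(prev_ground, curr_ground, tpl_size=64):
    """Estimate (dx, dy) pixel displacement between two consecutive
    ground-camera images via template matching (illustrative sketch)."""
    h, w = prev_ground.shape[:2]
    # Take a template from the centre of the previous frame.
    y0, x0 = h // 2 - tpl_size // 2, w // 2 - tpl_size // 2
    template = prev_ground[y0:y0 + tpl_size, x0:x0 + tpl_size]
    # Normalised cross-correlation is fairly robust to lighting changes.
    scores = cv2.matchTemplate(curr_ground, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)
    # Offset of the best match relative to the template's original position.
    dx, dy = max_loc[0] - x0, max_loc[1] - y0
    return dx, dy, max_val  # max_val can be used to reject weak matches

def estimate_heading_change(prev_front, curr_front, hfov_deg=60.0):
    """Visual-compass step: estimate the heading change (degrees) from the
    horizontal shift aligning two consecutive forward-camera images."""
    h, w = prev_front.shape[:2]
    # Match a central vertical strip of the previous image in the current
    # one; its horizontal offset approximates the rotation between frames.
    strip = prev_front[:, w // 4: 3 * w // 4]
    scores = cv2.matchTemplate(curr_front, strip, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(scores)
    shift_px = max_loc[0] - w // 4
    # Convert pixel shift to an angle using the camera's horizontal FOV.
    return shift_px * hfov_deg / w
```

Integrating these per-frame increments into the robot pose is the usual dead-reckoning step; in practice the ground-camera pixel displacement would be scaled to metres using the known camera height and calibration.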

Type: Articles
Copyright: © Cambridge University Press 2011

