
Homing with stereovision

Published online by Cambridge University Press: 28 May 2015

Paramesh Nirmal
Affiliation:
Robotics and Computer Vision Lab, Fordham University, Bronx, NY, USA
Damian M. Lyons*
Affiliation:
Robotics and Computer Vision Lab, Fordham University, Bronx, NY, USA
*Corresponding author. E-mail: [email protected]

Summary

Visual homing is a navigation method in which a stored image of a goal location is compared with the current image to determine how to navigate back to that goal. It is theorized that insects such as ants and bees employ visual homing to return to their nest or hive, and, inspired by this, several researchers have developed elegant robot visual homing algorithms. Depth information, whether recovered from visual scale or from another modality such as laser ranging, can improve the quality of homing. While insects are not well equipped for stereovision, stereovision is an effective robot sensor. We describe the challenges involved in using stereovision-derived depth in visual homing and our proposed solutions. Our algorithm, Homing with Stereovision (HSV), uses a stereo camera mounted on a pan-tilt unit to build composite wide-field stereo images and to estimate the distance and orientation from the robot to the goal location. HSV is evaluated in a set of 200 indoor trials using two Pioneer 3-AT robots, showing that it effectively leverages stereo depth information compared with a depth-from-scale approach.
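To make the idea concrete, here is a minimal sketch of the core computation the summary describes: match features between the stored goal image and the current image, back-project the matches to 3D using stereo depth, and align the two point sets to recover the translation and rotation from the robot's current pose to the goal. This is a hypothetical Python/OpenCV illustration, not the authors' HSV implementation; the function names, the pinhole intrinsics (fx, fy, cx, cy), and the Kabsch-style alignment step are assumptions of the sketch.

```python
import numpy as np
import cv2

def backproject(pt, depth, fx, fy, cx, cy):
    """Back-project one pixel to a camera-frame 3D point using a
    stereo depth map and pinhole intrinsics; None if depth is invalid."""
    u, v = int(round(pt[0])), int(round(pt[1]))
    z = float(depth[v, u])
    if z <= 0.0:  # no valid disparity at this pixel
        return None
    return np.array([(pt[0] - cx) * z / fx, (pt[1] - cy) * z / fy, z])

def homing_vector(goal_img, cur_img, goal_depth, cur_depth, fx, fy, cx, cy):
    """Estimate the rigid transform (R, t) mapping current-view 3D points
    onto goal-view 3D points, and the homing direction in the current frame."""
    sift = cv2.SIFT_create()
    kg, dg = sift.detectAndCompute(goal_img, None)
    kc, dc = sift.detectAndCompute(cur_img, None)

    # Lowe's ratio test to keep only distinctive matches.
    matcher = cv2.BFMatcher()
    good = [p[0] for p in matcher.knnMatch(dg, dc, k=2)
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]

    # Lift each surviving match to 3D in both views.
    P, Q = [], []  # P: goal-frame points, Q: current-frame points
    for m in good:
        pg = backproject(kg[m.queryIdx].pt, goal_depth, fx, fy, cx, cy)
        pc = backproject(kc[m.trainIdx].pt, cur_depth, fx, fy, cx, cy)
        if pg is not None and pc is not None:
            P.append(pg)
            Q.append(pc)
    P, Q = np.asarray(P), np.asarray(Q)

    # Kabsch alignment: find R, t minimizing sum ||R q_i + t - p_i||^2.
    mp, mq = P.mean(axis=0), Q.mean(axis=0)
    H = (Q - mq).T @ (P - mp)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # repair a reflection, if any
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mp - R @ mq

    # Goal camera position expressed in the current camera frame:
    # the direction the robot should move to home on the goal.
    return -R.T @ t, R
```

In practice the alignment would be wrapped in a RANSAC loop to reject outlier matches, and the goal and current images would be the composite wide-field stereo images built by panning the stereo head, as the summary describes.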

Type: Articles
Copyright © Cambridge University Press 2015

