
Vision-only egomotion estimation in 6DOF using a sky compass

Published online by Cambridge University Press:  25 July 2018

Tasarinan Jouir*
Affiliation:
Queensland Brain Institute, QBI Building 79, University of Queensland, St Lucia, Brisbane, Queensland, Australia, 4072
Reuben Strydom
Affiliation:
Queensland Brain Institute, QBI Building 79, University of Queensland, St Lucia, Brisbane, Queensland, Australia, 4072; School of Information Technology and Electrical Engineering, General Purpose South Building 78, University of Queensland, St Lucia, Brisbane, Queensland, Australia, 4072. E-mails: [email protected], [email protected]
Thomas M. Stace
Affiliation:
School of Mathematics and Physics, Parnell Building 07, University of Queensland, St Lucia, Brisbane, Queensland, Australia, 4072. E-mail: [email protected]
Mandyam V. Srinivasan
Affiliation:
Queensland Brain Institute, QBI Building 79, University of Queensland, St Lucia, Brisbane, Queensland, Australia, 4072; School of Information Technology and Electrical Engineering, General Purpose South Building 78, University of Queensland, St Lucia, Brisbane, Queensland, Australia, 4072. E-mails: [email protected], [email protected]
*Corresponding author. E-mail: [email protected]

Summary

A novel pure-vision egomotion estimation algorithm is presented, with extensions to Unmanned Aerial System (UAS) navigation through visual odometry. Our proposed method computes egomotion in two stages using panoramic images segmented into sky and ground regions. Rotation (in 3DOF) is estimated using a customised algorithm to measure the motion of the sky image, which is affected only by the rotation of the aircraft and not by its translation. The rotation estimate is then used to derotate the optic flow field generated by the ground, from which the translation of the aircraft (in 3DOF) is estimated by another customised, iterative algorithm. Separating the rotation and translation estimations allows a partial relaxation of the planar-ground assumption, inherently increasing the robustness of the approach. The translation vectors are scaled using a stereo-based height estimate to compute the current UAS position through path integration for closed-loop navigation. Outdoor field tests of our approach on a small quadrotor UAS suggest that its performance is comparable to that of existing state-of-the-art vision-based navigation algorithms, whilst removing all dependence on additional sensors such as an inertial measurement unit (IMU) or global positioning system (GPS).
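For concreteness, the sketch below illustrates the two-stage structure described in the summary, assuming that matched unit view directions for sky and ground features are already available from the panoramic camera. It is not the authors' implementation: the paper's customised sky-compass and iterative flow algorithms, and its stereo-based scaling, are not reproduced here, so all function names and the least-squares formulation are illustrative assumptions only.

```python
# Hypothetical sketch of the two-stage egomotion pipeline outlined in the
# summary; NOT the authors' implementation. Inputs are assumed to be matched
# unit view directions (N x 3 arrays) for the sky and ground image regions.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R


def estimate_rotation(sky_prev, sky_curr):
    """Stage 1: fit the 3-DOF rotation mapping sky directions of the previous
    frame onto the current frame. Sky features are effectively at infinity,
    so their image motion depends only on rotation, not translation."""
    def residual(rotvec):
        return (R.from_rotvec(rotvec).apply(sky_prev) - sky_curr).ravel()
    return R.from_rotvec(least_squares(residual, np.zeros(3)).x)


def estimate_translation_direction(ground_prev, ground_curr, rot):
    """Stage 2: derotate the ground directions with the sky-compass rotation,
    then iteratively fit the unit translation direction that best satisfies
    the coplanarity (epipolar) constraint of purely translational flow."""
    derot = rot.inv().apply(ground_curr)  # remove the rotational component

    def residual(t):
        t = t / (np.linalg.norm(t) + 1e-12)
        # For pure translation t: d_curr, t and d_prev are coplanar,
        # i.e. d_curr . (t x d_prev) = 0 for every matched pair.
        return np.einsum('ij,ij->i', derot, np.cross(t, ground_prev))

    t = least_squares(residual, np.array([1.0, 0.0, 0.0])).x
    return t / np.linalg.norm(t)


def integrate_position(position, world_rot, t_dir, step_length):
    """Path integration: rotate the body-frame translation direction into the
    world frame and accumulate. step_length is the metric magnitude of the
    inter-frame translation, which the paper obtains by scaling the visual
    estimate with a stereo-based height measurement (not modelled here)."""
    return position + step_length * world_rot.apply(t_dir)
```

In this formulation the scale ambiguity of monocular flow is confined to step_length, which is where the stereo-based height measurement would enter for closed-loop path integration.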

Type
Articles
Copyright
© Cambridge University Press 2018
