
Micro Aerial Vehicle Navigation with Visual-Inertial Integration Aided by Structured Light

Published online by Cambridge University Press:  01 July 2019

Yunshu Wang
Affiliation:
(Navigation Research Center, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, 211106, China) (Jiangsu Key Laboratory of Internet of Things and Control Technologies (NUAA), Nanjing, 211106, China) (School of Civil and Environmental Engineering, University of New South Wales, Sydney, NSW 2052, Australia)
Jianye Liu
Affiliation:
(Navigation Research Center, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, 211106, China) (Jiangsu Key Laboratory of Internet of Things and Control Technologies (NUAA), Nanjing, 211106, China)
Jinling Wang
Affiliation:
(School of Civil and Environmental Engineering, University of New South Wales, Sydney, NSW 2052, Australia)
Qinghua Zeng*
Affiliation:
(Navigation Research Center, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, 211106, China) (Jiangsu Key Laboratory of Internet of Things and Control Technologies (NUAA), Nanjing, 211106, China)
Xuesong Shen
Affiliation:
(School of Civil and Environmental Engineering, University of New South Wales, Sydney, NSW 2052, Australia)
Yueyuan Zhang
Affiliation:
(Navigation Research Center, College of Automation Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing, 211106, China)

Abstract

Considering that traditional visual navigation cannot be utilised in low-illumination and feature-sparse environments, a novel visual-inertial integrated navigation method using a Structured Light Visual (SLV) sensor is proposed for Micro Aerial Vehicles (MAVs) in this paper. First, the measurement model of the SLV sensor is studied and built. Then, with a state model based on the error equations of an Inertial Navigation System (INS), a measurement model is constructed from the error between the relative motion measured by the INS and that measured by the SLV sensor. Because these measurements relate mainly to the position and attitude at the present moment, the state-error accumulation of traditional visual-inertial navigation is avoided. An Adaptive Sage-Husa Kalman Filter (ASHKF) based on multiple weighting factors is designed to make full use of the SLV measurements. The results of simulations and of an experiment based on real flight data indicate that high-accuracy position and attitude estimates can be obtained with the proposed algorithm.
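To illustrate the adaptive filtering idea named in the abstract, the following is a minimal sketch of a Sage-Husa adaptive Kalman filter in which the measurement-noise covariance is re-estimated from the innovation sequence with a single fading factor. The state model, the fading-factor form and all parameter values here are illustrative assumptions; they are not the paper's actual INS-error formulation, which uses multiple weighting factors.

```python
import numpy as np

def sage_husa_kf(zs, F, H, Q, R0, x0, P0, b=0.95):
    """Sage-Husa adaptive Kalman filter (illustrative sketch).

    R is re-estimated at each step from the innovation sequence,
    weighted by the fading factor d_k = (1 - b) / (1 - b**(k+1)),
    so recent innovations dominate as k grows.
    """
    x, P, R = x0.copy(), P0.copy(), R0.copy()
    estimates = []
    for k, z in enumerate(zs):
        # Predict step with the (assumed linear) state model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Adapt R from the innovation (Sage-Husa recursion).
        d = (1.0 - b) / (1.0 - b ** (k + 1))
        v = z - H @ x                                   # innovation
        R = (1.0 - d) * R + d * (np.outer(v, v) - H @ P @ H.T)
        # Crude safeguard for this sketch: keep R from going negative.
        R = np.maximum(R, 1e-9 * np.eye(R.shape[0]))
        # Standard Kalman update with the adapted R.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ v
        P = (np.eye(P.shape[0]) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)

# Toy usage: estimate a constant scalar from noisy measurements
# whose true noise level differs from the initial guess R0.
rng = np.random.default_rng(0)
zs = 5.0 + 0.5 * rng.standard_normal((200, 1))
est = sage_husa_kf(zs, np.eye(1), np.eye(1), 1e-6 * np.eye(1),
                   np.eye(1), np.zeros(1), np.eye(1))
```

The fading factor makes the filter discount old innovations geometrically, which is what lets it track a measurement-noise level that was initially mis-specified, at the cost of some extra variance in the estimate of R.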

Type
Research Article
Copyright
Copyright © The Royal Institute of Navigation 2019 

