
A Novel Approach to Visual Navigation based on Feature Line Correspondences for Precision Landing

Published online by Cambridge University Press: 08 June 2018

Wei Shao
Affiliation:
(College of Automation & Electronic Engineering, Qingdao University of Science and Technology, Qingdao 266042, P.R. China)
Tianhao Gu*
Affiliation:
(College of Automation & Electronic Engineering, Qingdao University of Science and Technology, Qingdao 266042, P.R. China)
Yin Ma
Affiliation:
(College of Automation & Electronic Engineering, Qingdao University of Science and Technology, Qingdao 266042, P.R. China)
Jincheng Xie
Affiliation:
(College of Automation & Electronic Engineering, Qingdao University of Science and Technology, Qingdao 266042, P.R. China)
Liang Cao
Affiliation:
(College of Automation & Electronic Engineering, Qingdao University of Science and Technology, Qingdao 266042, P.R. China)

Abstract

To satisfy the needs of precise pin-point landing missions in deep space exploration, this paper proposes a method based on feature line extraction and matching to estimate the attitude and position of a lander during the descent phase. Linear equations for the lander's motion parameters are derived from at least three feature lines on the planetary surface and their two-dimensional image projections. Candidate solutions are then obtained by taking advantage of Singular Value Decomposition (SVD). Finally, the lander's unique attitude and position relative to the landing site are selected from among the candidate solutions. Simulation results show that the proposed algorithm estimates the lander's attitude and position robustly and quickly. Without an extended Kalman filter, the average attitude errors are less than 1° and the average position errors are less than 10 m at an altitude of 2,000 m. With an extended Kalman filter, attitude errors are within 0·5° and position errors are within 1 m at an altitude of 247·9 m.
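
The abstract describes forming linear equations in the lander's pose from feature-line correspondences and solving them with SVD. One standard way to set up such equations (not necessarily the paper's formulation) is to back-project each matched image line into a plane through the camera centre and require the corresponding three-dimensional line to lie in that plane. The Python/NumPy sketch below implements this generic, DLT-style variant under explicit assumptions: noise-free, pre-matched correspondences given directly as plane normals, more lines than the paper's minimum of three so that the naive linear system is well determined, and no feature-line extraction, candidate-solution selection or extended Kalman filtering. All names (estimate_pose_from_lines, its arguments, the synthetic check) are illustrative, not taken from the paper.

import numpy as np

def estimate_pose_from_lines(normals, points, directions):
    """Generic DLT-style pose estimation from 3D line / image line correspondences.

    normals    : (N, 3) unit normals (camera frame) of the planes back-projected
                 from the matched image lines.
    points     : (N, 3) a point on each 3D feature line (world frame).
    directions : (N, 3) unit direction of each 3D feature line (world frame).
    Returns R, t such that x_cam = R @ x_world + t.  Needs roughly N >= 6 lines
    in general position; with only three lines this naive stacking is
    under-determined.
    """
    rows = []
    for n, p, d in zip(normals, points, directions):
        # Direction constraint  n^T (R d) = 0  ->  kron(d, n) . vec(R) = 0
        rows.append(np.concatenate([np.kron(d, n), np.zeros(3)]))
        # Point constraint      n^T (R p + t) = 0
        rows.append(np.concatenate([np.kron(p, n), n]))
    A = np.asarray(rows)

    # The stacked system A [vec(R); t] = 0 is homogeneous: the right singular
    # vector of the smallest singular value gives the solution up to scale.
    _, _, Vt = np.linalg.svd(A)
    x = Vt[-1]
    R_hat = x[:9].reshape(3, 3, order='F')   # undo column-stacking of vec(R)
    t_hat = x[9:]

    # Fix the unknown scale so that ||R||_F = sqrt(3) and det(R) > 0.
    s = np.sqrt(3.0) / np.linalg.norm(R_hat)
    if np.linalg.det(R_hat) < 0:
        s = -s
    R_hat, t_hat = s * R_hat, s * t_hat

    # Project the scaled estimate onto SO(3) (orthogonal Procrustes step).
    U, _, Vt2 = np.linalg.svd(R_hat)
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt2)]) @ Vt2

    # Re-estimate t by linear least squares with the corrected R:  n^T t = -n^T R p.
    N_mat = np.asarray(normals)
    b = -np.einsum('ij,ij->i', N_mat, (R @ np.asarray(points).T).T)
    t, *_ = np.linalg.lstsq(N_mat, b, rcond=None)
    return R, t

# Synthetic self-check against a known pose (Rodrigues formula for R_true).
rng = np.random.default_rng(0)
axis = rng.standard_normal(3); axis /= np.linalg.norm(axis)
angle = 0.4
K = np.array([[0.0, -axis[2], axis[1]],
              [axis[2], 0.0, -axis[0]],
              [-axis[1], axis[0], 0.0]])
R_true = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
t_true = np.array([5.0, -3.0, 2000.0])
pts = rng.uniform(-500.0, 500.0, (8, 3))
dirs = rng.standard_normal((8, 3)); dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
# An image line back-projects to the plane spanned by the camera-frame line, so
# its normal is orthogonal to both a point on the line and the line direction.
normals = np.cross((R_true @ pts.T).T + t_true, (R_true @ dirs.T).T)
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
R_est, t_est = estimate_pose_from_lines(normals, pts, dirs)
print(np.allclose(R_est, R_true, atol=1e-6), np.allclose(t_est, t_true, atol=1e-3))

With exact, pre-matched lines this check recovers the ground-truth pose. The harder problem addressed by the paper, and skipped in the sketch, is doing this from the minimum of three noisy lines and then isolating the unique physically meaningful pose from the SVD candidate solutions.
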

Type
Research Article
Copyright
Copyright © The Royal Institute of Navigation 2018 


References

Akinlar, C. and Topal, C. (2011). EDLines: A real-time line segment detector with a false detection control. Pattern Recognition Letters, 32(13), 1633–1642.
Braun, R. D. and Manning, R. M. (2007). Mars exploration entry, descent, and landing challenges. Journal of Spacecraft and Rockets, 44(2), 310–323.
Burns, J. B., Hanson, A. R. and Riseman, E. M. (1986). Extracting straight lines. IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-8(4), 425–455.
Chang, G., Xu, T. and Wang, Q. (2017a). Error analysis of the 3D similarity coordinate transformation. GPS Solutions, 21(3), 963–971.
Chang, G., Xu, T., Wang, Q. and Liu, M. (2017b). Analytical solution to and error analysis of the quaternion-based similarity transformation considering measurement errors in both frames. Measurement, 110, 1–10.
Cheng, Y., Goguen, J., Johnson, A., Leger, C., Matthies, L., Martin, M. S. and Willson, R. (2004). The Mars exploration rovers descent image motion estimation system. IEEE Intelligent Systems, 19(3), 13–21.
Delaune, J., Le Besnerais, G., Voirin, T., Farges, J. L. and Bourdarias, C. (2016). Visual–inertial navigation for pinpoint planetary landing using scale-based landmark matching. Robotics and Autonomous Systems, 78, 63–82.
Elqursh, A. and Elgammal, A. (2011). Line-based relative pose estimation. In Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on, 3049–3056.
Etemadi, A. (1992). Robust segmentation of edge data. In Image Processing and its Applications, 1992 International Conference on. IET, 311–314.
Grompone von Gioi, R., Jakubowicz, J., Morel, J. M. and Randall, G. (2008). On straight line segment detection. Journal of Mathematical Imaging and Vision, 32(3), 313–347.
Grompone von Gioi, R., Jakubowicz, J., Morel, J. M. and Randall, G. (2010). LSD: A fast line segment detector with a false detection control. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(4), 722–732.
Johnson, E. A. and Mathies, H. L. (1999). Precise image-based motion estimation for autonomous small body exploration. Artificial Intelligence, Robotics and Automation in Space, 440, 627–634.
Johnson, A. E. and Golombek, M. P. (2012). Lander vision system for safe and precise entry descent and landing. In Concepts and Approaches for Mars Exploration, 1679, 1214.
Kubota, T., Hashimoto, T., Sawai, S., Kawaguchi, J. I., Ninomiya, K., Uo, M. and Baba, K. (2003). An autonomous navigation and guidance system for MUSES-C asteroid landing. Acta Astronautica, 52(2), 125–131.
Kumar, R. and Hanson, A. R. (1994). Robust methods for estimating pose and a sensitivity analysis. CVGIP: Image Understanding, 60(3), 313–342.
Li, S. and Cui, P. (2008). Landmark tracking based autonomous navigation schemes for landing spacecraft on asteroids. Acta Astronautica, 62(6), 391–403.
Ma, H. and Xu, S. (2014). Only feature point line-of-sight relative navigation in asteroid exploration descent stage. Aerospace Science and Technology, 39, 628–638.
Mirzaei, F. M. and Roumeliotis, S. I. (2011). Globally optimal pose estimation from line correspondences. In Robotics and Automation (ICRA), 2011 IEEE International Conference on, 5581–5588.
Panahandeh, G. and Jansson, M. (2014). Vision-aided inertial navigation based on ground plane feature detection. IEEE/ASME Transactions on Mechatronics, 19(4), 1206–1215.
Qin, T., Zhu, S., Cui, P. and Gao, A. (2014). An innovative navigation scheme of powered descent phase for Mars pinpoint landing. Advances in Space Research, 54(9), 1888–1900.
Shao, W., Gao, X., Xi, S., Leng, J. and Gu, T. (2016). Attitude and position determination based on craters for precision landing. Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering, 230(10), 1934–1942.
Steltzner, A. D., Miguel San Martin, A., Rivellini, T. P., Chen, A. and Kipp, D. (2014). Mars Science Laboratory entry, descent, and landing system development challenges. Journal of Spacecraft and Rockets, 51(4), 994–1003.
Umeyama, S. (1991). Least-squares estimation of transformation parameters between two point patterns. IEEE Transactions on Pattern Analysis and Machine Intelligence, 13(4), 376–380.
Woicke, S. and Mooij, E. (2018). Terrain relative navigation for planetary landing using stereo vision measurements obtained from hazard mapping. In Advances in Aerospace Guidance, Navigation and Control, Springer, Cham, 731–751.
Wolf, A. A., Graves, C., Powell, R. and Johnson, W. (2004). Systems for pinpoint landing at Mars. In 14th AIAA/AAS Space Flight Mechanics Meeting.
Zhang, L., Xu, C., Lee, K. M. and Koch, R. (2012a). Robust and efficient pose estimation from line correspondences. In Asian Conference on Computer Vision, 217–230.
Zhang, L. and Koch, R. (2013). An efficient and robust line segment matching approach based on LBD descriptor and pairwise geometric consistency. Journal of Visual Communication and Image Representation, 24(7), 794–805.
Zhang, X., Zhang, Z., Li, Y., Zhu, X., Yu, Q. and Ou, J. (2012b). Robust camera pose estimation from unknown or known line correspondences. Applied Optics, 51(7), 936–948.