
A novel precise pose prediction algorithm for setting the sleeping mode of the Yutu-2 rover based on a multiview block bundle adjustment

Published online by Cambridge University Press:  02 May 2022

Song Peng
Affiliation:
Beijing Institute of Spacecraft System Engineering, Beijing, 100094, China
Youqing Ma*
Affiliation:
International Research Center of Big Data for Sustainable Development Goals, Beijing, 100094, China; Key Laboratory of Digital Earth Science, Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing, 100094, China
Xinchao Xu
Affiliation:
School of Geomatics, Liaoning Technical University, Fuxin, Liaoning, 123000, China
Yang Jia
Affiliation:
Beijing Institute of Spacecraft System Engineering, Beijing, 100094, China
Shaochuang Liu
Affiliation:
International Research Center of Big Data for Sustainable Development Goals, Beijing, 100094, China; Key Laboratory of Digital Earth Science, Aerospace Information Research Institute, Chinese Academy of Sciences, Beijing, 100094, China
*Corresponding author. E-mail: [email protected]

Abstract

To set the sleeping mode for the Yutu-2 rover, a visual pose prediction algorithm comprising terrain reconstruction and pose estimation was developed. Terrain reconstruction precision is limited when only the stereo navigation camera (Navcam) images and the rotation angles of the mast are used. Because the hazard camera (Hazcam) pose is fixed, an image network was constructed by linking all of the Navcam and Hazcam stereo images, and the Navcam pose was then refined based on a multiview block bundle adjustment. The experimental results show that the mean absolute error of the check points for the proposed algorithm was 10.4 mm over the range of $\boldsymbol{L}$ from 2.0 to 6.1 m, and the algorithm achieved good prediction results for the rover pose (the average differences of the pitch angle and the roll angle were −0.19 degrees and 0.29 degrees, respectively). With the support of the proposed algorithm, engineers successfully completed the remote setting of the sleeping mode for Yutu-2 in the Chang’e-4 mission operations.
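The final step described in the abstract, predicting the rover's pitch and roll from the reconstructed terrain, can be illustrated with a minimal sketch. The sketch below is an assumption-laden simplification, not the authors' method: it fits a plane to a patch of reconstructed terrain points by least squares and reads the pitch and roll angles off the fitted slopes, whereas the paper first refines the camera poses with a multiview block bundle adjustment before estimating the pose.

```python
import numpy as np

def fit_plane_pitch_roll(points):
    """Fit a plane z = a*x + b*y + c to terrain points (N x 3) by least
    squares and return (pitch, roll) of the fitted surface in degrees.

    Illustrative simplification: x is taken as the rover's forward axis
    and y as its lateral axis (an assumption, not the paper's convention).
    """
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    a, b, _ = coeffs
    pitch = np.degrees(np.arctan(a))  # slope along the forward (x) axis
    roll = np.degrees(np.arctan(b))   # slope along the lateral (y) axis
    return pitch, roll

# Synthetic tilted terrain patch: 5 degrees pitch, 2 degrees roll.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(200, 2))
z = np.tan(np.radians(5.0)) * xy[:, 0] + np.tan(np.radians(2.0)) * xy[:, 1]
pts = np.column_stack([xy, z])
pitch, roll = fit_plane_pitch_roll(pts)
```

On this noise-free synthetic patch, the fit recovers the 5-degree pitch and 2-degree roll exactly; on real reconstructed terrain, the accuracy of such an estimate is bounded by the reconstruction error, which is why the paper's bundle adjustment refinement matters.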

Type
Research Article
Copyright
© The Author(s), 2022. Published by Cambridge University Press

