
Accurate 3D Localization Using RGB-TOF Camera and IMU for Industrial Mobile Robots

Published online by Cambridge University Press:  22 February 2021

Majid Yekkehfallah
Affiliation: Department of Automation, Shanghai Jiao Tong University, Shanghai 200240, China; Key Laboratory of System Control and Information Processing, Ministry of Education of China, Shanghai 200240, China. E-mails: [email protected], [email protected], [email protected], [email protected]
Ming Yang*
Affiliation: Department of Automation, Shanghai Jiao Tong University, Shanghai 200240, China; Key Laboratory of System Control and Information Processing, Ministry of Education of China, Shanghai 200240, China
Zhiao Cai
Affiliation: Department of Automation, Shanghai Jiao Tong University, Shanghai 200240, China; Key Laboratory of System Control and Information Processing, Ministry of Education of China, Shanghai 200240, China
Liang Li
Affiliation: Department of Automation, Shanghai Jiao Tong University, Shanghai 200240, China
Chuanxiang Wang
Affiliation: Department of Automation, Shanghai Jiao Tong University, Shanghai 200240, China; Key Laboratory of System Control and Information Processing, Ministry of Education of China, Shanghai 200240, China
*Corresponding author. E-mail: [email protected]

Summary

Localization based on visual natural landmarks is a state-of-the-art localization method for automated vehicles; however, it degrades under fast motion and in low-texture environments, which can lead to failure. This paper proposes an approach that addresses these limitations with an extended Kalman filter (EKF)-based state estimation algorithm that fuses information from a low-cost MEMS Inertial Measurement Unit and a Time-of-Flight camera. We demonstrate our results in an indoor environment and show that the proposed approach requires no global reflective landmarks for localization and is fast, accurate, and easy to use with mobile robots.
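The fusion idea described above can be sketched with a minimal EKF loop: the IMU acceleration drives the prediction step, and the camera supplies position measurements in the update step. This is an illustrative sketch only, not the paper's implementation: it assumes a 1D constant-velocity state [position, velocity], a single acceleration input, and made-up noise values (`q`, `r`).

```python
import numpy as np

def ekf_predict(x, P, accel, dt, q=0.01):
    """Propagate the state with the IMU acceleration (process noise q)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    x = F @ x + B * accel
    P = F @ P @ F.T + q * np.eye(2)
    return x, P

def ekf_update(x, P, z, r=0.05):
    """Correct the state with a camera position fix z (measurement noise r)."""
    H = np.array([[1.0, 0.0]])              # camera observes position only
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + r                     # innovation covariance
    K = P @ H.T / S                         # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulated run: the robot accelerates at 1 m/s^2 for 5 s; the camera
# reports the true position each step, so the estimate should track it.
x, P = np.zeros(2), np.eye(2)
dt, t = 0.1, 0.0
for _ in range(50):
    t += dt
    x, P = ekf_predict(x, P, accel=1.0, dt=dt)
    x, P = ekf_update(x, P, z=0.5 * 1.0 * t**2)
print(round(x[0], 2))  # prints 12.5 (true position after 5 s)
```

In the paper's setting the state is a full 3D pose and the camera measurement comes from visual feature tracking rather than a direct position fix, but the predict/update structure is the same.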

Type
Article
Copyright
© The Author(s), 2021. Published by Cambridge University Press

