
Camera network-based visual servoing for aerial interceptor quadrotors

Published online by Cambridge University Press:  19 February 2025

Guojie Wang, Qingchen Liu*, Qichao Ma and Jiahu Qin

Affiliation: Department of Automation, University of Science and Technology of China, Hefei, China

*Corresponding author: Qingchen Liu; Email: [email protected]

Abstract

The problem of effectively tracking and intercepting small aircraft that intrude into no-fly zones is attracting increasing interest in the robotics community. Vision-based control has proven to be an effective solution to the target-tracking problem for unmanned aerial vehicles (UAVs). Owing to the limited field of view (FOV) of onboard vision sensors, existing works either assume that the target is always detectable during tracking or limit the flight speed of the UAV in practice. In this paper, inspired by the broad FOV of a camera network, we are the first to propose an eye-to-hand (i.e., fixed-camera) visual servoing scheme that tracks and intercepts aerial targets using UAVs and ground visual sensors. Specifically, using rotation matrices, we first present a visual servoing equation that converts UAV motion in the image planes to the inertial frame. Then, an image-based visual servoing controller is designed directly from the image errors of the camera nodes in the sensor network, and system stability is proved by means of Lyapunov analysis. Additionally, to achieve the desired translational velocity command, a low-level attitude controller is developed based on the UAV dynamics. Finally, a series of experiments in both simulated and real flight scenarios demonstrates the efficacy of our method.
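The paper's controller is not reproduced on this page, but the classical image-based visual servoing law that such schemes build on can be sketched as follows. This is a generic, illustrative sketch only (the function names, the gain value, and the use of a simple pseudo-inverse are assumptions, not the authors' method): a camera velocity command is computed from normalized point-feature errors via the stacked interaction matrix, driving the image error exponentially to zero. The paper's eye-to-hand rotation-matrix conversion and low-level attitude loop are omitted.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction (image Jacobian) matrix of a normalized point
    feature (x, y) at depth Z, mapping camera velocity (v, omega)
    to the feature's image-plane velocity."""
    return np.array([
        [-1.0 / Z, 0.0,      x / Z, x * y,    -(1 + x**2), y],
        [0.0,     -1.0 / Z,  y / Z, 1 + y**2, -x * y,     -x],
    ])

def ibvs_velocity(features, desired, depths, gain=0.5):
    """Camera velocity command that exponentially decays the image
    error e = s - s*, using the least-squares inverse of the stacked
    interaction matrix: v = -gain * pinv(L) @ e."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    e = (np.asarray(features) - np.asarray(desired)).ravel()
    return -gain * np.linalg.pinv(L) @ e

# One feature slightly right of its desired position: the command
# is a 6-vector (3 translational, 3 angular velocity components).
v_cmd = ibvs_velocity([(0.1, 0.0)], [(0.0, 0.0)], [2.0])
```

In an eye-to-hand configuration such as the one proposed here, the resulting velocity would additionally be rotated from each fixed camera's frame into the inertial frame before being sent to the UAV.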

Research Article

© The Author(s), 2025. Published by Cambridge University Press

