
Vision guidance of a fixed wing UAV using a single camera configuration

Published online by Cambridge University Press:  27 January 2016

P.-T. Iong*
Affiliation:
Jetpro Technology, Tainan City, Taiwan
S.-H. Chen
Affiliation:
Department of Aeronautics and Astronautics, National Cheng Kung University, Tainan City, Taiwan
Y. Yang
Affiliation:
Department of Aeronautics and Astronautics, National Cheng Kung University, Tainan City, Taiwan

Abstract

In this paper, a single-camera vision guidance system for a fixed-wing UAV is developed. The system searches for and identifies a target object of known colour and shape in images captured by an onboard camera. The HSV colour space and moment invariants are used to describe the colour and shape features of the target object. The position, area, and rotation angle of the target object in the image plane are extracted, and this information is processed by an Extended Kalman Filter to estimate the relative position and attitude of the UAV. Based on these estimated states, the vision guidance system automatically steers the UAV towards the target object using a proportional controller. A Senior Telemaster model aircraft fitted with an onboard camera and computer is used for flight testing; the target object is a white flag bearing a red cross. Flight simulation and flight test results are presented, showing that the vision guidance system can recognise the target object and guide the UAV effectively.
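The moment invariants the abstract refers to are Hu's seven invariants, which describe a shape independently of its translation, scale, and rotation in the image plane. As a minimal sketch of that descriptor (the paper's own implementation details are not given here, and the toy cross shape below is purely illustrative), the invariants can be computed from normalised central moments:

```python
import numpy as np

def hu_moments(img):
    """Hu's seven moment invariants of a 2-D binary/grayscale image (Hu, 1962)."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]].astype(float)
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00

    def mu(p, q):  # central moment: translation invariant
        return ((x - xc) ** p * (y - yc) ** q * img).sum()

    def eta(p, q):  # normalised central moment: adds scale invariance
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    return np.array([
        n20 + n02,
        (n20 - n02) ** 2 + 4 * n11 ** 2,
        (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2,
        (n30 + n12) ** 2 + (n21 + n03) ** 2,
        (n30 - 3 * n12) * (n30 + n12) * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
            + (3 * n21 - n03) * (n21 + n03) * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2),
        (n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
            + 4 * n11 * (n30 + n12) * (n21 + n03),
        (3 * n21 - n03) * (n30 + n12) * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
            - (n30 - 3 * n12) * (n21 + n03) * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2),
    ])

# Toy cross shape (echoing the paper's red-cross target) and a translated copy:
a = np.zeros((20, 20)); a[8:12, 4:16] = 1; a[4:16, 8:12] = 1
b = np.zeros((20, 20)); b[10:14, 6:18] = 1; b[6:18, 10:14] = 1  # shifted by (2, 2)
assert np.allclose(hu_moments(a), hu_moments(b))  # invariants unchanged by translation
```

Because the invariants depend only on central moments normalised by area, matching them against stored values for the target shape lets the detector recognise the cross regardless of where it appears in the frame or how large it is, which is what makes the descriptor suitable for an approaching UAV.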

Type
Research Article
Copyright
Copyright © Royal Aeronautical Society 2013 

