
A method for stereo-vision-based tracking for robotic applications

Published online by Cambridge University Press:  09 June 2009

Pubudu N. Pathirana*
Affiliation:
School of Engineering and IT, Deakin University, Australia
Adrian N. Bishop
Affiliation:
School of Engineering and IT, Deakin University, Australia
Andrey V. Savkin
Affiliation:
School of Electrical Engineering and Telecommunications, University of New South Wales, Australia
Samitha W. Ekanayake
Affiliation:
School of Engineering and IT, Deakin University, Australia
Timothy J. Black
Affiliation:
School of Engineering and IT, Deakin University, Australia
*Corresponding author. E-mail: [email protected]

Summary

Vision-based tracking of an object using perspective projection inherently results in non-linear measurement equations in the Cartesian coordinates. The underlying object kinematics can be modelled by a linear system. In this paper we introduce a measurement conversion technique that analytically transforms the non-linear measurement equations obtained from a stereo-vision system into a system of linear measurement equations. We then design a robust linear filter around the converted measurement system. The state estimation error of the proposed filter is bounded and we provide a rigorous theoretical analysis of this result. The performance of the robust filter developed in this paper is demonstrated via computer simulation and via practical experimentation using a robotic manipulator as a target. The proposed filter is shown to outperform the extended Kalman filter (EKF).
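The core idea of the measurement conversion can be illustrated with a standard stereo pinhole model (this is a hedged sketch of the general technique, not the paper's exact derivation or its robust filter). Each perspective measurement u = f·x/z is nonlinear in the Cartesian state, but multiplying through by z yields equations that are linear in (x, y, z) with the measured pixel coordinates as coefficients. The focal length f, baseline b, and the helper function names below are illustrative assumptions.

```python
import numpy as np

# Illustrative camera parameters (assumed, not from the paper):
# focal length f in pixels, stereo baseline b in metres.
f, b = 500.0, 0.1

def stereo_measurements(p):
    """Nonlinear perspective projection of point p = (x, y, z) onto
    a rectified stereo pair: left pixel (u_l, v_l), right pixel u_r."""
    x, y, z = p
    return f * x / z, f * y / z, f * (x - b) / z

def converted_linear_estimate(u_l, v_l, u_r):
    """Measurement conversion: rearrange the projections into equations
    linear in (x, y, z):
        u_l*z - f*x = 0,   v_l*z - f*y = 0,   u_r*z - f*x = -f*b
    and solve the resulting linear system."""
    A = np.array([[-f, 0.0, u_l],
                  [0.0, -f, v_l],
                  [-f, 0.0, u_r]])
    c = np.array([0.0, 0.0, -f * b])
    return np.linalg.lstsq(A, c, rcond=None)[0]

p_true = np.array([0.2, -0.1, 2.0])
p_est = converted_linear_estimate(*stereo_measurements(p_true))
print(np.allclose(p_est, p_true))  # exact recovery with noise-free pixels
```

With noisy pixel measurements the converted system is no longer exact, which is where a filter over the linear measurement model (such as the robust filter proposed in the paper) takes over from this one-shot least-squares solve.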

Article

Copyright © Cambridge University Press 2009

