
Faulty robot rescue by multi-robot cooperation

Published online by Cambridge University Press:  29 May 2013

Gyuho Eoh*
Affiliation:
Department of Electrical Engineering, Seoul National University (ASRI), Seoul, Republic of Korea
Jeong S. Choi
Affiliation:
Department of Electrical Engineering, Seoul National University (ASRI), Seoul, Republic of Korea
Beom H. Lee
Affiliation:
Department of Electrical Engineering, Seoul National University (ASRI), Seoul, Republic of Korea
*Corresponding author. E-mail: [email protected]

Summary

This paper presents a multi-agent behavior for cooperatively rescuing a faulty robot using a sound signal. In a robot team, a faulty robot should be recalled immediately, since it may seriously obstruct the other robots and any materials it has collected may be lost. For the rescue mission, we first developed a sound localization method that estimates the position of the sound source on the faulty robot using multiple microphone sensors. Next, because a single robot cannot recall the faulty robot on its own, the robots organized themselves into a heterogeneous rescue team consisting of a pusher, a puller, and a supervisor. This self-organized team succeeded in moving the faulty robot to a safe zone without help from any global positioning system. Finally, our results demonstrate that a faulty robot in a multi-agent team can be rescued immediately through the cooperation of its neighboring robots and interactive communication between the faulty robot and the rescue robots. Experiments are presented to test the validity and practicality of the proposed approach.
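
The full localization method is described in the article itself; as an illustration only, the sketch below shows one common way to estimate the bearing of a sound source from two microphones, namely cross-correlation-based time difference of arrival (TDOA). The function name estimate_bearing, the 0.2 m microphone spacing, and the synthetic test signal are assumptions made for this example, not the authors' implementation.

import numpy as np

def estimate_bearing(mic_left, mic_right, mic_distance, fs, speed_of_sound=343.0):
    """Estimate the source bearing (radians) from two synchronously sampled
    microphone signals via the time difference of arrival (TDOA).

    mic_left, mic_right : 1-D arrays of audio samples.
    mic_distance        : spacing between the two microphones in metres.
    fs                  : sampling rate in Hz.
    A positive angle means the source lies toward the right microphone.
    """
    # Cross-correlate the channels; the lag of the correlation peak gives
    # the arrival-time difference (t_left - t_right) in samples.
    corr = np.correlate(mic_left, mic_right, mode="full")
    lag = np.argmax(corr) - (len(mic_right) - 1)
    tdoa = lag / fs  # seconds

    # Far-field assumption: the extra path length is d * sin(theta),
    # so theta = arcsin(c * tdoa / d), with the ratio clipped to [-1, 1].
    ratio = np.clip(speed_of_sound * tdoa / mic_distance, -1.0, 1.0)
    return np.arcsin(ratio)

if __name__ == "__main__":
    # Synthetic check: a 1 kHz tone delayed by 3 samples on the right channel,
    # i.e. the source sits slightly toward the left microphone.
    fs = 44100
    t = np.arange(0, 0.05, 1 / fs)
    tone = np.sin(2 * np.pi * 1000 * t)
    left = tone
    right = np.roll(tone, 3)  # right microphone hears the tone later
    print(np.degrees(estimate_bearing(left, right, 0.2, fs)))  # small negative angle

In practice a robot would repeat such an estimate over several microphone pairs (or while moving) to resolve front-back ambiguity and refine the source position; the abstract does not specify which refinement the authors use.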

Type
Articles
Copyright
Copyright © Cambridge University Press 2013 

