Published online by Cambridge University Press: 30 September 2008
People detection and tracking are essential capabilities for achieving natural human–robot interaction. Much of the research in this area has focused on monocular techniques; however, stereo vision is currently attracting considerable interest for these purposes. This paper presents a multi-agent system that implements a basic set of perceptual-motor skills, providing mobile robots with primitive interaction capabilities. The designed skills use stereo vision and ultrasound information to enable mobile robots to (i) detect a user who wishes to interact with the robot, (ii) keep track of that user as they move through the environment, without confusing them with other people, and (iii) follow the user through the environment while avoiding obstacles along the way. The presented system has been evaluated in several real-life experiments, achieving good results and real-time performance on modest computers.
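To make the combination of stereo and ultrasound information in skill (iii) more concrete, the sketch below shows a minimal reactive follow-the-user loop under stated assumptions: the user's range and bearing are assumed to come from a stereo-based person tracker, and a ring of ultrasound range readings is used to slow down or steer away from nearby obstacles. All names and parameters here (`follow_user`, `VelocityCommand`, thresholds, gains) are hypothetical illustrations, not the paper's actual implementation.

```python
# Minimal sketch of a reactive "follow the user" skill (hypothetical names,
# not the paper's implementation): a stereo-based tracker is assumed to give
# the user's position relative to the robot, and ultrasound readings trigger
# simple obstacle avoidance.
import math
from dataclasses import dataclass


@dataclass
class VelocityCommand:
    linear: float   # forward speed in m/s
    angular: float  # turn rate in rad/s


def follow_user(user_range_m, user_bearing_rad, sonar_ranges_m,
                target_distance=1.0, max_linear=0.4, max_angular=0.8,
                obstacle_threshold=0.5):
    """Keep the robot roughly target_distance behind the tracked user,
    while reacting to ultrasound obstacle readings."""
    # Proportional control on distance and bearing to the tracked user.
    linear = max(-max_linear,
                 min(max_linear, 0.5 * (user_range_m - target_distance)))
    angular = max(-max_angular, min(max_angular, 1.5 * user_bearing_rad))

    # Reactive avoidance: if any sonar reading is closer than the threshold,
    # stop forward motion and turn toward the side with more free space.
    closest = min(sonar_ranges_m, default=float("inf"))
    if closest < obstacle_threshold:
        linear = 0.0
        half = len(sonar_ranges_m) // 2
        left_space = sum(sonar_ranges_m[:half])
        right_space = sum(sonar_ranges_m[half:])
        angular = max_angular if left_space > right_space else -max_angular

    return VelocityCommand(linear, angular)


# Example: user detected 2.1 m away, 10 degrees to the left, no close obstacles.
cmd = follow_user(2.1, math.radians(10), [1.8, 2.5, 3.0, 2.7, 1.9, 2.2])
print(cmd)
```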