
Mobile robot navigation with a self-paced brain–computer interface based on high-frequency SSVEP

Published online by Cambridge University Press: 27 November 2013

Pablo F. Diez*
Affiliation:
Gabinete de Tecnología Médica (GATEME), Facultad de Ingeniería, Universidad Nacional de San Juan (UNSJ), San Juan, Argentina
Instituto de Automática (INAUT), Facultad de Ingeniería, Universidad Nacional de San Juan (UNSJ), San Juan, Argentina
Vicente A. Mut
Affiliation:
Instituto de Automática (INAUT), Facultad de Ingeniería, Universidad Nacional de San Juan (UNSJ), San Juan, Argentina
Eric Laciar
Affiliation:
Gabinete de Tecnología Médica (GATEME), Facultad de Ingeniería, Universidad Nacional de San Juan (UNSJ), San Juan, Argentina
Enrique M. Avila Perona
Affiliation:
Instituto de Automática (INAUT), Facultad de Ingeniería, Universidad Nacional de San Juan (UNSJ), San Juan, Argentina
*Corresponding author. E-mail: [email protected]

Summary

A brain–computer interface (BCI) is a system for commanding a device by means of brain signals, without the need to move any muscle. One kind of BCI is based on steady-state visual evoked potentials (SSVEP), which are visual cortex responses elicited by a flickering light source. Such stimuli can produce visual fatigue; however, it is well established that high-frequency SSVEP (>30 Hz) does not. In this paper, a mobile robot is remotely navigated through an office environment by means of an asynchronous high-frequency SSVEP-based BCI, together with the image from a video camera. The BCI uses only three electroencephalographic channels and a simple signal-processing method. The robot velocity control and obstacle avoidance algorithms are also described. Seven volunteers were able to drive the mobile robot to two different destinations; they had to evade desks and shelves, pass through a doorway and navigate along a corridor. The system was designed to allow the subject to move the robot without restrictions, since he/she had full control of the robot's movements. It was concluded that the developed system allows remote navigation of a mobile robot in real indoor environments using brain signals. The proposed system is easy to use and does not require any special training. The user's visual fatigue is reduced because high-frequency stimulation is employed and, furthermore, the user gazes at the stimulus only when a command must be sent to the robot.
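
As an illustration of how a "simple signal-processing method" for high-frequency SSVEP detection could look, the sketch below scores the spectral power of a short EEG window at each candidate flicker frequency and issues a command only when one frequency clearly dominates; otherwise it stays idle, which is what makes the interface self-paced. It is a minimal sketch under stated assumptions, not the authors' implementation: the sampling rate, the 37–40 Hz stimulation frequencies, the command mapping, the window length and the threshold are all chosen here only for the example.

```python
# Illustrative sketch only: a threshold-based detector for high-frequency SSVEP
# commands over a sliding EEG window. All numeric values below are assumptions
# for the example, not the parameters reported in the paper.
import numpy as np
from scipy.signal import welch

FS = 256.0                                        # assumed sampling rate (Hz)
STIM_FREQS = [37.0, 38.0, 39.0, 40.0]             # assumed flicker frequencies (>30 Hz)
COMMANDS = ["forward", "left", "right", "stop"]   # hypothetical command mapping
POWER_RATIO_THRESHOLD = 3.0                       # assumed detection threshold

def band_power(freqs, psd, f0, half_width=0.5):
    """Mean PSD in a narrow band centred on f0."""
    band = (freqs >= f0 - half_width) & (freqs <= f0 + half_width)
    return psd[band].mean()

def detect_command(eeg_window):
    """eeg_window: (n_channels, n_samples) array from occipital channels.

    Returns a command string, or None when no stimulus is detected,
    keeping the interface self-paced (idle unless the user gazes at a
    flickering target).
    """
    # Average the channels and estimate the power spectral density.
    signal = eeg_window.mean(axis=0)
    freqs, psd = welch(signal, fs=FS, nperseg=len(signal))

    # Broadband high-frequency baseline used as the reference level.
    baseline = psd[(freqs > 30.0) & (freqs < 45.0)].mean()

    # Score each stimulation frequency against that baseline.
    scores = [band_power(freqs, psd, f0) / baseline for f0 in STIM_FREQS]
    best = int(np.argmax(scores))
    if scores[best] < POWER_RATIO_THRESHOLD:
        return None                               # no intentional command
    return COMMANDS[best]

# Example call: a 2-second, 3-channel window of (here, random) EEG samples.
# cmd = detect_command(np.random.randn(3, int(2 * FS)))
```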

Type
Articles
Copyright
Copyright © Cambridge University Press 2013 
