JRM Vol.32 No.4 pp. 761-767
doi: 10.20965/jrm.2020.p0761

Development Report:

Indirect Control of an Autonomous Wheelchair Using SSVEP BCI

Danny Wee-Kiat Ng and Sing Yau Goh

Lee Kong Chian Faculty of Engineering and Science, Universiti Tunku Abdul Rahman
Jalan Sungai Long, Bandar Sungai Long, Kajang, Selangor 43000, Malaysia

Received: February 20, 2020
Accepted: June 12, 2020
Published: August 20, 2020

Keywords: brain-computer interface, wheelchair, autonomous

The capability to control a wheelchair using brain signals would be a major benefit to patients suffering from motor-disabling diseases. However, one major challenge facing such systems is the amount of input required from the patient over time: a navigation control scheme driven continuously by the user imposes a significant mental burden. The objective of this study is to develop a brain-computer interface (BCI) system that requires a low number of inputs from a subject to operate. We propose an autonomous wheelchair that uses a steady-state visual evoked potential (SSVEP)-based BCI to achieve this objective. A dual-mode system was implemented in this study to allow the autonomous wheelchair to work in both unknown and known environments. The Robot Operating System (ROS) is used as the middleware for the development of the algorithms that operate the wheelchair. The mental workload on the subject using this wheelchair is reduced by relegating responsibility for navigation control from the subject to the navigation software.
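The indirect-control idea above can be sketched in a few lines: the subject makes a single SSVEP selection, and that selection is translated into a navigation goal that the autonomous planner then drives to, with no further input from the user. The destination names, coordinates, stimulus frequencies, and the simple sine/cosine correlation detector below are all illustrative assumptions, not details from the paper (real SSVEP systems typically use multi-channel CCA-style detection, and the goal would be handed to a ROS navigation stack).

```python
import math

# Hypothetical "known environment" destinations: each SSVEP stimulus
# frequency (Hz) maps to a named navigation goal (x, y in metres).
GOALS = {
    7.0:  ("bed",     (1.0, 2.0)),
    9.0:  ("desk",    (3.5, 0.5)),
    11.0: ("doorway", (5.0, 4.0)),
}

def classify_ssvep(samples, fs, freqs):
    """Pick the stimulus frequency whose sine/cosine reference pair
    correlates best with a single-channel EEG segment — a crude
    stand-in for the CCA-style detectors used in SSVEP BCIs."""
    n = len(samples)
    best_f, best_p = None, -1.0
    for f in freqs:
        s = sum(x * math.sin(2 * math.pi * f * i / fs)
                for i, x in enumerate(samples))
        c = sum(x * math.cos(2 * math.pi * f * i / fs)
                for i, x in enumerate(samples))
        p = (s * s + c * c) / n  # power at frequency f
        if p > best_p:
            best_f, best_p = f, p
    return best_f

def select_goal(samples, fs):
    """One BCI selection -> one navigation goal; all steering is then
    left to the navigation software, not the subject."""
    f = classify_ssvep(samples, fs, sorted(GOALS))
    return GOALS[f]

# Simulated 2-second EEG segment dominated by a 9 Hz flicker response.
fs = 250
segment = [math.sin(2 * math.pi * 9.0 * i / fs) for i in range(2 * fs)]
name, (x, y) = select_goal(segment, fs)
```

In a ROS implementation, the `(x, y)` pair would be published as a goal to the navigation stack, which plans and executes the path — this is the step that removes the continuous control burden from the subject.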

SSVEP BCI autonomous wheelchair

Cite this article as:
D. Ng and S. Goh, “Indirect Control of an Autonomous Wheelchair Using SSVEP BCI,” J. Robot. Mechatron., Vol.32 No.4, pp. 761-767, 2020.
References:
  [1] J. R. Wolpaw, N. Birbaumer, D. J. McFarland, G. Pfurtscheller, and T. M. Vaughan, “Brain-computer interfaces for communication and control,” Clinical Neurophysiology, Vol.113, No.6, pp. 767-791, 2002.
  [2] T. Ito, S. Ushii, T. Sameshima, Y. Mitsui, S. Ohgi, and C. Mizuike, “Design of brain-machine interface using near-infrared spectroscopy,” J. Robot. Mechatron., Vol.25, No.6, pp. 1000-1010, 2013.
  [3] H. Touyama and M. Sakuda, “Online control of a virtual object with collaborative SSVEP,” J. Adv. Comput. Intell. Intell. Inform., Vol.21, No.7, pp. 1291-1297, 2017.
  [4] Y. Yu, Z. Zhou, J. Jiang, E. Yin, K. Liu, J. Wang, Y. Liu, and D. Hu, “Toward a hybrid BCI: Self-paced operation of a P300-based speller by merging a motor imagery-based “brain switch” into a P300 spelling approach,” Int. J. of Human-Computer Interaction, Vol.33, No.8, pp. 623-632, 2017.
  [5] J. R. Wolpaw, R. S. Bedlack, D. J. Reda, R. J. Ringer, P. G. Banks, T. M. Vaughan, S. M. Heckman, L. M. McCane, C. S. Carmack, S. Winden et al., “Independent home use of a brain-computer interface by people with amyotrophic lateral sclerosis,” Neurology, Vol.91, No.3, pp. e258-e267, 2018.
  [6] R. Leeb, L. Tonin, M. Rohm, L. Desideri, T. Carlson, and J. d. R. Millán, “Towards independence: a BCI telepresence robot for people with severe motor disabilities,” Proc. of the IEEE, Vol.103, No.6, pp. 969-982, 2015.
  [7] Z. Zhang, W. Wang, P. Song, S. Sheng, L. Xie, F. Duan, Y. GuanSoo, and M. Odagaki, “Design of an SSVEP-based BCI system with vision assisted navigation module for the cooperative control of multiple robots,” 2017 IEEE 7th Annual Int. Conf. on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), pp. 558-563, 2017.
  [8] W. Deng, I. Papavasileiou, Z. Qiao, W. Zhang, K.-Y. Lam, and H. Song, “Advances in automation technologies for lower-extremity neurorehabilitation: A review and future challenges,” IEEE Reviews in Biomedical Engineering, Vol.11, pp. 289-305, 2018.
  [9] M. A. Bockbrader, G. Francisco, R. Lee, J. Olson, R. Solinsky, and M. L. Boninger, “Brain computer interfaces in rehabilitation medicine,” PM&R, Vol.10, No.9, pp. S233-S243, 2018.
  [10] A. Kübler, S. Winter, A. C. Ludolph, M. Hautzinger, and N. Birbaumer, “Severity of depressive symptoms and quality of life in patients with amyotrophic lateral sclerosis,” Neurorehabilitation and Neural Repair, Vol.19, No.3, pp. 182-193, 2005.
  [11] A. Fernández-Rodríguez, F. Velasco-Álvarez, and R. Ron-Angevin, “Review of real brain-controlled wheelchairs,” J. of Neural Engineering, Vol.13, No.6, 061001, 2016.
  [12] H. Wang and A. Bezerianos, “Brain-controlled wheelchair controlled by sustained and brief motor imagery BCIs,” Electronics Letters, Vol.53, No.17, pp. 1178-1180, 2017.
  [13] S. He, R. Zhang, Q. Wang, Y. Chen, T. Yang, Z. Feng, Y. Zhang, M. Shao, and Y. Li, “A P300-based threshold-free brain switch and its application in wheelchair control,” IEEE Trans. on Neural Systems and Rehabilitation Engineering, Vol.25, No.6, pp. 715-725, 2017.
  [14] S. M. T. Müller, T. F. Bastos, and M. S. Filho, “Proposal of a SSVEP-BCI to command a robotic wheelchair,” J. of Control, Automation and Electrical Systems, Vol.24, Nos.1-2, pp. 97-105, 2013.
  [15] M. Wang, I. Daly, B. Z. Allison, J. Jin, Y. Zhang, L. Chen, and X. Wang, “A new hybrid BCI paradigm based on P300 and SSVEP,” J. of Neuroscience Methods, Vol.244, pp. 16-25, 2015.
  [16] Z. Li, Y. Xiong, and L. Zhou, “ROS-based indoor autonomous exploration and navigation wheelchair,” Proc. 10th Int. Symp. Computational Intelligence and Design (ISCID), Vol.2, pp. 132-135, 2017.
  [17] T. Tsubouchi, “Introduction to simultaneous localization and mapping,” J. Robot. Mechatron., Vol.31, No.3, pp. 367-374, 2019.
  [18] S. Ohkawa, Y. Takita, H. Date, and K. Kobayashi, “Development of autonomous mobile robot using articulated steering vehicle and lateral guiding method,” J. Robot. Mechatron., Vol.27, No.4, pp. 337-345, 2015.
  [19] D. W. Ng, Y. Soh, and S. Goh, “Development of an autonomous BCI wheelchair,” Proc. IEEE Symp. Computational Intelligence in Brain Computer Interfaces (CIBCI), pp. 1-4, 2014.
  [20] J. Xie, G. Xu, J. Wang, M. Li, C. Han, and Y. Jia, “Effects of mental load and fatigue on steady-state evoked potential based brain computer interface tasks: a comparison of periodic flickering and motion-reversal based visual attention,” PLoS One, Vol.11, No.9, e0163426, 2016.
  [21] S.-P. Seo, M.-H. Lee, J. Williamson, and S.-W. Lee, “Changes in fatigue and EEG amplitude during a long-time use of brain-computer interface,” 2019 7th Int. Winter Conf. on Brain-Computer Interface (BCI), pp. 1-3, 2019.
  [22] T. Tsumugiwa, Y. Takeuchi, and R. Yokogawa, “Maneuverability of impedance-controlled motion in a human-robot cooperative task system,” J. Robot. Mechatron., Vol.29, No.4, pp. 746-756, 2017.
  [23] Texas Instruments, “ADS1299-x Low-Noise, 4-, 6-, 8-Channel, 24-Bit, Analog-to-Digital Converter for EEG and Biopotential Measurements,” January 2017.
  [24] M. Cheng, X. Gao, S. Gao, and D. Xu, “Design and implementation of a brain-computer interface with high transfer rates,” IEEE Trans. on Biomedical Engineering, Vol.49, No.10, pp. 1181-1186, 2002.
  [25] W. Hess, D. Kohler, H. Rapp, and D. Andor, “Real-time loop closure in 2D LIDAR SLAM,” 2016 IEEE Int. Conf. on Robotics and Automation (ICRA), pp. 1271-1278, 2016.
  [26] C. Rösmann, F. Hoffmann, and T. Bertram, “Integrated online trajectory planning and optimization in distinctive topologies,” Robotics and Autonomous Systems, Vol.88, pp. 142-153, 2017.
