
JRM Vol.23 No.1 pp. 66-74 (2011)
doi: 10.20965/jrm.2011.p0066

Paper:

A Prototype of Electric Wheelchair Controlled by Eye-Only for Paralyzed User

Kohei Arai* and Ronny Mardiyanto*,**

*Saga University, 1 Honjo, Saga 840-8502, Japan

**Institut Teknologi Sepuluh Nopember, Keputih, Sukolilo, Surabaya, Indonesia

Received: March 22, 2010
Accepted: June 25, 2010
Published: February 20, 2011
Keywords: wheelchair, eye gaze, paralysis, computer input by eye-only, hand-free controller
Abstract
The number of persons who are paralyzed, and therefore dependent on others due to loss of self-mobility, is growing as the population ages. We have developed a wheelchair prototype controlled exclusively by eye, usable by different users, and robust against vibration, illumination change, and user movement. The keys to this flexibility are the camera mounted on the user’s glasses and the use of pupil detection; image processing then analyzes the user’s gaze for wheelchair control. Comparison with other pupil detection methods showed that ours is superior, and comparison of our camera placement with other systems showed that it suppresses the influence of vibration almost completely. The influence of illumination change was also evaluated. Experiments in which five different users rode the wheelchair along a 9.73-meter track recorded an average travel time of 85.8 seconds, demonstrating the feasibility and reliability of our proposal: computer input by eye-only for paralyzed users, applied to wheelchair control.
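The page carries no source code, so the following is only a rough illustration of the kind of pipeline the abstract describes: detecting the pupil as the darkest compact blob in a close-up eye image from the glasses-mounted camera, then mapping its displacement from a calibrated center to a motion command. This is a minimal Python/OpenCV sketch; the thresholds, dead zone, calibration center, and command names are all illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the authors' code, OpenCV 4.x assumed): pupil detection
# by dark-blob thresholding, plus a simple gaze-to-command mapping.
import cv2

def detect_pupil(eye_gray):
    """Return (x, y) of the pupil center, or None if no candidate is found."""
    # Smooth, then keep only the darkest pixels; in a close-up eye image
    # the pupil is the darkest compact region.
    blurred = cv2.GaussianBlur(eye_gray, (7, 7), 0)
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Take the centroid of the largest dark blob as the pupil center.
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def gaze_to_command(pupil, center, dead_zone=12):
    """Map pupil displacement from a calibrated center to a command string."""
    dx = pupil[0] - center[0]
    dy = pupil[1] - center[1]
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return "stop"            # looking straight ahead: no motion
    if abs(dx) > abs(dy):
        return "turn_right" if dx > 0 else "turn_left"
    return "forward" if dy < 0 else "stop"   # image y grows downward

if __name__ == "__main__":
    # Assumed setup: camera device 0 delivers a close-up eye image.
    cap = cv2.VideoCapture(0)
    center = (160, 120)          # assumed calibration point for 320x240 frames
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        pupil = detect_pupil(gray)
        command = gaze_to_command(pupil, center) if pupil else "stop"
        print(command)           # a real system would drive the wheelchair here
    cap.release()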
Cite this article as:
K. Arai and R. Mardiyanto, “A Prototype of Electric Wheelchair Controlled by Eye-Only for Paralyzed User,” J. Robot. Mechatron., Vol.23 No.1, pp. 66-74, 2011.
