JRM Vol.35 No.3 pp. 743-750
doi: 10.20965/jrm.2023.p0743


Development of a Gaze-Driven Electric Wheelchair with 360° Camera and Novel Gaze Interface

Junji Kawata*, Jiro Morimoto*, Yoshio Kaji**, Mineo Higuchi*, and Shoichiro Fujisawa*

*Faculty of Science and Engineering, Tokushima Bunri University
1314-1 Shido, Sanuki, Kagawa 769-2193, Japan

**Faculty of Human Life Sciences, Tokushima Bunri University
180 Nishihama, Yamashiro, Tokushima 770-8514, Japan

Received: December 10, 2022
Accepted: April 4, 2023
Published: June 20, 2023

Keywords: electric wheelchair, eye-tracking device, gaze-based user interface, 360° camera, robot operating system (ROS)

A novel gaze-based user interface (UI) is proposed for the remote control of robots and electric wheelchairs. A task-based experiment showed that this UI is more suitable for remote control than a conventional interface. The UI was applied to the control of a commercially available low-cost electric wheelchair. By using a 360° camera and an eye tracker that can be used outdoors, the visibility of obstacles and the usability of the gaze-driven electric wheelchair were greatly improved. The gaze-driven electric wheelchair exhibited good performance in a task-based evaluation experiment.
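As an illustrative sketch of how a gaze-based UI can drive a wheelchair (this is not the authors' implementation; the function name, dead-zone radius, and speed limits are assumptions), one minimal approach maps a normalized on-screen gaze point to a velocity command, with a central dead zone to mitigate the Midas-touch problem of unintended activation:

```python
import math

def gaze_to_velocity(gx, gy, dead_zone=0.15, v_max=0.6, w_max=0.8):
    """Map a normalized gaze point (gx, gy in [-1, 1], screen centre at
    the origin) to a (linear, angular) velocity command.

    All parameter values are illustrative assumptions, not the
    paper's calibration.
    """
    # Gazing near the screen centre stops the wheelchair: a central
    # dead zone is one common remedy for the Midas-touch problem.
    if math.hypot(gx, gy) < dead_zone:
        return (0.0, 0.0)
    # Looking toward the top of the screen drives forward; looking
    # left or right turns. Commands saturate at v_max / w_max.
    v = v_max * max(0.0, min(1.0, gy))
    w = -w_max * max(-1.0, min(1.0, gx))
    return (v, w)
```

In a ROS-based system such as the one described, each (v, w) pair would typically be published as a `geometry_msgs/Twist` message on the wheelchair's velocity-command topic.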

A gaze-driven electric wheelchair


Cite this article as:
J. Kawata, J. Morimoto, Y. Kaji, M. Higuchi, and S. Fujisawa, “Development of a Gaze-Driven Electric Wheelchair with 360° Camera and Novel Gaze Interface,” J. Robot. Mechatron., Vol.35 No.3, pp. 743-750, 2023.
References:
  [1] K. Arai and R. Mardiyanto, “A Prototype of Electric Wheelchair Controlled by Eye-Only for Paralyzed User,” J. Robot. Mechatron., Vol.23, No.1, pp. 66-74, 2011.
  [2] M. A. Eid, N. Giakoumidis, and A. E. Saddik, “A Novel Eye-Gaze-Controlled Wheelchair System for Navigating Unknown Environments: Case Study With a Person With ALS,” IEEE Access, Vol.4, pp. 558-573, 2016.
  [3] Y. K. Meena, H. Cecotti, K. Wong-Lin, and G. Prasad, “A multimodal interface to resolve the Midas-touch problem in gaze controlled wheelchair,” 39th Annual Int. Conf. of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 905-908, 2017.
  [4] C. Singer and B. Hartmann, “See-thru: Towards minimally obstructive eye-controlled wheelchair interfaces,” Proc. of the 21st Int. ACM SIGACCESS Conf. on Computers and Accessibility (ASSETS 2019), pp. 459-469, 2019.
  [5] S. Mahendran, N. Songur, D. Adjei, P. Orlov, and A. A. Faisal, “A.Eye Drive: Gaze-based semi-autonomous wheelchair interface,” 41st Annual Int. Conf. of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 5967-5970, 2019.
  [6] L. Maule, A. Luchetti, M. Zanetti, P. Tomasin, M. Pertile, M. Tavernini, G. M. A. Guandalini, and M. D. Cecco, “RoboEYE, an efficient, reliable and safe semi-autonomous gaze driven wheelchair for domestic use,” Technologies, Vol.9, No.1, Article No.16, 2021.
  [7] M. S. H. Sunny, M. I. I. Zarif, I. Rulik, J. Sanjuan, M. H. Rahman, S. I. Ahamed, I. Wang, K. Schultz, and B. Brahmi, “Eye-gaze control of a wheelchair mounted 6DOF assistive robot for activities of daily living,” J. of NeuroEngineering and Rehabilitation, Vol.18, Article No.173, 2021.
  [8] H. O. Latif, N. Sherkat, and A. Lotfi, “Teleoperation through eye gaze (TeleGaze): A multimodal approach,” 2009 IEEE Int. Conf. on Robotics and Biomimetics (ROBIO), pp. 711-716, 2009.
  [9] J. Gomes, F. Marques, A. Lourenço, R. Mendonça, P. Santana, and J. Barata, “Gaze-directed telemetry in high latency wireless communications: the case of robot teleoperation,” 42nd Annual Conf. of the IEEE Industrial Electronics Society (IECON 2016), pp. 704-709, 2016.
  [10] S. Dziemian, W. W. Abbott, and A. A. Faisal, “Gaze-based teleprosthetic enables intuitive continuous control of complex robot arm use: Writing & drawing,” IEEE Int. Conf. on Biomedical Robotics and Biomechatronics (BioRob), pp. 1277-1282, 2016.
  [11] C. Carreto, D. Gêgo, and L. Figueiredo, “An eye-gaze tracking system for teleoperation of a mobile robot,” J. of Information Systems Engineering and Management, Vol.3, No.2, Article No.16, 2018.
  [12] I. Poy, L. Wu, and B. E. Shi, “A Multimodal Direct Gaze Interface for Wheelchairs and Teleoperated Robots,” 43rd Annual Int. Conf. of the IEEE Engineering in Medicine and Biology Society (EMBC), pp. 4796-4800, 2021.
  [13] A. Kogawa, M. Onda, and Y. Kai, “Development of a Remote-Controlled Drone System by Using Only Eye Movements: Design of a Control Screen Considering Operability and Microsaccades,” J. Robot. Mechatron., Vol.33, No.2, pp. 301-312, 2021.
  [14] M. Nakazawa, K. Takahashi, and T. Abe, “Implementation and its evaluation of autonomous moving wheelchair system using EEG,” IPSJ SIG Technical Reports, Vol.2015-DPS-163, No.14, 2015 (in Japanese).
  [15] C. Ishii and R. Konishi, “A Control of Electric Wheelchair Using an EMG Based on Degree of Muscular Activity,” Euromicro Conf. on Digital System Design (DSD), pp. 567-574, 2016.
  [16] J. Kawata et al., “An improvement of eye-tracking interface for robotic systems,” SICE SI2019, pp. 783-784, 2019 (in Japanese).
  [17] J. Kawata et al., “Gaze control of powered wheelchair,” IEEJ Annual Conf. on Electronics, Information and Systems (IEEJ EIS 2021), OS5-2, 2021 (in Japanese).
  [18] J. Kawata et al., “Gaze control of powered wheelchair with equirectangular images,” IEEJ Annual Conf. on Electronics, Information and Systems (IEEJ EIS 2022), TC6-3, 2022 (in Japanese).
  [19] I. Fatt and B. A. Weissman, “Physiology of the Eye: An Introduction to the Vegetative Functions,” Butterworth-Heinemann, 2013.
