
JRM Vol.30 No.1, pp. 86-92 (2018)
doi: 10.20965/jrm.2018.p0086

Paper:

Structure and Examination of the Guidance Robot LIGHBOT for Visually Impaired and Elderly People

Kazuteru Tobita, Katsuyuki Sagayama, Mayuko Mori, and Ayako Tabuchi

NSK Ltd.
1-5-50 Kugenuma-shinmei, Fujisawa-shi, Kanagawa 251-8501, Japan

Received: August 29, 2017
Accepted: October 30, 2017
Published: February 20, 2018

Keywords: guidance robot, visually impaired, navigation, semi-autonomous robot
Abstract

We developed a robot that can guide visually impaired and elderly people as they walk around large hospitals. A previous report described the structure of an earlier guidance robot and compared its use with that of a white cane; it showed that with the robot, participants could move more easily, safely, and confidently than with a white cane. However, to solve the problems encountered with that robot, a new guidance robot was fabricated. This paper describes the structure of the new robot and the results of a demonstration examination conducted at the Kanagawa Rehabilitation Center. The robot navigates to a destination set on a touch panel, while its velocity depends on the force the user exerts on the robot. The questionnaires answered by the participants were evaluated using the System Usability Scale, which showed that the robot falls in the "Acceptable" range and that its usability is high.
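The abstract states that the robot's velocity depends on the force the user exerts on it. A minimal sketch of such a force-to-velocity (admittance-style) mapping is shown below; the gain, speed limit, and deadband values are purely illustrative assumptions, not parameters from the paper.

```python
def guidance_velocity(handle_force, gain=0.5, v_max=1.0, deadband=2.0):
    """Map the force the user exerts on the robot [N] to a forward
    velocity command [m/s] with a simple admittance-style law.
    All numeric values are hypothetical, not taken from the paper."""
    if abs(handle_force) < deadband:
        return 0.0  # ignore sensor noise and light touches
    # Velocity grows linearly with force beyond the deadband,
    # saturating at v_max; direction follows the sign of the force.
    v = min(gain * (abs(handle_force) - deadband), v_max)
    return v if handle_force > 0 else -v

print(guidance_velocity(6.0))   # 0.5 * (6 - 2) = 2.0, clipped to 1.0
print(guidance_velocity(1.0))   # within deadband -> 0.0
```

Saturating the command keeps the walking speed bounded regardless of how hard the user pushes, which matters for elderly users with limited reaction time.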

Situation of the examination by the visually impaired
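The participant questionnaires were evaluated with the System Usability Scale (SUS). Brooke's standard scoring rule (ten 1-5 Likert items; odd items contribute response minus 1, even items 5 minus response; the sum is scaled by 2.5 to a 0-100 range) can be sketched as follows; the example responses are hypothetical, not data from the study.

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten
    1-5 Likert responses, using Brooke's standard scoring rule."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses):
        # Odd-numbered items (index 0, 2, ...) are positively worded:
        # they contribute (r - 1). Even-numbered items are negatively
        # worded and contribute (5 - r).
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# Hypothetical participant answers (not from the paper):
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 2]))  # 82.5
```

On the adjective scale of Bangor et al., scores above roughly 70 correspond to the "Acceptable" range the paper reports.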

Cite this article as:
K. Tobita, K. Sagayama, M. Mori, and A. Tabuchi, “Structure and Examination of the Guidance Robot LIGHBOT for Visually Impaired and Elderly People,” J. Robot. Mechatron., Vol.30 No.1, pp. 86-92, 2018.
