
JACIII Vol.16 No.2 pp. 305-312 (2012)
doi: 10.20965/jaciii.2012.p0305

Paper:

Design of Mutual Interaction Between a User and Smart Electric Wheelchair

Mihoko Niitsuma, Terumichi Ochi, Masahiro Yamaguchi, and Koki Iwamoto

Chuo University, 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551, Japan

Received: September 15, 2011
Accepted: November 15, 2011
Published: March 20, 2012
Keywords: smart wheelchair, human-robot interaction, human-robot cooperation, vibrotactile interface
Abstract
This paper presents the design of mutual interaction between a user and a smart electric wheelchair. We propose a personal mobility tool (PMT) that integrates autonomous mobile robot navigation technology with intuitive and cognitive interaction between the user and the smart wheelchair. An intuitive, noncontinuous input method is proposed that enables the user to specify the direction in which the wheelchair should go: using an acceleration sensor and pressure sensors, the user indicates a direction to the PMT, and the PMT then determines a goal on an environmental map based on that direction. An output interface helps the user interpret robot behavior through informative communication between the user and the PMT. As this output interface, a vibrotactile seat interface is presented.
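
As a rough illustration of the input pipeline the abstract describes, the sketch below maps a one-shot directional command, read from an acceleration sensor, to a goal cell on a 2-D environmental map, and pairs robot states with seat vibration patterns. This is a minimal sketch under our own assumptions: the function names, the grid-map representation, and the 2x2 seat vibrator layout are hypothetical and are not taken from the paper.

    import math

    # Hypothetical sketch (not the authors' implementation): turn a one-shot
    # directional input from an acceleration sensor into a goal on a 2-D grid
    # map, in the spirit of the noncontinuous input method in the abstract.

    def direction_from_accel(ax: float, ay: float) -> float:
        """Estimate the commanded heading (rad) from the lateral tilt components."""
        return math.atan2(ay, ax)

    def select_goal(x, y, heading, free_cells, step=0.5, max_range=5.0):
        """Cast a ray along the commanded heading and return the farthest free
        map cell reached, used here as the navigation goal (illustrative only)."""
        goal, r = None, step
        while r <= max_range:
            cell = (round(x + r * math.cos(heading)),
                    round(y + r * math.sin(heading)))
            if cell not in free_cells:
                break  # blocked cell: stop extending the ray
            goal = cell
            r += step
        return goal

    # Hypothetical vibrotactile cue table: which seat vibrators (row, col)
    # to drive for a given robot state; the 2x2 layout is our assumption.
    SEAT_PATTERNS = {
        "turning_left":   [(0, 0), (1, 0)],
        "turning_right":  [(0, 1), (1, 1)],
        "obstacle_ahead": [(0, 0), (0, 1), (1, 0), (1, 1)],
    }

    # Example: the user tilts slightly to the right on an all-free 11x11 map.
    free = {(i, j) for i in range(-5, 6) for j in range(-5, 6)}
    print(select_goal(0.0, 0.0, direction_from_accel(1.0, -0.2), free))

A ray-casting goal choice of this kind keeps the user's input noncontinuous in the sense the abstract intends: one tilt yields one goal on the map, and the autonomous navigation layer handles the motion to it.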
Cite this article as:
M. Niitsuma, T. Ochi, M. Yamaguchi, and K. Iwamoto, “Design of Mutual Interaction Between a User and Smart Electric Wheelchair,” J. Adv. Comput. Intell. Intell. Inform., Vol.16 No.2, pp. 305-312, 2012.
