JRM Vol.21 No.4 pp. 443-452
doi: 10.20965/jrm.2009.p0443


A Service System Adapted to Changing Environments Using “Kukanchi”

Yusuke Fukusato*, Eri Sato-Shimokawara*, Toru Yamaguchi*, and Makoto Mizukawa**

* Faculty of System Design, Tokyo Metropolitan University,

** Faculty of Engineering, Shibaura Institute of Technology

Received: December 26, 2008
Accepted: April 15, 2009
Published: August 20, 2009
Keywords: Kukanchi, environment information, robot, human-robot interaction
Robots are expected to coexist with people and provide them with suitable services. To provide services, a robot needs to recognize the environment, situation, context, and so on. In addition, robots observe people’s movements in daily life and ascertain the relations among those movements, the environment, and services. This information is inherent in the environment, and robots use it to generate services. The authors call this architecture “Kukanchi: Interactive Human-Space Design and Intelligence.” This paper addresses the interaction between humans and a robot system based on the Kukanchi space. By considering the situation, the robot system recognizes the services that users want. Implementing the system requires a vast amount of information, so the authors focused on an interface that is divided into phases depending on the environment. This paper details two experiments using the system: the first is a car-like robot service on a street, and the second is a shopping-assistant service in a store.
Cite this article as:
Y. Fukusato, E. Sato-Shimokawara, T. Yamaguchi, and M. Mizukawa, “A Service System Adapted to Changing Environments Using ‘Kukanchi’,” J. Robot. Mechatron., Vol.21 No.4, pp. 443-452, 2009.
References:
  1. [1] N. Ando, T. Suehiro, K. Kitagaki, T. Kotoku, and W.-K. Yoon, “RT-Middleware: Distributed Component Middleware for RT (Robot Technology),” 2005 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 3555-3560, 2005.
  2. [2] “The total picture of the RT middleware,” MAINICHI COMMUNICATIONS, PLUS ROBOT, Vol.1, pp. 77-99, 2008.
  3. [3] K. Sekiyama, “Sharing ROI Process for Multi-Robot Cooperation,” 9th SICE System Integration Division Annual Conf. (SI2008), pp. 75-76, 2008.
  4. [4] M. Mizukawa and T. Yamaguchi, “Research on Functional Design of Ambient Intelligence,” 7th SICE System Integration Division Annual Conf. (SI2006), pp. 534-537, 2006.
  5. [5] T. Yamaguchi, E. Sato, and Y. Takama, “Intelligent Space and Human Centered Robotics,” IEEE Transactions on Industrial Electronics, Vol.50, No.5, 2003.
  6. [6] T. Yamaguchi, “Networked Intelligence and Ontology,” Soft Computing as Transdisciplinary Science and Technology, Springer, pp. 8-10, 2005.
  7. [7] K. Nakamura, “Significance of Non-language Communication,” GAKUJYUTUNODOUKOU, Vol.9, No.2, pp. 28-31, 2004.
  8. [8] L. Brosnahan, “Japanese and English Gesture: Contrastive Nonverbal Communication,” Taishukan, Tokyo, 1990.
  9. [9] G. Butterworth and N. Jarrett, “What minds have in common is space: Spatial mechanisms serving joint visual attention in infancy,” British Journal of Developmental Psychology, Vol.9, pp. 55-72, 1991.
  10. [10] E. Sato, T. Yamaguchi, and F. Harashima, “Natural Interface Using Pointing Behavior for Human-Robot Gestural Interaction,” IEEE Transactions on Industrial Electronics, Vol.54, No.2, 2007.
  11. [11] H. Kishinami, K. Kiyama, T. Yamaguchi, and J. Nakazato, “Development of i-mobility,” TRAFST: 05PR0001, pp. 161-162, 2005.
  12. [12] J. Nakazato, T. Yamaguchi, and E. Sato, “Humatronics Second Stage? Networked Intelligence Application for Robot Car System,” Advances in Computational Sciences and Technology, ISSN 0973-6107, Vol.1 No.1, pp. 23-34, 2007.
  13. [13] A. Nakamura, T. Yamaguchi, and E. S. Shimokawara, “Intelligent Network Mobility using Environment Information,” Joint 4th Int. Conf. on Soft Computing and Intelligent Systems and 9th Int. Symposium on Advanced Intelligent Systems (SCIS & ISIS 2008), Nagoya, Japan, pp. 317-322 (CD-ROM), Sep. 17-21, 2008.
  14. [14] T. Okawa, M. Miyaji, E. Sato, J. Nakazato, and T. Yamaguchi, “Information Support System using Intention Recognition Considering Situation in Human-Centered City,” Int. Conf. on Advanced Robotics, pp. 455-460, 2007.

This article is published under a Creative Commons Attribution 4.0 International License.
