JRM Vol.32 No.1 pp. 236-243
doi: 10.20965/jrm.2020.p0236


Interactive Information Support by Robot Partners Based on Informationally Structured Space

Shion Yamamoto*, Jinseok Woo**, Wei Hong Chin*, Keiichi Matsumura***, and Naoyuki Kubota*

*Graduate School of Systems Design, Tokyo Metropolitan University
6-6 Asahigaoka, Hino, Tokyo 191-0065, Japan

**Department of Mechanical Engineering, School of Engineering, Tokyo University of Technology
1404-1 Katakura, Hachioji, Tokyo 192-0982, Japan

***INFITECHM Co., Ltd.
1-15-27 Kounandai, Kounan-ku, Yokohama, Kanagawa 234-0054, Japan

Received: June 7, 2019
Accepted: January 10, 2020
Published: February 20, 2020

Keywords: robot partners, informationally structured space, information support, community-centric systems

[Figure: The robot partner for information support]

Recently, community-centric systems have grown in importance as a new paradigm for enhancing the quality of community (QOC). Social media plays an important role in creating, sharing, and exchanging information within a community; however, assistive technologies should be developed from human-centric and community-centric points of view to realize such information support. In this paper, we discuss the use of smart-device-interlocked robot partners for interactive information support similar to concierge services in hotels. The interactive information support system is composed of two main subsystems: robot partners and informationally structured space servers. A robot partner communicates and interacts with people through voice recognition and gesture recognition, in addition to touch interfaces. The informationally structured space server receives, from the robot partner, measurement data on human motions along with personal information containing human requests and preferences. The server then selects shops, restaurants, and sightseeing spots and recommends them to guests and visitors through utterances and displays by the robot partner. First, we explain the concept of the informationally structured space, which connects people with sensory information, and propose its overall architecture. Next, we explain how to provide information support using smart-device-interlocked robot partners based on the informationally structured space. In addition, we describe several social experiments on interactive information support at hotels. Finally, we discuss the effectiveness of the proposed system and the future direction of community-centric systems.
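The two-subsystem flow described in the abstract — a robot partner forwarding a guest's recognized request and preferences to an informationally structured space (ISS) server, which returns a recommendation for the robot to utter or display — can be sketched as follows. This is a minimal illustration only; all class names, methods, and venue entries are hypothetical and do not come from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class GuestRequest:
    utterance: str    # speech-recognition result, e.g. "I want to eat sushi"
    preferences: dict # personal information, e.g. {"budget": "low"}

@dataclass
class IssServer:
    # Toy knowledge base of local spots, keyed by category; a real ISS server
    # would aggregate sensor measurements and environmental information.
    spots: dict = field(default_factory=lambda: {
        "restaurant": ["Sushi Katakura", "Ramen Asahigaoka"],
        "sightseeing": ["Takao-san trail"],
    })

    def recommend(self, req: GuestRequest) -> str:
        # Crude keyword matching stands in for the server-side selection step.
        if "eat" in req.utterance or "sushi" in req.utterance:
            return self.spots["restaurant"][0]
        return self.spots["sightseeing"][0]

class RobotPartner:
    def __init__(self, server: IssServer):
        self.server = server

    def handle(self, utterance: str, preferences: dict) -> str:
        # 1) The recognition result and personal info go to the ISS server.
        req = GuestRequest(utterance, preferences)
        # 2) The server's selection comes back for utterance/display.
        spot = self.server.recommend(req)
        return f"I recommend {spot}."

robot = RobotPartner(IssServer())
print(robot.handle("I want to eat sushi", {"budget": "low"}))
```

The point of the split is that the robot stays a thin interaction front end, while selection logic and accumulated sensory data live on the server side.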

Cite this article as:
S. Yamamoto, J. Woo, W. H. Chin, K. Matsumura, and N. Kubota, “Interactive Information Support by Robot Partners Based on Informationally Structured Space,” J. Robot. Mechatron., Vol.32, No.1, pp. 236-243, 2020.
References:
  1. M. Pollack, “Intelligent Technology for an Aging Population: The Use of AI to Assist Elders with Cognitive Impairment,” AI Magazine, Vol.26, pp. 9-24, 2005.
  2. T. Hashimoto, N. Kato, and H. Kobayashi, “Study on Educational Application of Android Robot SAYA: Field Trial and Evaluation at Elementary School,” Proc. of Int. Conf. on Intelligent Robotics and Applications (ICIRA 2010), pp. 505-516, 2010.
  3. H. Ishiguro, M. Shiomi, T. Kanda, D. Eaton, and N. Hagita, “Field Experiment in a Science Museum with Communication Robots and a Ubiquitous Sensor Network,” Proc. of Workshop on Network Robot System at ICRA2005, 2005.
  4. S. Sun, T. Takeda, H. Koyama, and N. Kubota, “Smart Device Interlocked Robot Partners for Information Support Systems in Sightseeing Guide,” Proc. of 2016 Joint 8th Int. Conf. on Soft Computing and Intelligent Systems (SCIS) and 17th Int. Symp. on Advanced Intelligent Systems (ISIS), pp. 586-590, 2016.
  5. N. Kubota and K. Nishida, “Cooperative Perceptual Systems for Partner Robots Based on Sensor Network,” Int. J. of Computer Science and Network Security (IJCSNS), Vol.6, No.11, pp. 19-28, 2006.
  6. N. Kubota and A. Yorita, “Structured Learning for Partner Robots Based on Natural Communication,” Proc. of 2008 IEEE Conf. on Soft Computing in Industrial Applications, pp. 303-308, 2008.
  7. N. Kubota, S. Omote, and Y. Mori, “Emotional Learning of a Vision-Based Partner Robot for Natural Communication with Human,” Proc. of 2006 IEEE World Congress on Computational Intelligence (WCCI 06), Vancouver, Canada, July 16-21, pp. 6288-6294, 2006.
  8. N. Kubota and K. Nishida, “Perceptual Control Based on Prediction for Natural Communication of a Partner Robot,” IEEE Trans. on Industrial Electronics, Vol.54, No.2, pp. 866-877, 2007.
  9. J. Woo and N. Kubota, “Human-Robot Interaction Design Using Smart Device Based Robot Partner,” Int. J. of Artificial Life Research (IJALR), Vol.6, No.2, pp. 23-43, 2016.
  10. P. Remagnino, H. Hagras, N. Monekosso, and S. Velastin, “Ambient Intelligence: A Gentle Introduction,” P. Remagnino, G. Foresti, and T. Ellis (Eds.), “Ambient Intelligence: A Novel Paradigm,” Springer Verlag, pp. 1-15, 2005.
  11. M. J. Akhlaghinia et al., “Occupant Behaviour Prediction in Ambient Intelligence Computing Environment,” J. of Uncertain Systems, Vol.2, No.2, pp. 85-100, 2008.
  12. K. Morioka and H. Hashimoto, “Appearance Based Object Identification for Distributed Vision Sensors in Intelligent Space,” Proc. of the 2004 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS’04), Vol.1, pp. 199-204, 2004.
  13. H. Hashimoto, “Intelligent Space: Interaction and Intelligence,” Artificial Life and Robotics, Vol.7, No.3, pp. 79-85, 2003.
  14. V. Callaghan, M. Colley, H. Hagras, J. Chin, F. Doctor, and G. Clarke, “Programming iSpaces: A Tale of Two Paradigms,” A. Steventon and S. Wright (Eds.), “Intelligent Spaces: The Application of Pervasive ICT,” Springer-Verlag, Chapter 24, pp. 389-421, 2005.
  15. W. Mito and M. Matsunaga, “Cloud/Crowd Sensing System for Annotating Users Perception,” J. Robot. Mechatron., Vol.28, No.1, pp. 61-68, 2016.
  16. T. Linner, J. Guettler, T. Bock, and C. Georgoulas, “Assistive Robotic Micro-Rooms for Independent Living,” Automation in Construction, Vol.51, pp. 8-22, 2015.
  17. T. Bock et al., “Ambient Integrated Robotics: Automation and Robotic Technologies for Maintenance, Assistance, and Service,” Cambridge University Press, 2019.
  18. C. Georgoulas, J. Güttler, T. Linner, and T. Bock, “A Mechatronic Wall for Assistance with ADLs,” J. Robot. Mechatron., Vol.27, No.1, p. 107, 2015.
  19. H. Kimura, N. Kubota, and J. Cao, “Natural Communication for Robot Partners Based on Computational Intelligence for Edutainment,” Proc. of Mecatronics 2010, pp. 610-615, 2010.
  20. E. Sato-Shimokawara and T. Yamaguchi, “Community-Centric System – Support of Human Ties –,” J. Robot. Mechatron., Vol.29, No.1, pp. 7-13, 2017.
  21. N. Kubota and H. Liu, “Special Issue on Computational Intelligence for Community-Centric Systems [Guest Editorial],” IEEE Computational Intelligence Magazine, Vol.9, Issue 2, pp. 15-17, 2014.
  22. Y. Shimizu, S. Yoshida, J. Shimazaki, and N. Kubota, “An Interactive Support System for Activating Shopping Streets Using Robot Partners in Informationally Structured Space,” Proc. of 2013 IEEE Workshop on Advanced Robotics and its Social Impacts, Shibaura, Tokyo, Japan, November 7-9, pp. 70-75, 2013.
  23. D. Tang, J. Botzheim, and N. Kubota, “Informationally Structured Space for Community-Centric Systems,” The 2nd Int. Conf. on Universal Village (UV2014), Boston, USA, June 16-17, 2014.
  24. N. Kubota and K. Nishida, “Development of Internal Models for Communication of a Partner Robot Based on Computational Intelligence,” Proc. of 6th Int. Symp. on Advanced Intelligent Systems (ISIS 2005), Yeosu, Korea, September 28 to October 1, pp. 577-582, 2005.
