
JACIII Vol.11 No.8 pp. 964-971
doi: 10.20965/jaciii.2007.p0964
(2007)

Paper:

Motion Overlap for a Mobile Robot to Express its Mind

Kazuki Kobayashi* and Seiji Yamada**

*Research Center for Human Media, Kwansei Gakuin University, 2-1 Gakuen, Sanda, Hyogo 669-1337, Japan

**National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda, Tokyo 101-8430, Japan

Received:
March 14, 2007
Accepted:
May 23, 2007
Published:
October 20, 2007
Keywords:
interaction design, sweeping robot, robot mind, motion design, nonverbal communication
Abstract
This paper discusses how a mobile robot can express itself to obtain help from users in a cooperative task. We focus on a situation in which a robot must convey its state of mind so that a user will lend it a hand. The design we propose, called motion overlap (MO), enables a robot to communicate with people through human-like behavior. We reasoned that human-like behavior in a robot would help users understand its state of mind. We designed a small MO-based sweeping robot that moves back and forth to request help, and experimentally compared this MO expression with other nonverbal cues, i.e., a buzzer and blinking LEDs. The MO expression encouraged most users to help the robot, the differences among the three types of expression were statistically significant, and the results demonstrate that MO is a promising approach to designing robots for the home.
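To make the behavior described above concrete, the following is a minimal Python sketch of how a back-and-forth "help me" expression might be generated on a small sweeping robot. The DriveBase interface, speeds, and timings are illustrative assumptions, not the implementation or parameters used in the authors' experiments.

```python
# Illustrative sketch (not from the paper): generating a motion-overlap style
# "help me" expression on a small sweeping robot. The DriveBase interface and
# all timing/speed values are hypothetical placeholders.
import time

class DriveBase:
    """Hypothetical differential-drive interface."""
    def move(self, speed_mm_s, duration_s):
        print(f"move at {speed_mm_s} mm/s for {duration_s} s")
        time.sleep(duration_s)

    def stop(self):
        print("stop")

def express_need_for_help(drive, cycles=3):
    """Repeat a short approach-and-retreat motion in front of an obstacle.

    The back-and-forth movement overlays a human-like hesitation gesture on the
    robot's ordinary locomotion, hinting that the user should clear the obstacle.
    """
    for _ in range(cycles):
        drive.move(speed_mm_s=100, duration_s=0.5)   # nudge toward the obstacle
        drive.move(speed_mm_s=-100, duration_s=0.5)  # back away again
    drive.stop()                                     # wait for the user to intervene

if __name__ == "__main__":
    express_need_for_help(DriveBase())
```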
Cite this article as:
K. Kobayashi and S. Yamada, “Motion Overlap for a Mobile Robot to Express its Mind,” J. Adv. Comput. Intell. Intell. Inform., Vol.11 No.8, pp. 964-971, 2007.
