Paper:
Motion Overlap for a Mobile Robot to Express its Mind
Kazuki Kobayashi* and Seiji Yamada**
*Research Center for Human Media, Kwansei Gakuin University, 2-1 Gakuen, Sanda, Hyogo 669-1337, Japan
**National Institute of Informatics, 2-1-2 Hitotsubashi, Chiyoda, Tokyo 101-8430, Japan
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.