
JACIII Vol.14 No.7 pp. 869-876
doi: 10.20965/jaciii.2010.p0869
(2010)

Paper:

Relationship Between Mechadroid Type C3 and Human Beings Based on Physiognomic Features

Teruaki Ando*, Atsushi Araki*, Masayoshi Kanoh*,
Yutaro Tomoto**, and Tsuyoshi Nakamura**

*School of Information Science and Technology, Chukyo University, 101 Tokodachi, Kaizu-cho, Toyota 470-0393, Japan

**Graduate School of Engineering, Nagoya Institute of Technology, Gokiso-cho, Showa-ku, Nagoya 466-8555, Japan

Received:
April 13, 2010
Accepted:
June 29, 2010
Published:
November 20, 2010
Keywords:
facial expressions, physiognomic features, principal component analysis, cluster analysis, Mechadroid Type C3
Abstract

In this paper, we generated random facial expressions for Mechadroid Type C3, a robot equipped with a high-degree-of-freedom facial expression mechanism that is intended to serve as a receptionist. By investigating the morphological and physiognomic features of these facial expressions, we evaluated which personality traits the face of the C3 can express and what impressions those expressions make on people. The results show that a baby-schema cute face, a modest face, and a smiley face are the most suitable physiognomies for a reception robot.
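The analysis pipeline the abstract describes (principal component analysis of randomly generated expression parameter vectors, followed by cluster analysis) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the number of expressions, actuator parameters, components, and clusters are all assumptions for demonstration.

```python
import numpy as np

def pca(X, n_components):
    """Project rows of X onto the top principal components via SVD."""
    Xc = X - X.mean(axis=0)             # center each parameter dimension
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T     # component scores per expression

def kmeans(X, k, n_iter=100, seed=0):
    """Minimal k-means: returns a cluster label for each row of X."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # distance from every point to every center, then nearest-center labels
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Hypothetical data: 200 random expressions, 10 actuator parameters each
rng = np.random.default_rng(42)
expressions = rng.random((200, 10))
scores = pca(expressions, n_components=2)   # reduce to 2 principal components
labels = kmeans(scores, k=3)                # group into 3 expression clusters
```

Clustering in the low-dimensional PCA space, as sketched here, is a common way to group similar facial configurations before judging the impression each cluster makes on observers.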

Cite this article as:
Teruaki Ando, Atsushi Araki, Masayoshi Kanoh,
Yutaro Tomoto, and Tsuyoshi Nakamura, “Relationship Between Mechadroid Type C3 and Human Beings Based on Physiognomic Features,” J. Adv. Comput. Intell. Intell. Inform., Vol.14, No.7, pp. 869-876, 2010.

