Relationship Between Mechadroid Type C3 and Human Beings Based on Physiognomic Features
Teruaki Ando*, Atsushi Araki*, Masayoshi Kanoh*,
Yutaro Tomoto**, and Tsuyoshi Nakamura**
*School of Information Science and Technology, Chukyo University, 101 Tokodachi, Kaizu-cho, Toyota 470-0393, Japan
**Graduate School of Engineering, Nagoya Institute of Technology, Gokiso-cho, Showa-ku, Nagoya 466-8555, Japan
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.