Paper:
Concept of Fuzzy Atmosfield for Representing Communication Atmosphere and its Application to Humans-Robots Interaction
Zhen-Tao Liu*,**, Min Wu**, Dan-Yun Li**, Lue-Feng Chen*, Fang-Yan Dong*, Yoichi Yamazaki***, and Kaoru Hirota*
*Department of Computational Intelligence and Systems Science, Tokyo Institute of Technology, G3-49, 4259 Nagatsuta, Midori-ku, Yokohama, Kanagawa 226-8502, Japan
**School of Information Science and Engineering, Central South University, Yuelu Mountain, Changsha, Hunan 410083, China
***Department of Electrical, Electronic, and Information Engineering, Kanto Gakuin University, 1-50-1 Mutsuura-higashi, Kanazawa-ku, Yokohama, Kanagawa 236-8501, Japan
- [1] R. W. Picard, “Affective Computing: Challenges,” Int. J. of Human-Computer Studies, Vol.59, No.1-2, pp. 55-64, 2003.
- [2] J. L. Burke, R. R. Murphy et al., “Final Report for the DARPA/NSF Interdisciplinary Study on Human-Robot Interaction,” IEEE Trans. on Systems, Man, and Cybernetics – Part C: Applications and Reviews, Vol.34, No.2, pp. 103-112, 2004.
- [3] K. Hirota and F.-Y. Dong, “Development of Mascot Robot System in NEDO project,” IEEE Int. Conf. on Intelligent Systems, Varna, Bulgaria, 2008.
- [4] Y. Yamazaki, H. A. Vu et al., “Gesture Recognition Using Combination of Acceleration Sensor and Images for Casual Communication Between Robots and Humans,” IEEE World Congress on Evolutionary Computation, Barcelona, Spain, 2010.
- [5] V. Akre, E. Falkum et al., “The Communication Atmosphere Between Physician Colleagues: Competitive Perfectionism or Supportive Dialogue? A Norwegian Study,” Social Science & Medicine, Vol.44, No.4, pp. 519-526, 1997.
- [6] T. M. Rutkowski and D. P. Mandic, “Modelling the Communication Atmosphere: A Human Centered Multimedia Approach to Evaluate Communicative Situations,” Artificial Intelligence for Human Computing, Springer-Verlag, Berlin, Vol.4451, pp. 155-169, 2007.
- [7] T. M. Rutkowski, K. Kakusho et al., “Evaluation of the Communication Atmosphere,” Knowledge-Based Intelligent Information and Engineering Systems, Springer-Verlag, Berlin, Vol.3213, pp. 364-370, 2004.
- [8] B. Anderson, “Affective Atmospheres,” Emotion, Space and Society, Vol.2, No.2, pp. 77-81, 2009.
- [9] Y. Yamazaki, Y. Hatakeyama et al., “Fuzzy Inference Based Mentality Expression for Eye Robot in Affinity Pleasure-Arousal Space,” J. of Advanced Computational Intelligence and Intelligent Informatics, Vol.12, No.3, pp. 304-313, 2008.
- [10] Z.-T. Liu, M. Wu et al., “Emotional States Based 3-D Fuzzy Atmosfield for Casual Communication Between Humans and Robots,” IEEE Int. Conf. on Fuzzy Systems, Taipei, Taiwan, 2011.
- [11] D. J. Stanley and J. P. Meyer, “Two-Dimensional Affective Space: A New Approach to Orienting the Axes,” Emotion, Vol.9, No.2, pp. 214-237, 2009.
- [12] C. Breazeal, “Emotion and Sociable Humanoid Robots,” Int. J. of Human-Computer Studies, Vol.59, No.1-2, pp. 119-155, 2003.
- [13] P. B. Byl, “A Six-Dimensional Paradigm for Generating Emotions in Virtual Characters,” Int. J. of Intelligent Games & Simulation, Vol.2, No.2, pp. 72-79, 2003.
- [14] Z.-T. Liu, F.-Y. Dong et al., “Proposal of Fuzzy Atmosfield for Mood Expression of Human-Robot Communication,” Int. Symp. on Intelligent Systems, Tokyo, Japan, 2010.
- [15] S. Hein, “Feeling Words.” http://eqi.org/fw.htm
- [16] J. M. Lattin, J. D. Carroll et al., “Analyzing Multivariate Data,” Duxbury Press, CA, 2002.
- [17] P. Ahlgren, B. Jarneving et al., “Requirements for A Cocitation Similarity Measure, with Special Reference to Pearson’s Correlation Coefficient,” J. of the American Society for Information Science and Technology, Vol.54, No.6, pp. 550-560, 2003.
- [18] E. Cox, “Fuzzy Fundamentals,” IEEE Spectrum, Vol.29, No.10, pp. 58-61, 1992.
- [19] J. A. Russell, A. Weiss et al., “Affect Grid: A Single-Item Scale of Pleasure and Arousal,” J. of Personality and Social Psychology, Vol.57, No.3, pp. 493-502, 1989.
- [20] L. Sussman, “Verbal Communication.” http://cobweb2.louisville.edu/faculty/regbruce/bruce/mgmtwebs/commun_f98/Verbal.htm
- [21] M. Pavlova, A. A. Sokolov et al., “Perceived Dynamics of Static Images Enables Emotional Attribution,” Perception, Vol.34, pp. 1107-1116, 2005.
- [22] H. Gunes and M. Piccardi, “Bi-Modal Emotion Recognition from Expressive Face and Body Gestures,” J. of Network and Computer Applications, Vol.30, No.4, pp. 1334-1345, 2007.
- [23] N. Kaya and H. H. Epps, “Relationship Between Color and Emotion: A Study of College Students,” College Student Journal, Vol.38, pp. 396-405, 2004.
- [24] J. H. Xin, K. M. Cheng et al., “Cross-Regional Comparison of Colour Emotions Part I: Quantitative Analysis,” Color Research & Application, Vol.29, No.6, pp. 451-457, 2004.
- [25] P. Ekman, “Are there basic emotions?,” Psychological Review, Vol.99, No.3, pp. 550-553, 1992.
- [26] S. Zhang, Y. Xu et al., “Analysis and Modeling of Affective Audio Visual Speech Based on PAD Emotion Space,” The 6th Int. Symp. on Chinese Spoken Language Processing, Kunming, China, 2008.
- [27] C. Breazeal, “Affective Interaction Between Humans and Robots,” Advances in Artificial Life, Springer-Verlag, Berlin, Vol.2159, pp. 582-591, 2001.
- [28] H. Miwa, T. Umetsu et al., “Robot Personalization Based on the Mental Dynamics,” IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, Takamatsu, Japan, 2000.
- [29] M. Grimm, E. Mower et al., “Primitives-Based Evaluation and Estimation of Emotions in Speech,” Speech Communication, Vol.49, No.10-11, pp. 787-800, 2007.
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.