
JACIII Vol.17 No.4 pp. 480-492
doi: 10.20965/jaciii.2013.p0480
(2013)

Paper:

Evaluating Instantaneous Psychological Stress from Emotional Composition of a Facial Expression

Suvashis Das and Koichi Yamada

Department of Management and Information Systems Science, Nagaoka University of Technology, 1603-1 Kamitomiokamachi, Nagaoka, Niigata 940-2137, Japan

Received: December 16, 2012
Accepted: April 10, 2013
Published: July 20, 2013
Keywords: psychological stress, emotion, FACS, hidden Markov model (HMM)
Abstract
Human psychological stress is a vast and highly complicated subject of study, and the ways in which stress is classified and observed vary among researchers. Many methods exist for identifying stress. Most of them are non-intrusive, relying on self-reporting and questionnaires, which reduces their real-time efficacy; intrusive methods, on the other hand, are time-consuming and cumbersome. The overall problem of non-intrusive psychological stress detection from facial images can be viewed in three incremental stages: instantaneous analysis of the subject, historical analysis of the subject, and analysis of the subject's environment. In this paper, we deal with instantaneous analysis of a subject, meaning that the stress state of a subject is predicted for a single moment in time from an image of his/her facial expression. To this end, we conducted two surveys: one to establish the relationship between the emotional composition of a facial expression and stress, and one to establish the relationship between individual emotions and stress. The novelty of the paper is threefold: 1) to establish relationships between the seven basic emotions (anger, contempt, disgust, fear, happiness, sadness, and surprise) and stress, 2) to establish a relationship between the emotional composition of a facial expression and stress, and 3) to derive a formula for evaluating stress in terms of the emotional percentage mixture of a facial expression. To achieve these three goals, we use Facial Action Unit (AU) [1] coded image data to predict the emotional mixture of a facial expression in terms of the seven basic emotion percentages; an AU represents one of the many basic muscle movements that together make up a facial expression. We then analyze the survey outcomes to establish the relationship between individual emotions and stress. Finally, we correlate the survey outcomes with the emotional mixture data obtained from facial expressions using a Hidden Markov Model (HMM) approach, both to establish the relationship between emotional composition and stress and to derive a formula for stress in terms of the seven basic emotion percentages.
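
To make the third goal concrete, the minimal Python sketch below shows how a single stress score could be computed from the emotion-percentage composition of one facial expression. The per-emotion weights here are hypothetical placeholders chosen only for illustration; they are not the coefficients derived from the surveys or the HMM analysis in the paper.

import numpy as np

# The seven basic emotions used in the paper, in a fixed order.
EMOTIONS = ["anger", "contempt", "disgust", "fear", "happy", "sad", "surprise"]

# Hypothetical per-emotion stress weights for illustration only; the paper
# derives its own formula from survey data and an HMM-based analysis.
WEIGHTS = np.array([0.9, 0.6, 0.7, 0.85, -0.4, 0.75, 0.1])

def stress_score(emotion_percentages):
    """Map an expression's emotion composition (percentages summing to ~100)
    to a single scalar stress estimate via a weighted sum."""
    p = np.asarray(emotion_percentages, dtype=float)
    if p.shape != (len(EMOTIONS),):
        raise ValueError(f"expected {len(EMOTIONS)} emotion percentages")
    p = p / p.sum()  # normalize to a composition that sums to 1
    return float(np.dot(WEIGHTS, p))

# Example: a mostly fearful expression with some sadness and anger.
composition = [15, 0, 5, 55, 0, 20, 5]
print(f"estimated stress score: {stress_score(composition):.3f}")

In this sketch the input vector stands in for the AU-based emotion-mixture prediction described above, and the weighted sum stands in for the stress formula the paper derives.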
Cite this article as:
S. Das and K. Yamada, “Evaluating Instantaneous Psychological Stress from Emotional Composition of a Facial Expression,” J. Adv. Comput. Intell. Intell. Inform., Vol.17 No.4, pp. 480-492, 2013.
References
  1. [1] P. Ekman and W. V. Friesen, “Facial Action Coding System: Investigator’s Guide,” Consulting Psychologists Press, 1978.
  2. [2] K. Dai, H. J. Fell, and J. MacAuslan, “Recognizing emotion in speech using neural networks,” Proc. of the IASTED Int. Conf. on Telehealth/Assistive Technologies, Ronald Merrell (Ed.), pp. 31-36, 2008.
  3. [3] L. R. Rabiner, “A tutorial on Hidden Markov Models and selected applications in speech recognition,” Proc. of the IEEE, Vol.77, No.2, pp. 257-286, 1989.
  4. [4] A. S. AlMejrad, “Human Emotions Detection using Brain Wave Signals: A Challenging,” European J. of Scientific Research, Vol.44, No.4, pp. 640-659, 2010.
  5. [5] F. H. Wilhelm, M. C. Pfaltz, and P. Grossman, “Continuous electronic data capture of physiology, behavior and experience in real life: towards ecological momentary assessment of emotion,” Interacting with Computers, Vol.18, Iss. 2, pp. 171-186, 2006.
  6. [6] S. K. Yoo, C. K. Lee, and Y. J. Park, “Determination of Biological Signal for Emotion Identification,” World Congress on Medical Physics and Biomedical Engineering, IFMBE Proc., Vol.14, No.6, pp. 4047-4049, 2007.
  7. [7] T. Hu, L. C. De Silva, and K. Sengupta, “A hybrid approach of NN and HMM for facial emotion classification,” Pattern Recognition Letters, Vol.23, Iss.11, pp. 1303-1310, Sep. 2002.
  8. [8] I. Cohen, A. Garg, and T. S. Huang, “Emotion Recognition from Facial Expressions using Multilevel HMM,” Science and Technology, Citeseer, 2000.
  9. [9] K. Mase, “Recognition of facial expression from optical flow,” IEICE Trans., Vol.E74, No.10, pp. 3474-3483, 1991.
  10. [10] T. Otsuka and J. Ohya, “Recognition of Facial Expressions Using HMM with Continuous Output Probabilities,” Proc. Int. Workshop Robot and Human Comm., pp. 323-328, 1996.
  11. [11] C. Darwin, “The Expression of the Emotions in Man and Animals,” D. Appleton & Co., New York, 1898.
  12. [12] R. S. Lazarus, “From psychological stress to the emotions: a history of changing outlooks,” Annual Review of Psychology, Vol.44, pp. 1-21, 1993.
  13. [13] R. Hassin and Y. Trope, “Facing faces: Studies on the cognitive aspects of physiognomy,” J. of Personality and Social Psychology, Vol.78, pp. 837-852, 2000.
  14. [14] A. C. Little and D. I. Perrett, “Using composite images to assess accuracy in personality attribution to faces,” British J. of Psychology, Vol.98, pp. 111-126, 2007.
  15. [15] D. Keltner and P. Ekman, “Introduction: expression of emotion,” Handbook of affective sciences, pp. 411-414, 2003.
  16. [16] B. C. Jones et al., “Facial symmetry and judgements of apparent health – Support for a “good genes” explanation of the attractiveness-symmetry relationship,” Evolution and Human Behavior, Vol.22, pp. 417-429, 2001.
  17. [17] A. C. Little et al., “Accuracy in assessment of self-reported stress and a measure of health from static facial information,” Personality and Individual Differences, Vol.51, Iss.6, pp. 693-698, 2011.
  18. [18] D. F. Dinges et al., “Optical Computer Recognition of Facial Expressions Associated with Stress Induced by Performance Demands,” Aviation, Space and Environmental Medicine, Vol.76, Supplement 1, pp. B172-B182, 2005.
  19. [19] M. Nübling et al., “Measuring psychological stress and strain at work: evaluation of the COPSOQ I Questionnaire in Germany,” GMS Psycho-Social-Medicine, Vol.3, pp. 1-14, 2006.
  20. [20] M. Horowitz et al., “Life event questionnaires for measuring presumptive stress,” Psychosomatic Medicine, Vol.39, pp. 413-431, 1977.
  21. [21] L. Lemyre and R. Tessier, “Measuring psychological stress. Concept, model, and measurement instrument in primary care research,” Canadian Family Physician, Vol.49, pp. 1159-1160 and pp. 1166-1168, 2003.
  22. [22] S. Nomura, “Kansei’s Physiological Measurement and Its application (1): Salivary Biomarkers as a New Metric for Human Mental stress,” Kansei Engineering and Soft Computing: Theory and Practice, pp. 303-318, 2011.
  23. [23] J. Bradbury, “Modelling Stress Constructs with Biomarkers: The Importance of the Measurement Model,” Clinical and Experimental Medical Sciences, Vol.1, No.3, pp. 197-216, 2013.
  24. [24] R. Subhani, L. Xia, and A. S. Malik, “EEG Signals to Measure Mental Stress,” 2nd Int. Conf. on Behavioral, Cognitive and Psychological Sciences, 2011.
  25. [25] T. Kanade, J. Cohn, and Y. L. Tian, “Comprehensive database for facial expression analysis,” Proc. of the 4th IEEE Int. Conf. on Automatic Face and Gesture Recognition, pp. 46-53, 2000.
  26. [26] P. Lucey et al., “The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression,” Proc. of the 3rd Int. Workshop on CVPR for Human Communicative Behavior Analysis, pp. 94-101, 2010.
  27. [27] P. Ekman, W. V. Friesen, and J. C. Hager, “The New Facial Action Coding System (FACS),” Research Nexus division of Network Information Research Corporation, 2002.
  28. [28] M. Pantic and L. J. M. Rothkrantz, “Automatic Analysis of Facial Expressions: The State of the Art,” IEEE Trans. on Pattern Analysis and Machine Intelligence, pp. 1424-1445, 2000.
  29. [29] S. Das and K. Yamada, “An HMM based Model for Prediction of Emotional Composition of a Facial Expression using both Significant and Insignificant Action Units and Associated Gender Differences,” Int. J. of Computer Applications, Vol.45, No.11, pp. 11-18, 2012.
  30. [30] P. Salmon, “Effects of physical exercise on anxiety, depression, and sensitivity to stress: A unifying theory,” Clinical Psychology Review, Vol.21, Iss.1, pp. 33-61, Feb. 2001.
  31. [31] J. Gruber, I. B. Mauss, and M. Tamir, “A dark side of happiness? How, when, and why happiness is not always good,” Perspectives on Psychological Science, Vol.6, No.3, p. 222, 2011.
  32. [32] M. S. Bartlett et al., “Fully automatic facial action recognition in spontaneous behavior,” Int. Conf. on Automatic Face and Gesture Recognition, pp. 223-230, IEEE, 2006.
  33. [33] S. Spiegelman and J. M. Reiner, “A Note on Steady States and the Weber-Fechner Law,” Psychometrika, Vol.10, No.1, pp. 27-35, 1945.
  34. [34] V. D. Glezer, “The Meaning of the Weber-Fechner Law and Description of Scenes in Terms of Neural Networks,” Human Physiology, Vol.33, No.3, pp. 257-266, 2007.
  35. [35] V. D. Glezer, “The Meaning of the Weber-Fechner Law: IV. The Psychometric Curve and Interhemispheric and Intrahemispheric Interactions,” Human Physiology, Vol.37, No.1, pp. 57-65, 2011.
