
JACIII Vol.19 No.1 pp. 118-126
doi: 10.20965/jaciii.2015.p0118
(2015)

Paper:

Classification of Evoked Emotions Using an Artificial Neural Network Based on Single, Short-Term Physiological Signals

Shanbin Zhang*, Guangyuan Liu*,†, and Xiangwei Lai**

*College of Electronic and Information Engineering, Southwest University, No.2 of Tiansheng Road, BeiBei District, Chongqing 400715, China

**Computer and Information Science College, Southwest University, No.2 of Tiansheng Road, BeiBei District, Chongqing 400715, China

†Corresponding author

Received:
April 27, 2014
Accepted:
October 24, 2014
Published:
January 20, 2015
Keywords:
emotional recognition, ANN, automatic recognition, ECG, GSR
Abstract
Most automated methods for analyzing biosignal-based human emotions collect their data using multiple physiological signals, long-term physiological signals, or both. However, this restricts their ability to identify emotions efficiently. This study classifies evoked emotions based on two types of single, short-term physiological signals: electrocardiograms (ECGs) and galvanic skin responses (GSRs). Estimated recognition times are also recorded and analyzed. First, we perform experiments using film excerpts selected to elicit the target emotions of anger, grief, fear, happiness, and calmness; ECG and GSR signals are collected during these experiments. Next, a wavelet transform is applied to the truncated ECG data, and a Butterworth filter is applied to the truncated GSR signals, in order to extract the required features. Finally, the five emotion types are classified by employing an artificial neural network (ANN) based on each of the two signals. Average classification accuracy rates of 89.14% and 82.29% were achieved in the experiments using ECG data and GSR data, respectively. In addition, the total time required for feature extraction and emotional classification did not exceed 0.15 s for either ECG or GSR signals.
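The preprocessing pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the cutoff frequency, filter order, sampling rate, and the synthetic signals are all assumptions, and a single-level Haar transform stands in for whatever wavelet the paper actually uses.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def butterworth_lowpass(gsr, fs, cutoff=1.0, order=4):
    """Smooth a GSR trace with a low-pass Butterworth filter.

    The 1 Hz cutoff and 4th order are illustrative assumptions;
    the paper does not state its filter parameters here.
    """
    b, a = butter(order, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, gsr)  # zero-phase filtering

def haar_dwt(x):
    """One level of a Haar wavelet transform on an ECG segment.

    Returns approximation and detail coefficients, each half
    the (even-truncated) input length.
    """
    x = x[: x.size // 2 * 2]
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

# Synthetic stand-ins for short-term (4 s) recordings at 250 Hz.
fs = 250.0
t = np.arange(0, 4, 1 / fs)
ecg = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
gsr = np.cumsum(np.random.randn(t.size)) / fs

smooth_gsr = butterworth_lowpass(gsr, fs)
approx, detail = haar_dwt(ecg)

# Toy statistical features of the kind an ANN classifier could consume.
features = [smooth_gsr.mean(), smooth_gsr.std(), approx.std(), detail.std()]
```

In the paper, features like these would be fed to the ANN for five-class emotion classification; the specific feature set and network architecture are given in the full text.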
Cite this article as:
S. Zhang, G. Liu, and X. Lai, “Classification of Evoked Emotions Using an Artificial Neural Network Based on Single, Short-Term Physiological Signals,” J. Adv. Comput. Intell. Intell. Inform., Vol.19 No.1, pp. 118-126, 2015.
