
JACIII Vol.20 No.6 pp. 902-909
doi: 10.20965/jaciii.2016.p0902
(2016)

Paper:

Computer-Generated Emotional Face Retrieval with P300 Signals of Multiple Subjects

Junwei Fan* and Hideaki Touyama**

*Information Systems Engineering, Graduate School of Engineering, Toyama Prefectural University
Imizu-City, Toyama, Japan

**Faculty of Engineering, Toyama Prefectural University
Imizu-City, Toyama, Japan

Received:
March 19, 2016
Accepted:
July 20, 2016
Online released:
November 20, 2016
Published:
November 20, 2016
Keywords:
event-related potential P300, brain-machine interface, multiple subjects, collaborative, computer-supported cooperative work
Abstract

Applying brain signals to human-computer interaction enables us to detect user attention. P300 signals, one type of event-related potential, enable brain-machine interface users to select desired letters by attention alone. Previous studies have reported the feasibility of P300 signals in enabling a single subject to realize novel information retrieval, and recent collaborative EEG studies of multiple subjects have markedly improved the classification performance of attention detection. Here we propose emotional face retrieval using P300 signals from 20 subjects. The F-measure was 0.636 ± 0.05 (mean ± standard deviation) for a single subject and reached 0.886 with multiple subjects. In short, the classification performance of emotional face retrieval is improved by combining P300 signals collaboratively across multiple subjects. This technique could be applied to life logs, computer-supported cooperative work, and neuromarketing.
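The idea behind the collaborative approach can be illustrated with a toy simulation. The sketch below is not the authors' pipeline: the P300 waveform shape, the noise level, the window-mean detector, and the trial counts are all illustrative assumptions. It only demonstrates the general principle that grand-averaging single-trial epochs across subjects raises the signal-to-noise ratio (roughly by the square root of the number of subjects), which in turn raises the F-measure of target detection — the same qualitative effect the paper reports (0.636 for one subject vs. 0.886 for many).

```python
import numpy as np

rng = np.random.default_rng(0)

def f_measure(tp, fp, fn):
    """Harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

def simulate_epoch(target, n_samples=100, noise_sd=2.0):
    """One synthetic EEG epoch: a P300-like bump (targets only) plus Gaussian noise."""
    t = np.arange(n_samples)
    p300 = np.exp(-0.5 * ((t - 60) / 8.0) ** 2) if target else np.zeros(n_samples)
    return p300 + rng.normal(0.0, noise_sd, n_samples)

def classify(epoch, threshold=0.5):
    """Naive detector: mean amplitude in the assumed P300 window vs. a threshold."""
    return epoch[50:70].mean() > threshold

def run(n_trials=200, n_subjects=1):
    """Grand-average each trial's epochs over n_subjects, then score with F-measure."""
    tp = fp = fn = 0
    for i in range(n_trials):
        target = (i % 2 == 0)
        avg = np.mean([simulate_epoch(target) for _ in range(n_subjects)], axis=0)
        pred = classify(avg)
        if pred and target:
            tp += 1
        elif pred and not target:
            fp += 1
        elif target and not pred:
            fn += 1
    return f_measure(tp, fp, fn) if tp else 0.0

print("single subject F-measure:", round(run(n_subjects=1), 3))
print("ten subjects F-measure  :", round(run(n_subjects=10), 3))
```

With one simulated subject the per-trial noise often masks the bump; averaging ten subjects' epochs shrinks the noise standard deviation in the detection window, so the same fixed threshold separates targets from non-targets far more reliably.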

