JACIII Vol.21 No.7 pp. 1312-1320
doi: 10.20965/jaciii.2017.p1312


Majority Rule Using Collaborative P300 by Auditory Stimulation

Kosuke Fujita and Hideaki Touyama

Department of Information Systems Engineering, Faculty of Engineering, Toyama Prefectural University
Imizu-City, Toyama 939-0398, Japan

Received: April 20, 2017
Accepted: October 11, 2017
Published: November 20, 2017
Keywords: collaborative EEG, P300, majority rule, auditory stimulus, attention, normalization, multimedia evaluation

In this study, a new method to realize majority rule using noninvasive brain activity is presented. Based on the electroencephalogram (EEG), a technique to determine the attention of multiple users is proposed. In general, a single-shot EEG ensures a short response time, but it is inevitably degraded by artifacts. To enhance the accuracy of the majority rule, we focus on collaborative P300 evoked-potential signals. The collaborative P300 signal is obtained by averaging individual single-shot P300 signals across subjects. In the experiments, the EEG signals of twelve volunteers were collected using auditory stimuli. The subjects paid attention to target stimuli and ignored standard stimuli. The collaborative P300 signal was used to evaluate the performance of the majority rule. The proposed algorithm enables us to estimate the degree of attention of the group. The classification is based on supervised machine learning, and the accuracy is approximately 80%. Applications of this novel technique to multimedia content evaluation, as well as to neuromarketing and computer-supported cooperative work, are discussed.
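The two core operations described in the abstract, averaging single-shot epochs across subjects into a collaborative P300 signal and then taking a majority vote over per-subject attention decisions, can be sketched as follows. This is a minimal NumPy illustration; the function names and the plain 0/1 vote are assumptions for clarity, not the authors' implementation (the paper's per-stimulus decision comes from a supervised classifier).

```python
import numpy as np

def collaborative_p300(epochs):
    """Collaborative P300: average time-locked single-shot epochs
    recorded from different subjects for the same stimulus.

    epochs: (n_subjects, n_samples) array-like, one epoch per subject.
    Returns the (n_samples,) collaborative waveform.
    """
    return np.asarray(epochs, dtype=float).mean(axis=0)

def majority_rule(votes):
    """Binary majority rule over per-subject attention decisions.

    votes: iterable of 0/1 labels (1 = "attended to the target").
    Returns 1 only if strictly more than half of the votes are 1.
    """
    votes = np.asarray(votes)
    return int(votes.sum() > votes.size / 2)
```

Averaging across subjects attenuates subject-specific artifacts that survive within-subject averaging, which is the motivation given in the abstract for the collaborative signal.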

Cite this article as:
K. Fujita and H. Touyama, “Majority Rule Using Collaborative P300 by Auditory Stimulation,” J. Adv. Comput. Intell. Intell. Inform., Vol.21 No.7, pp. 1312-1320, 2017.
References:
  [1] J. R. Wolpaw, N. Birbaumer, D. J. McFarland, G. Pfurtscheller, and T. M. Vaughan, “Brain-computer interfaces for communication and control,” Clinical Neurophysiology, Vol.113, No.6, pp. 767-791, June 2002.
  [2] A. Dietrich and R. Kanso, “A Review of EEG, ERP, and Neuroimaging Studies of Creativity and Insight,” Psychological Bulletin, Vol.136, No.5, pp. 822-848, 2010.
  [3] R. Johnson Jr., “A Triarchic Model of P300 Amplitude,” Psychophysiology, Vol.23, pp. 367-384, 1985.
  [4] T. W. Picton, “The P300 Wave of the Human Event-Related Potential,” J. of Clinical Neurophysiology, Vol.9, No.4, pp. 456-479, November 1992.
  [5] J. Polich, “Updating P300: An integrative theory of P3a and P3b,” Clinical Neurophysiology, Vol.118, No.10, pp. 2128-2148, October 2007.
  [6] L. A. Farwell and E. Donchin, “Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials,” Electroencephalography and Clinical Neurophysiology, Vol.70, No.6, pp. 510-523, 1988.
  [7] E. Donchin, K. M. Spencer, and R. Wijesinghe, “The Mental Prosthesis: Assessing the Speed of a P300-Based Brain-Computer Interface,” IEEE Transactions on Rehabilitation Engineering, Vol.8, No.2, pp. 174-179, June 2000.
  [8] D. S. Klobassa, T. M. Vaughan, P. Brunner, N. E. Schwartz, J. R. Wolpaw, C. Neuper, and E. W. Sellers, “Toward a high-throughput auditory P300-based brain-computer interface,” Clinical Neurophysiology, Vol.120, pp. 1252-1261, July 2009.
  [9] J. P. Rosenfeld, K. Bhat, A. Miltenberger, and M. Johnson, “Event-related potentials in the dual task paradigm: P300 discriminates engaging and non-engaging films when film-viewing is the primary task,” International Journal of Psychophysiology, Vol.12, pp. 221-232, May 1992.
  [10] Y. Shigemitsu and H. Nittono, “Assessing interest level during movie watching with brain potentials,” Proceedings of the Second International Workshop on Kansei, pp. 39-42, 2008.
  [11] J. Suzuki, H. Nittono, and T. Hori, “Level of interest in video clips modulates event-related potentials to auditory probes,” International Journal of Psychophysiology, Vol.55, pp. 35-43, January 2005.
  [12] Y. Wang and T. Jung, “A collaborative brain-computer interface for improving human performance,” PLoS ONE, Vol.6, No.5, e20422, pp. 1-11, May 2011.
  [13] Y. Wang, Y. Wang, T. Jung, X. Gao, and S. Gao, “A Collaborative Brain-Computer Interface,” 4th International Conference on Biomedical Engineering and Informatics, pp. 583-586, 2011.
  [14] P. Yuan, Y. Wang, W. Wu, H. Xu, X. Gao, and S. Gao, “Study on an Online Collaborative BCI to Accelerate Response to Visual Targets,” IEEE Engineering in Medicine and Biology Society, 2012 Annual Int. Conf. of the IEEE, pp. 1736-1739, 2012.
  [15] J. Fan and H. Touyama, “Computer-Generated Emotional Face Retrieval with P300 Signals of Multiple Subjects,” Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol.20, No.6, pp. 902-909, November 2016.
  [16] E. Naumann, C. Huber, S. Maier, W. Plihal, A. Wustmans, O. Diedrich, and D. Bartussek, “The scalp topography of P300 in the visual and auditory modalities: a comparison of three normalization methods and the control of statistical type II error,” Electroencephalography and Clinical Neurophysiology, Vol.83, pp. 254-264, October 1992.
  [17] M. D. Comerchero and J. Polich, “P3a and P3b from typical auditory and visual stimuli,” Clinical Neurophysiology, Vol.110, pp. 24-30, January 1999.
  [18] G. H. Klem, H. O. Luders, H. H. Jasper, and C. Elger, “The ten-twenty electrode system of the International Federation,” Guidelines of the International Federation of Clinical Neurophysiology, Vol.52, pp. 3-6, 1999.
  [19] A. Hyvarinen, “Fast and Robust Fixed-Point Algorithms for Independent Component Analysis,” IEEE Transactions on Neural Networks, Vol.10, No.3, pp. 626-634, May 1999.
  [20] D. S. Goodin, K. C. Squires, B. H. Henderson, and A. Starr, “Age-related variations in evoked potentials to auditory stimuli in normal human subjects,” Electroencephalography and Clinical Neurophysiology, Vol.44, pp. 447-458, April 1978.
  [21] R. T. Knight, “Aging Decreases Auditory Event-Related Potentials to Unexpected Stimuli in Humans,” Neurobiology of Aging, Vol.8, pp. 109-113, April 1987.
  [22] C. D. Woody, “Characterization of an adaptive filter for the analysis of variable latency neuroelectric signals,” Medical and Biological Engineering, Vol.5, pp. 539-553, November 1967.
  [23] A. D. Gerson, L. C. Parra, and P. Sajda, “Cortically-coupled computer vision for rapid image search,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol.14, pp. 174-179, June 2006.
  [24] P. Sajda, E. Pohlmeyer, J. Wang, B. Hanna, L. C. Parra, and S. Chang, “Cortically-Coupled Computer Vision,” Human-Computer Interaction Series, pp. 133-148, 2010.
  [25] K. Schmidt and L. Bannon, “Taking CSCW Seriously: Supporting Articulation Work,” Computer Supported Cooperative Work: An International Journal, Vol.1, pp. 7-40, July 1992.
