Paper:
Online Control of a Virtual Object with Collaborative SSVEP
Hideaki Touyama and Mitsuru Sakuda
Toyama Prefectural University
5180 Kurokawa, Imizu-city, Toyama 939-0398, Japan
In this paper, we propose a brain-computer interface (BCI) based on the collaborative steady-state visually evoked potential (SSVEP). A technique for estimating the common gaze direction of multiple subjects is studied with a view to controlling a virtual object in a virtual environment. The electroencephalograms (EEG) of eight volunteers are recorded simultaneously while two virtual cubes serve as visual stimuli. The two cubes flicker at different rates, 6 Hz and 8 Hz, and the corresponding SSVEP is observed around the occipital area. The amplitude spectra of the EEG activity of the individual subjects are analyzed, averaged, and synthesized to obtain the collaborative SSVEP. Machine learning is applied to estimate the common gaze direction of the eight subjects using supervised data from fewer than eight subjects. Perfect estimation accuracy is achieved only with the collaborative SSVEP. Finally, one-dimensional control of a virtual ball is demonstrated by shifting the common eye-gaze direction, which induces the collaborative SSVEP.
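The pipeline described above (per-subject amplitude spectra, averaged across subjects into a collaborative SSVEP, then gaze classification at the two flicker frequencies) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the sampling rate, epoch length, and synthetic data are assumptions, and a simple spectral-peak rule stands in for the paper's machine-learning classifier.

```python
import numpy as np

FS = 256            # sampling rate in Hz (assumed)
DUR = 4.0           # epoch length in seconds (assumed)
FREQS = (6.0, 8.0)  # flicker rates of the two virtual cubes (from the paper)

def amplitude_spectrum(eeg, fs=FS):
    """One-sided amplitude spectrum of a single-channel EEG epoch."""
    spec = np.abs(np.fft.rfft(eeg)) / len(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return freqs, spec

def collaborative_spectrum(epochs):
    """Average the individual subjects' amplitude spectra (collaborative SSVEP)."""
    specs = [amplitude_spectrum(e)[1] for e in epochs]
    return np.mean(specs, axis=0)

def classify_gaze(epochs, fs=FS):
    """Pick the flicker frequency with the larger collaborative spectral peak."""
    n = len(epochs[0])
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    avg = collaborative_spectrum(epochs)
    peaks = [avg[np.argmin(np.abs(freqs - f))] for f in FREQS]
    return FREQS[int(np.argmax(peaks))]

# Synthetic demo: eight subjects all gazing at the 6 Hz cube,
# each modeled as a 6 Hz sinusoid with random phase plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, DUR, 1.0 / FS)
epochs = [np.sin(2 * np.pi * 6.0 * t + rng.uniform(0, 2 * np.pi))
          + rng.normal(0.0, 1.0, t.size) for _ in range(8)]
print(classify_gaze(epochs))  # expected: 6.0
```

Averaging the amplitude spectra rather than the raw EEG sidesteps the phase differences between subjects, which is why the collaborative spectrum can show a clearer peak than any individual recording.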
This article is published under a Creative Commons Attribution-NoDerivatives 4.0 International License.