
JRM Vol.25 No.6 pp. 1000-1010 (2013)
doi: 10.20965/jrm.2013.p1000

Paper:

Design of Brain-Machine Interface Using Near-Infrared Spectroscopy

Tomotaka Ito*, Satoshi Ushii*, Takafumi Sameshima*,
Yoshihiro Mitsui*, Shohei Ohgi**, and Chihiro Mizuike**

*Department of Mechanical Engineering, Faculty of Engineering, Shizuoka University, 3-5-1 Johoku, Naka-ku, Hamamatsu, Shizuoka 432-8561, Japan

**Division of Physical Therapy, School of Rehabilitation Sciences, Seirei Christopher University, 3453 Mikatahara-cho, Kita-ku, Hamamatsu, Shizuoka 433-8558, Japan

Received: May 10, 2013
Accepted: November 3, 2013
Published: December 20, 2013
Keywords: brain-machine interface, near-infrared spectroscopy, pattern classification
Abstract
In recent years, the fields of robotics and medical science have been paying close attention to brain-machine interface (BMI) systems. A BMI observes human cerebral activity and uses the collected data as the input to various instruments. If such a system could be effectively realized, it could serve as a new, intuitive input interface for human-robot interaction, welfare applications, and so on. In this paper, we discussed a design problem for a BMI system using near-infrared spectroscopy (NIRS). We developed a brain-state classifier based on the learning vector quantization (LVQ) method; the proposed method classifies cerebral blood flow patterns and outputs an estimate of the brain state. The classification experiments showed that the proposed method can successfully classify not only human physical motions and motor imagery but also human emotions and mental commands issued to a robot. In particular, for the mental commands to a robot, we successfully classified imagery corresponding to five different commands. These results point to the potential of NIRS-based brain-machine interfaces.
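The abstract describes a brain-state classifier based on learning vector quantization (LVQ) applied to NIRS cerebral blood-flow patterns. The paper's feature extraction, prototype counts, and training schedule are not given on this page, so the following Python sketch only illustrates the generic LVQ1 rule that the method name refers to, i.e. nearest-prototype classification with attract/repel prototype updates; the class name, parameters, and learning-rate decay are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

class LVQ1Classifier:
    """Minimal LVQ1 sketch: nearest-prototype classification with
    attract/repel prototype updates. The feature vectors here stand in
    for NIRS-derived blood-flow patterns (an illustrative assumption,
    not the paper's actual preprocessing)."""

    def __init__(self, n_prototypes_per_class=2, lr=0.05, epochs=30, seed=0):
        self.n_per_class = n_prototypes_per_class
        self.lr = lr
        self.epochs = epochs
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        # Initialize prototypes from randomly chosen samples of each class.
        protos, labels = [], []
        for c in np.unique(y):
            idx = self.rng.choice(np.flatnonzero(y == c),
                                  self.n_per_class, replace=False)
            protos.append(X[idx])
            labels.extend([c] * self.n_per_class)
        self.protos_ = np.vstack(protos).astype(float)
        self.labels_ = np.array(labels)

        for epoch in range(self.epochs):
            lr = self.lr * (1.0 - epoch / self.epochs)  # simple linear decay
            for i in self.rng.permutation(len(X)):
                d = np.linalg.norm(self.protos_ - X[i], axis=1)
                w = int(np.argmin(d))  # winning (nearest) prototype
                # Attract the winner if its label matches the sample,
                # repel it otherwise (the LVQ1 update rule).
                sign = 1.0 if self.labels_[w] == y[i] else -1.0
                self.protos_[w] += sign * lr * (X[i] - self.protos_[w])
        return self

    def predict(self, X):
        # Assign each sample the label of its nearest prototype.
        d = np.linalg.norm(X[:, None, :] - self.protos_[None, :, :], axis=2)
        return self.labels_[np.argmin(d, axis=1)]
```

In a setting like the one described, each row of X would be a feature vector derived from multi-channel NIRS measurements, and y would hold brain-state labels such as a physical motion, a motor image, an emotion, or one of the five mental commands.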
Cite this article as:
T. Ito, S. Ushii, T. Sameshima, Y. Mitsui, S. Ohgi, and C. Mizuike, “Design of Brain-Machine Interface Using Near-Infrared Spectroscopy,” J. Robot. Mechatron., Vol.25 No.6, pp. 1000-1010, 2013.
References
[1] J. R. Wolpaw, N. Birbaumer, D. J. McFarland, G. Pfurtscheller, and T. M. Vaughan, "Brain-Computer Interfaces for Communication and Control," Clinical Neurophysiology, Vol.113, pp. 767-791, 2002.
[2] R. Hasegawa, "Development and Future of Brain-Machine Interface," J. of the Institute of Electronics, Information, and Communication Engineers, Vol.91, No.12, pp. 1066-1075, 2008 (in Japanese).
[3] E. M. Schmidt, "Single Neuron Recording from Motor Cortex as a Possible Source of Signals for Control of External Devices," Annals of Biomedical Engineering, Vol.8, No.4-6, pp. 339-349, 1980.
[4] E. Margalit, J. D. Weiland et al., "Visual and electrical evoked response recorded from subdural electrodes implanted above the visual cortex in normal dogs under two methods of anesthesia," J. of Neuroscience Methods, Vol.123, pp. 129-137, 2003.
[5] H. Koizumi, A. Maki, T. Yamamoto, and Y. Yamamoto, "Observation for Mind and the Brain: Noninvasive Higher-order Brain-function Imaging," J. of the Institute of Electronics, Information, and Communication Eng., Vol.87, No.3, pp. 207-214, 2004 (in Japanese).
[6] T. Kato, A. Kamei, S. Takashima, and T. Ozaki, "Human Visual Cortical Function During Photic Stimulation Monitoring by Means of Near-Infrared Spectroscopy," J. of Cerebral Blood Flow and Metabolism, Vol.13, No.3, pp. 516-520, 1993.
[7] Y. Yamashita, A. Maki, E. Watanabe, H. Koizumi, and F. Kawaguchi, "Development of Optical Topography for Noninvasive Measurement of Human Brain Activity," MEDIX, Vol.29, pp. 36-40, 1998.
[8] T. Amita, S. Tsuneishi, S. Kono, A. Ishikawa, and K. Shimizu, "Medical Applications of Near-Infrared Spectroscopy," J. of Japan Society of Infrared Science and Technology, Vol.14, No.1, pp. 11-16, 2004.
[9] M. D. Serruya, N. G. Hatsopoulos, L. Paninski, M. R. Fellows, and J. P. Donoghue, "Instant Neural Control of a Movement Signal," Nature, Vol.416, No.6877, pp. 141-142, 2002.
[10] L. R. Hochberg, M. D. Serruya, G. M. Friehs, J. A. Mukand, M. Saleh, A. H. Caplan, A. Branner, D. Chen, R. D. Penn, and J. P. Donoghue, "Neuronal Ensemble Control of Prosthetic Devices by a Human with Tetraplegia," Nature, Vol.442, No.7099, pp. 164-171, 2006.
[11] M. Hirata, T. Yanagisawa, Y. Saitoh et al., "Real Time Neural Decoding for Motor Control based on the Electrocorticograms," Proc. of 23rd Symposium on Biological and Physiological Engineering, 2008 (in Japanese).
[12] Y. Kamitani and F. Tong, "Decoding the Visual and Subjective Contents of the Human Brain," Nature Neuroscience, Vol.8, pp. 679-685, 2005.
[13] Q. Zhao, L. Zhang, and A. Cichocki, "EEG-based Asynchronous BCI Control of a Car in 3D Virtual Reality Environments," Chinese Science Bulletin, Vol.54, No.1, pp. 78-87, 2009.
[14] T. Okabe et al., "Brain Machine Interface Technology Enabling Control of Machines by Human Thought Alone," Honda R&D Technical Review, Vol.22, No.2, pp. 91-98, 2010 (in Japanese).
[15] K. Choi and A. Cichocki, "Control of a Wheelchair by Motor Imagery in Real Time," Int. Conf. on Intelligent Data Engineering and Automated Learning, pp. 330-337, 2008.
[16] H. H. Jasper, "The Ten-Twenty Electrode System of the International Federation," Electroencephalography and Clinical Neurophysiology, Vol.10, pp. 371-375, 1958.
[17] T. Kohonen, "Self-Organization and Associative Memory," Third Edition, Springer-Verlag, 1989.
[18] T. Ito, H. Akiyama et al., "Brain Machine Interface using Portable Near-InfraRed Spectroscopy – Improvement of Classification Performance based on ICA analysis and Self-proliferating LVQ," IEEE Int. Conf. on Intelligent Robots and Systems, MoBT9.2, 2013.
