
JRM Vol.24 No.6 pp. 1063-1070 (2012)
doi: 10.20965/jrm.2012.p1063

Paper:

Recognition of Face Orientations Based on Nostril Feature

Nobuaki Nakazawa*, Takashi Mori*, Aya Maeda*,
Il-Hwan Kim**, Toshikazu Matsui*, and Kou Yamada*

*Graduate School of Engineering, Gunma University, 29-1 Hon-cho, Oota, Gunma 373-0057, Japan

**Department of Electrical and Electronic Engineering, Kangwon National University, 1 Kangwondaehak-gil, Chuncheon-si, Gangwon-do 200-701, Korea

Received: October 5, 2011
Accepted: October 22, 2012
Published: December 20, 2012
Keywords: human interface, operation, auto-wheelchair
Abstract
This paper describes a noncontact man-machine interface based on face orientation. Real-time images of an operator's face were captured by a USB camera, and changes in the dark areas of the nostrils were used to recognize face orientation. When the operator faced up, the dark areas of both nostrils increased; when the operator faced down, they decreased. In contrast, a difference between the two nostril areas arose when the face was turned to the side. These characteristics were used for face-orientation recognition, and the interface we developed was applied to the operation of an electric wheelchair.
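
The recognition principle lends itself to a compact illustration. The following Python/OpenCV sketch is hypothetical and not the authors' implementation: the threshold values, the calibration constant neutral_total, and the mapping from area imbalance to turn direction are all assumptions made for illustration.

import cv2

# Hypothetical constants for illustration; the paper's actual values are not given here.
DARK_THRESHOLD = 50    # gray level below which a pixel counts as "dark"
UP_DOWN_MARGIN = 0.2   # relative change of total nostril area indicating up/down
SIDE_MARGIN = 0.3      # relative left/right area difference indicating a sideways turn

def nostril_areas(gray_roi):
    """Return (left_area, right_area) of the two largest dark blobs
    (assumed to be the nostrils) in a grayscale face region."""
    _, dark = cv2.threshold(gray_roi, DARK_THRESHOLD, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(dark, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if len(contours) < 2:
        return None
    blobs = sorted(contours, key=cv2.contourArea, reverse=True)[:2]
    # Order the two blobs left-to-right by the x coordinate of their bounding boxes.
    blobs.sort(key=lambda c: cv2.boundingRect(c)[0])
    return cv2.contourArea(blobs[0]), cv2.contourArea(blobs[1])

def classify_orientation(left_area, right_area, neutral_total):
    """Map nostril areas to a face orientation, following the paper's idea:
    the total dark area grows when facing up, shrinks when facing down,
    and a left/right imbalance indicates a sideways turn."""
    total = left_area + right_area
    diff = (left_area - right_area) / max(total, 1e-6)
    if diff > SIDE_MARGIN:
        return "turned right"   # which sign maps to which side is an assumption
    if diff < -SIDE_MARGIN:
        return "turned left"
    if total > neutral_total * (1 + UP_DOWN_MARGIN):
        return "facing up"
    if total < neutral_total * (1 - UP_DOWN_MARGIN):
        return "facing down"
    return "neutral"

In practice, neutral_total would first be calibrated from frames in which the operator looks straight at the camera; the classifier output could then be mapped to wheelchair commands such as forward, stop, left, and right.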
Cite this article as:
N. Nakazawa, T. Mori, A. Maeda, I. Kim, T. Matsui, and K. Yamada, “Recognition of Face Orientations Based on Nostril Feature,” J. Robot. Mechatron., Vol.24 No.6, pp. 1063-1070, 2012.
