JRM Vol.18 No.1 pp. 59-66
doi: 10.20965/jrm.2006.p0059


Identification and State Evaluation Based on Frontal Views of Walkers

Akio Nozawa*, Tota Mizuno*, Masafumi Uchida**,
Hisaya Tanaka***, and Hideto Ide*

*Department of Electronics and Electric Engineering, College of Science and Engineering, Aoyama Gakuin University, 5-10-1 Fuchinobe, Sagamihara, Kanagawa 229-8558, Japan

**Department of Electronic Engineering, The University of Electro-communications, 1-5-1 Chofu-Ga-Oka, Chofu, Tokyo 182-8585, Japan

***Kogakuin University, 1-24-2 Nishi-shinjuku, Shinjuku-ku, Tokyo 163-8677, Japan

Received: May 27, 2005
Accepted: November 7, 2005
Published: February 20, 2006
Keywords: walking movement, individual identification, pattern recognition, neural network
Walking, a basic human movement, is said to reflect an individual’s mental and physical state as well as individual characteristics. We discuss identifying walkers and evaluating their mental and physical states using lower-limb trajectories extracted from frontal videos. We studied three mental and physical states - “normal,” “hurried,” and “tired” - and used neural networks (NNs) for individual identification and state evaluation. From toe trajectories sampled from the videos, we extracted static features (quantization) and dynamic features (displacement per unit time) and used them to train the NNs. Experiments showed that the trajectories exhibited features characteristic of individual walkers and of their mental and physical states, with individual identification averaging 73% accuracy and state evaluation averaging 86.3%.
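The feature extraction described in the abstract - a static feature from quantizing the toe trajectory and a dynamic feature from displacement per unit time - can be sketched as follows. This is a minimal illustration, not the authors' implementation: the bin count, the occupancy-histogram form of the static feature, and the summary statistics of the dynamic feature are assumptions.

```python
import numpy as np

def gait_features(trajectory, n_bins=8):
    """Static and dynamic features from a 2-D toe trajectory.

    trajectory: (T, 2) array of (x, y) toe positions per video frame.
    Static feature: quantized occupancy histogram of positions.
    Dynamic feature: summary of displacement per unit time (frame step).
    Bin count and summary statistics are illustrative assumptions.
    """
    traj = np.asarray(trajectory, dtype=float)
    # Normalize positions to [0, 1] per axis before quantizing.
    mins = traj.min(axis=0)
    span = traj.max(axis=0) - mins
    span[span == 0] = 1.0
    norm = (traj - mins) / span
    # Static feature: flattened 2-D occupancy histogram over n_bins x n_bins cells.
    idx = np.minimum((norm * n_bins).astype(int), n_bins - 1)
    static = np.zeros(n_bins * n_bins)
    np.add.at(static, idx[:, 1] * n_bins + idx[:, 0], 1.0)
    static /= len(traj)
    # Dynamic feature: frame-to-frame Euclidean displacement (speed proxy).
    steps = np.linalg.norm(np.diff(traj, axis=0), axis=1)
    dynamic = np.array([steps.mean(), steps.std(), steps.max()])
    return np.concatenate([static, dynamic])
```

A vector like this, computed per walking sample, could then serve as the input pattern for an NN classifier trained to output walker identity or state labels.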
Cite this article as:
A. Nozawa, T. Mizuno, M. Uchida, H. Tanaka, and H. Ide, “Identification and State Evaluation Based on Frontal Views of Walkers,” J. Robot. Mechatron., Vol.18 No.1, pp. 59-66, 2006.
