
JACIII Vol.22 No.5 pp. 711-717
doi: 10.20965/jaciii.2018.p0711
(2018)

Paper:

A Comparative Sensor Based Multi-Classes Neural Network Classifications for Human Activity Recognition

Ramtin Aminpour and Elmer Dadios

De La Salle University
2401 Taft Avenue, Manila 1004, Philippines

Received:
March 16, 2018
Accepted:
June 16, 2018
Published:
September 20, 2018
Keywords:
neural network, classification, human activity recognition, LVQ, GMDH
Abstract

Human activity recognition with a smartphone is valuable for many applications, especially because most people use this device in their daily lives. A smartphone is a portable device with built-in sensors and enough hardware power to handle this problem. In this paper, three neural network algorithms are compared for detecting six major activities. The data were collected by a smartphone in real-life conditions, and the classifiers were simulated on a remote server. The results show that the MLP and GMDH neural networks achieve better accuracy and performance than the LVQ neural network algorithm.
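The abstract describes mapping smartphone sensor data to one of six activity classes with a multilayer perceptron (MLP). As an illustrative sketch only (not the authors' implementation), the snippet below shows a minimal MLP forward pass over a hand-crafted accelerometer feature vector; the activity labels, feature count, and layer sizes are hypothetical choices for this example:

```python
import numpy as np

# Six example activity classes (hypothetical label set for illustration).
ACTIVITIES = ["walking", "walking upstairs", "walking downstairs",
              "sitting", "standing", "laying"]

def softmax(z):
    """Convert raw scores to class probabilities."""
    z = z - z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def mlp_forward(x, w1, b1, w2, b2):
    """One hidden layer with tanh activation, softmax output layer."""
    h = np.tanh(w1 @ x + b1)
    return softmax(w2 @ h + b2)

rng = np.random.default_rng(0)
n_features, n_hidden, n_classes = 9, 16, len(ACTIVITIES)

# Randomly initialized weights stand in for a trained network.
w1 = rng.normal(scale=0.1, size=(n_hidden, n_features))
b1 = np.zeros(n_hidden)
w2 = rng.normal(scale=0.1, size=(n_classes, n_hidden))
b2 = np.zeros(n_classes)

# A feature vector, e.g. mean/std/energy per accelerometer axis.
x = rng.normal(size=n_features)
probs = mlp_forward(x, w1, b1, w2, b2)
print(ACTIVITIES[int(np.argmax(probs))], probs.round(3))
```

In practice the weights would be fitted by backpropagation on labeled sensor windows; LVQ and GMDH would replace the hidden-layer mapping with prototype matching and self-organizing polynomial layers, respectively.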

Cite this article as:
R. Aminpour and E. Dadios, “A Comparative Sensor Based Multi-Classes Neural Network Classifications for Human Activity Recognition,” J. Adv. Comput. Intell. Intell. Inform., Vol.22, No.5, pp. 711-717, 2018.
References
  [1] E. Kim and S. Helal, “Modeling human activity semantics for improved recognition performance,” Int. Conf. Ubiquitous Intell. Comput., pp. 514-528, Springer, 2011.
  [2] S. Zhang, G. Liu, and X. Lai, “Classification of Evoked Emotions Using an Artificial Neural Network Based on Single, Short-Term Physiological Signals,” J. Adv. Comput. Intell. Intell. Inform., Vol.19, No.1, pp. 118-126, 2015.
  [3] P. Theekakul, S. Thiemjarus, E. Nantajeewarawat, T. Supnithi, and K. Hirota, “A rule-based approach to activity recognition,” Knowledge, Information, and Creativity Support Systems, pp. 204-215, Springer, 2011.
  [4] M. Zhang and A. A. Sawchuk, “A bag-of-features-based framework for human activity representation and recognition,” Proc. 2011 Int. Workshop on Situation Activity and Goal Awareness, pp. 51-56, ACM, 2011.
  [5] A. M. Khan, Y.-K. Lee, S. Y. Lee, and T.-S. Kim, “A triaxial accelerometer-based physical-activity recognition via augmented-signal features and a hierarchical recognizer,” IEEE Trans. Inf. Technol. Biomed., Vol.14, No.5, pp. 1166-1172, 2010.
  [6] P. Casale, O. Pujol, and P. Radeva, “Human activity recognition from accelerometer data using a wearable device,” Pattern Recognit. Image Anal., pp. 289-296, Springer, 2011.
  [7] S. J. Preece, J. Y. Goulermas, L. P. J. Kenney, and D. Howard, “A comparison of feature extraction methods for the classification of dynamic activities from accelerometer data,” IEEE Trans. Biomed. Eng., Vol.56, No.3, pp. 871-879, 2009.
  [8] N. D. Singpurwalla and J. M. Booker, “Membership functions and probability measures of fuzzy sets,” J. Am. Stat. Assoc., Vol.99, No.467, pp. 867-877, 2004.
  [9] L. Wang, W. Hu, and T. Tan, “Recent developments in human motion analysis,” Pattern Recognit., Vol.36, No.3, pp. 585-601, 2003.
  [10] D. M. Karantonis, M. R. Narayanan, M. Mathie, N. H. Lovell, and B. G. Celler, “Implementation of a real-time human movement classifier using a triaxial accelerometer for ambulatory monitoring,” IEEE Trans. Inf. Technol. Biomed., Vol.10, No.1, pp. 156-167, 2006.
  [11] N. Ravi, N. Dandekar, P. Mysore, and M. L. Littman, “Activity recognition from accelerometer data,” Proc. of the 17th Conf. on Innovative Applications of Artificial Intelligence, Vol.5, pp. 1541-1546, 2005.
  [12] T. Huynh and B. Schiele, “Analyzing features for activity recognition,” Proc. 2005 Joint Conf. on Smart Objects and Ambient Intelligence, pp. 159-163, ACM, 2005.
  [13] T. Stiefmeier, D. Roggen, and G. Tröster, “Gestures are strings: efficient online gesture spotting and classification using string matching,” Proc. ICST 2nd Int. Conf. on Body Area Networks, p. 16, 2007.
  [14] C. BenAbdelkader, R. Cutler, and L. Davis, “Stride and cadence as a biometric in automatic person identification and verification,” Proc. 5th IEEE Int. Conf. on Autom. Face Gesture Recognition, pp. 372-377, 2002.
  [15] M. Fahim, I. Fatima, S. Lee, and Y.-T. Park, “EFM: evolutionary fuzzy model for dynamic activities recognition using a smartphone accelerometer,” Appl. Intell., Vol.39, No.3, pp. 475-488, 2013.
  [16] T.-P. Kao, C.-W. Lin, and J.-S. Wang, “Development of a portable activity detector for daily activity recognition,” IEEE Int. Symp. on Ind. Electron. (ISIE), pp. 115-120, 2009.
  [17] Z. Zhang, “Microsoft kinect sensor and its effect,” IEEE MultiMedia, Vol.19, No.2, pp. 4-10, 2012.
  [18] A. Dutta, O. Ma, M. P. Buman, and D. W. Bliss, “Learning approach for classification of GENEActiv accelerometer data for unique activity identification,” IEEE 13th Int. Conf. on Wearable Implant. Body Sens. Networks (BSN), pp. 359-364, 2016.
  [19] E. P. Dadios and D. J. Williams, “Application of neural networks to the flexible pole-cart balancing problem,” IEEE Int. Conf. on Syst. Man Cybern. Intell. Syst. 21st Century, Vol.3, pp. 2506-2511, 1995.
  [20] J.-S. R. Jang, C.-T. Sun, and E. Mizutani, “Neuro-fuzzy and soft computing: a computational approach to learning and machine intelligence,” Prentice Hall, 1997.
  [21] A. G. Ivakhnenko and G. A. Ivakhnenko, “Problems of further development of the group method of data handling algorithms. Part I,” Pattern Recognit. Image Anal., Vol.10, No.2, pp. 187-194, 2000.
  [22] J. Xiao, H. Cao, X. Jiang, X. Gu, and L. Xie, “GMDH-based semi-supervised feature selection for customer classification,” Knowledge-Based Syst., 2017.
  [23] S. J. Farlow, “Self-organizing methods in modeling: GMDH type algorithms,” Vol.54, CRC Press, 1984.
