
JACIII Vol.15 No.5 pp. 573-581 (2011)
doi: 10.20965/jaciii.2011.p0573

Paper:

Data Mining Using Human Motions for Intelligent Systems

Yihsin Ho, Tomomi Shibano, Eri Sato-Shimokawara,
and Toru Yamaguchi

Graduate School of System Design, Tokyo Metropolitan University, 6-6 Asahigaoka, Hino, Tokyo 191-0065, Japan

Received: November 22, 2010
Accepted: April 14, 2011
Published: July 20, 2011
Keywords: Kukanchi, human motion, data mining, human intention, intelligent system
Abstract
In recent years, robot systems have come to play an important assisting role in providing information to human beings. The great progress made in Internet technology has also been highly beneficial as a means of collecting information. However, amassing too much information gives rise to the problem of how to offer the most suitable data. In this paper, the authors present a system that uses users’ intentions to provide information to them. We focus on constructing a series of “Time logs” from explicit human motions and their timing to serve as base data for data mining, using that mining to ascertain human intentions from the Time-log series, and then offering information to people. To achieve our research targets, we constructed our system based on the Kukanchi concept, which allows every part of a system to provide and receive data. Moreover, our system uses cameras to collect human motion data without any need to place devices on users’ bodies. Our work verifies the possibility of using the Kukanchi concept to construct a system and the ability of such a system to apply human motions as mining data without disturbing users in any way. It also demonstrates the possibility of using a series of Time logs to detect human intentions so as to offer useful data to users.
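
The pipeline summarized in the abstract (camera-observed motions recorded as Time logs, mined for recurring patterns, then mapped to an intention used to select information) can be illustrated with a small sketch. The following Python code is not the authors' implementation; the TimeLogEntry record, the motion labels, and the frequent-subsequence count are hypothetical stand-ins, assuming a Time log is essentially a motion label with its start time and duration.

from collections import Counter
from dataclasses import dataclass
from typing import List

@dataclass
class TimeLogEntry:
    """One observed human motion with its start time and duration (seconds)."""
    motion: str       # hypothetical label, e.g. "open_fridge"
    start: float
    duration: float

def mine_frequent_sequences(logs: List[List[TimeLogEntry]], length: int = 2,
                            min_support: int = 2) -> Counter:
    """Count motion subsequences of a given length across several Time-log
    series and keep those occurring at least `min_support` times."""
    counts: Counter = Counter()
    for series in logs:
        motions = [e.motion for e in sorted(series, key=lambda e: e.start)]
        for i in range(len(motions) - length + 1):
            counts[tuple(motions[i:i + length])] += 1
    return Counter({seq: c for seq, c in counts.items() if c >= min_support})

if __name__ == "__main__":
    # Two hypothetical Time-log series produced by camera-based motion recognition.
    day1 = [TimeLogEntry("enter_kitchen", 0.0, 2.0),
            TimeLogEntry("open_fridge", 3.0, 4.0),
            TimeLogEntry("reach_shelf", 9.0, 3.0)]
    day2 = [TimeLogEntry("enter_kitchen", 0.0, 2.5),
            TimeLogEntry("open_fridge", 4.0, 3.5),
            TimeLogEntry("wash_hands", 9.0, 5.0)]
    frequent = mine_frequent_sequences([day1, day2])
    # A frequent pair such as ("enter_kitchen", "open_fridge") would be read as a
    # cue of the user's intention (e.g. preparing a meal) and used to decide
    # which information to offer.
    print(frequent)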
Cite this article as:
Y. Ho, T. Shibano, E. Sato-Shimokawara, and T. Yamaguchi, “Data Mining Using Human Motions for Intelligent Systems,” J. Adv. Comput. Intell. Intell. Inform., Vol.15 No.5, pp. 573-581, 2011.
