JACIII Vol.16 No.1 pp. 13-23
doi: 10.20965/jaciii.2012.p0013


Similarity Retrieval of Motion Capture Data Based on Derivative Features

Worawat Choensawat*, Woong Choi**,
and Kozaburo Hachimura*

*School of Science and Engineering, Ritsumeikan University, 1-1-1 Noji Higashi, Kusatsu, Shiga 525-8577, Japan

**Department of Information and Computer Engineering, Gunma National College of Technology, 580 Tobamachi, Maebashi, Gunma 371-8530, Japan

Received: July 23, 2011
Accepted: November 25, 2011
Published: January 20, 2012

Keywords: motion capture, content-based retrieval, dynamic time warping, derivative feature
In this paper, we propose (1) a method for similarity retrieval of motion capture data that introduces a new feature extraction technique to improve search precision, and (2) a method to reduce search time on a large database by using lower-bound Dynamic Time Warping (DTW). In similarity search, joint speed has mainly been used as the feature characterizing a particular motion. Our method differs from others in that we use not only the magnitude of joint speed but also the pattern of speed change, measured as the derivative of joint speed over a short period of time. In our experiments on a dataset of 225 motion clips with a total of 81,851 frames from CMU’s database, the proposed feature extraction improved both search precision and search time: the average precision exceeded 90% with a computation time of 10 seconds. The experiments showed that our feature extraction technique yields higher search precision than retrieval without it, and that our retrieval method using the lower-bound DTW can efficiently reduce the amount of data to be searched.
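The paper’s exact feature definition and lower-bound formulation are not reproduced on this page; the following is a minimal NumPy sketch of the two ideas the abstract describes, assuming joint positions sampled at a fixed frame rate and an LB_Keogh-style lower bound (a common choice for pruning DTW candidates; the Sakoe-Chiba-style band radius `r` and the smoothing window are illustrative parameters, not the authors’ settings):

```python
import numpy as np

def speed_features(positions, fps=120.0, window=5):
    """Joint speed and its short-term derivative from a
    (frames, joints, 3) array of joint positions."""
    dt = 1.0 / fps
    vel = np.diff(positions, axis=0) / dt            # (F-1, J, 3) velocities
    speed = np.linalg.norm(vel, axis=2)              # (F-1, J) speed magnitudes
    # Smooth each joint's speed over a short window, then differentiate
    # to capture the pattern of speed change rather than magnitude alone.
    kernel = np.ones(window) / window
    smooth = np.stack([np.convolve(speed[:, j], kernel, mode="same")
                       for j in range(speed.shape[1])], axis=1)
    d_speed = np.diff(smooth, axis=0) / dt           # (F-2, J) speed derivatives
    return speed, d_speed

def lb_keogh(query, candidate, r=5):
    """A Keogh-style lower bound on the DTW distance between two 1-D
    sequences: sum of squared excursions of the query outside the
    candidate's envelope within a band of radius r."""
    total = 0.0
    for i, q in enumerate(query):
        lo, hi = max(0, i - r), min(len(candidate), i + r + 1)
        seg = candidate[lo:hi]
        lower, upper = seg.min(), seg.max()
        if q > upper:
            total += (q - upper) ** 2
        elif q < lower:
            total += (q - lower) ** 2
    return np.sqrt(total)
```

Because `lb_keogh` never exceeds the true DTW distance, a retrieval loop can discard any candidate whose bound already exceeds the best full-DTW distance found so far, which is how a lower bound reduces the amount of data that must be searched.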
Cite this article as:
W. Choensawat, W. Choi, and K. Hachimura, “Similarity Retrieval of Motion Capture Data Based on Derivative Features,” J. Adv. Comput. Intell. Intell. Inform., Vol.16 No.1, pp. 13-23, 2012.
References:
  [1] CMU, “MoCap Database,” 2003.
  [2] A. Bruderlin and L. Williams, “Motion signal processing,” In SIGGRAPH ’95: Proc. of the 22nd Annual Conf. on Computer Graphics and Interactive Techniques, pp. 97-104, 1995.
  [3] T. Yu, X. Shen, Q. Li, and W. Geng, “Motion retrieval based on movement notation language,” Computer Animation and Virtual Worlds, Vol.16, pp. 273-282, July 2005.
  [4] S.-W. Kim, S. Park, and W. W. Chu, “An Index-Based Approach for Similarity Search Supporting Time Warping in Large Sequence Databases,” In Proc. of the ICDE 2001, pp. 607-614, 2001.
  [5] M. Müller, T. Röder, and M. Clausen, “Efficient content-based retrieval of motion capture data,” ACM Trans. on Graphics, Vol.24, No.3, pp. 677-685, 2005.
  [6] B. Demuth, T. Röder, M. Müller, and B. Eberhardt, “An Information Retrieval System for Motion Capture Data,” In Proc. of the 28th European Conference on Information Retrieval (ECIR), Vol.3936 of LNCS, pp. 373-384, Springer, 2006.
  [7] K. Forbes and E. Fiume, “An efficient search algorithm for motion data using weighted PCA,” In Proc. of the 2005 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, SCA ’05, pp. 67-76, ACM, 2005.
  [8] L. Kovar and M. Gleicher, “Automated extraction and parameterization of motions in large data sets,” ACM Trans. on Graphics, Vol.23, pp. 559-568, August 2004.
  [9] B. Krüger, J. Tautges, A. Weber, and A. Zinke, “Fast local and global similarity searches in large motion capture databases,” In Proc. of the 2010 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, pp. 1-10, Eurographics Association, 2010.
  [10] K. Onuma, C. Faloutsos, and J. K. Hodgins, “FMDistance: A fast and effective distance function for motion capture data,” In Short Papers Proc. of EUROGRAPHICS, 2008.
  [11] Y. Sakamoto, S. Kuriyama, and T. Kaneko, “Motion map: image-based retrieval and segmentation of motion data,” In Proc. of the 2004 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, SCA ’04, pp. 259-266, Aire-la-Ville, Switzerland, 2004.
  [12] L. Kovar, J. Schreiner, and M. Gleicher, “Footskate cleanup for motion capture editing,” In Proc. of the 2002 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, pp. 97-104, ACM, 2002.
  [13] J. Lee, J. Chai, P. Reitsma, J. Hodgins, and N. Pollard, “Interactive control of avatars animated with human motion data,” ACM Trans. on Graphics, Vol.21, No.3, pp. 491-500, 2002.
  [14] J. Wang and B. Bodenheimer, “Synthesis and evaluation of linear motion transitions,” ACM Trans. on Graphics (TOG), Vol.27, No.1, p. 1, 2008.
  [15] J. Barbič, A. Safonova, J.-Y. Pan, C. Faloutsos, J. K. Hodgins, and N. S. Pollard, “Segmenting motion capture data into distinct behaviors,” In Proc. of the Graphics Interface, GI ’04, pp. 185-194, 2004.
  [16] G. Johansson, “Visual perception of biological motion and a model for its analysis,” Perception and Psychophysics, No.2, 1973.
  [17] J. A. Rice, “Mathematical Statistics and Data Analysis,” Duxbury Advanced Series, 3rd edition, 2007.
  [18] J. Aach and G. M. Church, “Aligning gene expression time series with time warping algorithm,” Bioinformatics, pp. 495-508, 2001.
  [19] H. Müller, W. Müller, D. M. Squire, S. Marchand-Maillet, and T. Pun, “Performance evaluation in content-based image retrieval: overview and proposals,” Pattern Recognition Letters, Vol.22, pp. 593-601, April 2001.
  [20] X. Jianfeng, K. Haruhisa, and Y. Akio, “Content-Based Retrieval of Motion Capture Data Using Short-Term Feature Extraction,” IEICE Trans. on Information and Systems, Vol.92, No.9, pp. 1657-1667, 2009.
